Bartke, Stephan; Martinát, Stanislav; Klusáček, Petr; Pizzol, Lisa; Alexandrescu, Filip; Frantál, Bohumil; Critto, Andrea; Zabeo, Alex
2016-12-15
Prioritizing brownfields for redevelopment in real estate portfolios can contribute to more sustainable regeneration and land management. Owners of large real estate and brownfield portfolios are challenged to allocate their limited resources to the development of the most critical or promising sites, in terms of time and cost efficiency. Authorities worried about the negative impacts of brownfields - in particular in the case of potential contamination - on the environment and society also need to prioritize their resources to those brownfields that most urgently deserve attention and intervention. Yet, numerous factors have to be considered for prioritizing actions, in particular when adhering to sustainability principles. Several multiple-criteria decision analysis (MCDA) approaches and tools have been suggested in order to support these actors in managing their brownfield portfolios. Based on lessons learned from the literature on success factors, sustainability assessment and MCDA approaches, researchers from a recent EU project have developed the web-based Timbre Brownfield Prioritization Tool (TBPT). It facilitates assessment and prioritization of a portfolio of sites on the basis of the probability of successful and sustainable regeneration or according to individually specified objectives. This paper introduces the challenges of brownfield portfolio management in general and reports on the application of the TBPT in five cases: practical test-uses by two large institutional land owners from Germany, a local and a regional administrative body from the Czech Republic, and an expert from a national environmental authority from Romania. Based on literature requirements for sustainability assessment tools and on the end-users' feedback from the practical tests, we discuss the TBPT's strengths and weaknesses in order to inform and give recommendations for future development of prioritization tools. Copyright © 2016 Elsevier Ltd. All rights reserved.
Blackwell, Brett R.; Ankley, Gerald T.; Corsi, Steven; DeCicco, Laura; Houck, Keith A.; Judson, Richard S.; Li, Shibin; Martin, Matthew T.; Murphy, Elizabeth; Schroeder, Anthony L.; Smith, Edwin R.; Swintek, Joe; Villeneuve, Daniel L.
2017-01-01
Current environmental monitoring approaches focus primarily on chemical occurrence. However, based on concentration alone, it can be difficult to identify which compounds may be of toxicological concern and should be prioritized for further monitoring, in-depth testing, or management. This can be problematic because toxicological characterization is lacking for many emerging contaminants. New sources of high-throughput screening (HTS) data, such as the ToxCast database, which contains information for over 9000 compounds screened through up to 1100 bioassays, are now available. Integrated analysis of chemical occurrence data with HTS data offers new opportunities to prioritize chemicals, sites, or biological effects for further investigation based on concentrations detected in the environment linked to relative potencies in pathway-based bioassays. As a case study, chemical occurrence data from a 2012 study in the Great Lakes Basin along with the ToxCast effects database were used to calculate exposure–activity ratios (EARs) as a prioritization tool. Technical considerations of data processing and use of the ToxCast database are presented and discussed. EAR prioritization identified multiple sites, biological pathways, and chemicals that warrant further investigation. Prioritized bioactivities from the EAR analysis were linked to discrete adverse outcome pathways to identify potential adverse outcomes and biomarkers for use in subsequent monitoring efforts.
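At its core, the exposure–activity ratio described above is a unit-consistent division of a measured environmental concentration by a bioassay potency such as a ToxCast AC50. The sketch below illustrates that arithmetic with pandas; the chemicals, assay names, column names, unit conversion, and the screening cutoff are illustrative assumptions, not the study's actual data pipeline.

```python
import pandas as pd

# Illustrative inputs: measured concentrations (ug/L) by site and chemical,
# and assay potencies (AC50 in uM) with molecular weights (g/mol).
occurrence = pd.DataFrame({
    "site": ["S1", "S1", "S2"],
    "chemical": ["atrazine", "bisphenol A", "atrazine"],
    "conc_ug_per_L": [0.5, 1.2, 3.4],
})
potency = pd.DataFrame({
    "chemical": ["atrazine", "bisphenol A"],
    "assay": ["assay_ER_agonist", "assay_ER_trans"],   # placeholder assay names
    "ac50_uM": [12.0, 0.8],
    "mol_weight_g_per_mol": [215.7, 228.3],
})

merged = occurrence.merge(potency, on="chemical")
# Convert ug/L to uM so numerator and denominator share units: (ug/L) / (g/mol) = umol/L.
merged["conc_uM"] = merged["conc_ug_per_L"] / merged["mol_weight_g_per_mol"]
merged["EAR"] = merged["conc_uM"] / merged["ac50_uM"]

# Rank site/chemical/assay combinations and flag those above an example screening cutoff.
prioritized = merged.sort_values("EAR", ascending=False)
print(prioritized.loc[prioritized["EAR"] > 1e-3, ["site", "chemical", "assay", "EAR"]])
```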
Information prioritization for control and automation of space operations
NASA Technical Reports Server (NTRS)
Ray, Asok; Joshi, Suresh M.; Whitney, Cynthia K.; Jow, Hong N.
1987-01-01
The applicability of a real-time information prioritization technique to the development of a decision support system for control and automation of Space Station operations is considered. The steps involved in the technique are described, including the definition of abnormal scenarios and of attributes, measures of individual attributes, formulation and optimization of a cost function, simulation of test cases on the basis of the cost function, and examination of the simulation scenarios. A list is given comparing the intrinsic importance of various types of Space Station information.
ProphTools: general prioritization tools for heterogeneous biological networks.
Navarro, Carmen; Martínez, Victor; Blanco, Armando; Cano, Carlos
2017-12-01
Networks have been proven effective representations for the analysis of biological data. As such, there exist multiple methods to extract knowledge from biological networks. However, these approaches usually limit their scope to a single biological entity type of interest or they lack the flexibility to analyze user-defined data. We developed ProphTools, a flexible open-source command-line tool that performs prioritization on a heterogeneous network. ProphTools prioritization combines a Flow Propagation algorithm similar to a Random Walk with Restarts and a weighted propagation method. A flexible model for the representation of a heterogeneous network allows the user to define a prioritization problem involving an arbitrary number of entity types and their interconnections. Furthermore, ProphTools provides functionality to perform cross-validation tests, allowing users to select the best network configuration for a given problem. ProphTools' core prioritization methodology has already been proven effective in gene-disease prioritization and drug repositioning. Here we make ProphTools available to the scientific community as flexible, open-source software and perform a new proof-of-concept case study on long noncoding RNA (lncRNA)-disease prioritization. ProphTools is robust prioritization software that provides the flexibility not present in other state-of-the-art network analysis approaches, enabling researchers to perform prioritization tasks on any user-defined heterogeneous network. Furthermore, the application to lncRNA-disease prioritization shows that ProphTools can reach the performance levels of ad hoc prioritization tools without losing its generality. © The Authors 2017. Published by Oxford University Press.
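ProphTools' core propagation is described as a Flow Propagation algorithm similar to a Random Walk with Restarts (RWR). For orientation, a generic RWR over a column-normalized adjacency matrix can be written in a few lines of numpy; this is a minimal sketch of the general technique on a toy homogeneous network with an assumed restart probability, not ProphTools' actual implementation or its heterogeneous-network weighting.

```python
import numpy as np

def random_walk_with_restarts(adj, seeds, restart=0.3, tol=1e-8, max_iter=1000):
    """Generic RWR: returns a stationary relevance score for every node."""
    # Column-normalize the adjacency matrix so each column sums to 1.
    col_sums = adj.sum(axis=0)
    col_sums[col_sums == 0] = 1.0          # avoid division by zero for isolated nodes
    W = adj / col_sums
    p0 = np.zeros(adj.shape[0])
    p0[seeds] = 1.0 / len(seeds)           # restart distribution over the seed nodes
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1 - restart) * W @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            break
        p = p_next
    return p

# Toy 5-node network; node 0 is the query, nodes 1-4 are candidates to rank.
adj = np.array([[0, 1, 1, 0, 0],
                [1, 0, 1, 0, 0],
                [1, 1, 0, 1, 0],
                [0, 0, 1, 0, 1],
                [0, 0, 0, 1, 0]], dtype=float)
scores = random_walk_with_restarts(adj, seeds=[0])
print(np.argsort(-scores))  # nodes ordered by propagated relevance
```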
Toya, Waki
2017-01-01
DNA paternity testing has recently become more widely available in Japan. The aim of this paper is to examine the issues surrounding (1) the implementing agency, whether the testing is conducted in a commercial direct-to-consumer (DTC) setting or a judicial non-DTC setting, and (2) the implementation conditions and more specifically the legal capacity of the proband (test subject). Literature research in Japanese and English was conducted. Some countries prohibit commercial DNA testing without the consent of the proband or her or his legally authorized representative. Moreover, in some cases the results of DTC paternity testing have proven to be unreliable. I propose a complete prohibition of DTC DNA paternity testing in Japan. In many cases of paternity testing, the proband is a minor. This has led to debate about whether proxy consent is sufficient for paternity testing or whether additional safeguards (such as a court order) are required. In cases where commercial DNA testing has been conducted and the test results are produced in court as evidence, the court must judge whether or not to admit these results as evidence. Another important issue is whether or not paternity testing should be legally mandated in certain cases. If we come to the conclusion that DNA test results are the only way to conclusively establish a parent-child relationship, then our society may come to prioritize genetic relatedness even more strongly over other conceptions of a parent-child relationship. This prioritization could adversely affect families created through assisted reproductive technology (ART), especially in situations where children are not aware of their biological parentage. This paper argues for a complete prohibition of DTC DNA paternity testing in Japan, and highlights that broader ethical and legal deliberation on such genetic services is required.
Rafael, Oana C; Aziz, Mohamed; Raftopoulos, Harry; Vele, Oana E; Xu, Weisheng; Sugrue, Chiara
2014-06-01
Subtyping of lung carcinoma with immunohistochemistry is essential for diagnosis, whereas molecular testing (MT) is required for therapy guidance. In the current study, the authors report on MT performed on fine-needle aspiration specimens at the study institution over a 2-year period preceding the April 2013 College of American Pathologists (CAP)/International Association for the Study of Lung Cancer (IASLC)/Association for Molecular Pathology (AMP) Molecular Testing Guideline (MTG) publication. The database of the study institution was retrospectively queried for cases of lung and thoracic/lower cervical lymph node fine-needle aspiration specimens for 2011 through 2012. Of 246 selected cases, 26 featured a limited amount of material in cell blocks. MT increased significantly between 2011 and 2012 and was requested in 39.4% of cases (97 of 246 cases): 86 of those cases had at least 1 MT result and 11 had insufficient material for any MT. Anaplastic lymphoma kinase (ALK) testing was performed in 9 cases in which DNA was insufficient for epidermal growth factor receptor (EGFR) testing. In addition, 13 cases of adenocarcinoma/non-small cell lung carcinoma had at least 1 MT canceled because of insufficient DNA, but at the same time had an average of 3.46 immunohistochemical stains performed. Of all the cytology specimens, 10.6% featured limited material; however, no universally accepted testing sequence priority was available at the time the study was performed. As per the MTG, MT should take precedence over immunohistochemistry in cases of adenocarcinoma/non-small cell lung carcinoma. Approximately 5.3% of the specimens in the current study had insufficient material for MT while having multiple stains performed instead. The MTG also recommends performing EGFR before ALK testing; the authors found 9 cases with insufficient material for EGFR testing that had ALK testing performed. The results of the current study underscore the need for a testing prioritization algorithm in view of the MTG publication to serve as reference for both clinicians and pathologists. © 2014 American Cancer Society.
Armsworth, Paul R; Jackson, Heather B; Cho, Seong-Hoon; Clark, Melissa; Fargione, Joseph E; Iacona, Gwenllian D; Kim, Taeyoung; Larson, Eric R; Minney, Thomas; Sutton, Nathan A
2017-12-21
Conservation organizations must redouble efforts to protect habitat given continuing biodiversity declines. Prioritization of future areas for protection is hampered by disagreements over what the ecological targets of conservation should be. Here we test the claim that such disagreements will become less important as conservation moves away from prioritizing areas for protection based only on ecological considerations and accounts for varying costs of protection using return-on-investment (ROI) methods. We combine a simulation approach with a case study of forests in the eastern United States, paying particular attention to how covariation between ecological benefits and economic costs influences agreement levels. For many conservation goals, agreement over spatial priorities improves with ROI methods. However, we also show that a reliance on ROI-based prioritization can sometimes exacerbate disagreements over priorities. As such, accounting for costs in conservation planning does not enable society to sidestep careful consideration of the ecological goals of conservation.
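In its simplest form, the return-on-investment logic tested above ranks candidate sites by ecological benefit per unit cost and protects down the list until a budget is exhausted. The toy sketch below contrasts that with benefit-only and cost-only targeting; the parcel values, costs, and budget are invented, and the real analyses additionally handle complementarity and benefit–cost covariation that this illustration omits.

```python
# Toy comparison of benefit targeting, cost targeting, and ROI targeting.
parcels = [
    {"id": "A", "benefit": 10.0, "cost": 4.0},
    {"id": "B", "benefit": 7.0,  "cost": 2.0},
    {"id": "C", "benefit": 6.0,  "cost": 2.0},
    {"id": "D", "benefit": 2.0,  "cost": 1.0},
]
budget = 4.0

def select(parcels, key, budget):
    """Greedily protect parcels in 'key' order until the budget runs out."""
    chosen, spent = [], 0.0
    for p in sorted(parcels, key=key):
        if spent + p["cost"] <= budget:
            chosen.append(p["id"])
            spent += p["cost"]
    total_benefit = sum(q["benefit"] for q in parcels if q["id"] in chosen)
    return chosen, total_benefit

print("benefit targeting:", select(parcels, key=lambda p: -p["benefit"], budget=budget))
print("cost targeting:   ", select(parcels, key=lambda p: p["cost"], budget=budget))
print("ROI targeting:    ", select(parcels, key=lambda p: -p["benefit"] / p["cost"], budget=budget))
```

With these made-up numbers, benefit targeting buys only the single most beneficial parcel (total benefit 10), cost targeting buys the two cheapest (total benefit 9), and ROI targeting buys two mid-priced parcels with the largest combined benefit (13), illustrating why the ranking criterion matters.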
Prioritization in comparative effectiveness research: the CANCERGEN Experience.
Thariani, Rahber; Wong, William; Carlson, Josh J; Garrison, Louis; Ramsey, Scott; Deverka, Patricia A; Esmail, Laura; Rangarao, Sneha; Hoban, Carolyn J; Baker, Laurence H; Veenstra, David L
2012-05-01
Systematic approaches to stakeholder-informed research prioritization are a central focus of comparative effectiveness research. Genomic testing in cancer is an ideal area to refine such approaches given rapid innovation and potentially significant impacts on patient outcomes. Our objective was to develop and pilot test a stakeholder-informed approach to prioritizing genomic tests for future study in collaboration with the cancer clinical trials consortium SWOG. We conducted a landscape analysis to identify genomic tests in oncology using a systematic search of published and unpublished studies, and expert consultation. Clinically valid tests suitable for evaluation in a comparative study were presented to an external stakeholder group. Domains to guide the prioritization process were identified with stakeholder input, and stakeholders ranked tests using multiple voting rounds. A stakeholder group was created including representatives from patient-advocacy groups, payers, test developers, regulators, policy makers, and community-based oncologists. We identified 9 domains for research prioritization with stakeholder feedback: population impact, current standard of care, strength of association, potential clinical benefits, potential clinical harms, economic impacts, evidence of need, trial feasibility, and market factors. The landscape analysis identified 635 studies; of 9 tests deemed to have sufficient clinical validity, 6 were presented to stakeholders. Two tests in lung cancer (ERCC1 and EGFR) and 1 test in breast cancer (CEA/CA15-3/CA27.29) were identified as top research priorities. Use of a diverse stakeholder group to inform research prioritization is feasible in a pragmatic and timely manner. Additional research is needed to optimize search strategies, stakeholder group composition, and integration with existing prioritization mechanisms.
Automated Generation and Assessment of Autonomous Systems Test Cases
NASA Technical Reports Server (NTRS)
Barltrop, Kevin J.; Friberg, Kenneth H.; Horvath, Gregory A.
2008-01-01
This slide presentation reviews issues concerning verification and validation testing of autonomous spacecraft, which routinely culminates in the exploration of anomalous or faulted mission-like scenarios, using the Dawn mission's tests as examples. Prioritizing which scenarios to develop usually comes down to focusing on the most vulnerable areas and ensuring the best return on investment of test time. Rules-of-thumb strategies often come into play, such as injecting applicable anomalies prior to, during, and after system state changes, or creating cases that ensure good safety-net algorithm coverage. Although experience and judgment in test selection can lead to high levels of confidence about the majority of a system's autonomy, it is likely that important test cases are overlooked. One method to fill in potential test coverage gaps is to automatically generate and execute test cases using algorithms that ensure desirable properties about the coverage, for example, generating cases for all possible fault monitors and across all state change boundaries. Of course, the scope of coverage is determined by the test environment capabilities, where a faster-than-real-time, high-fidelity, software-only simulation would allow the broadest coverage. Even real-time systems that can be replicated and run in parallel, and that have reliable set-up and operations features, provide an excellent resource for automated testing. Making detailed predictions for the outcome of such tests can be difficult: when algorithmic means are employed to produce hundreds or even thousands of cases, generating predictions individually is impractical, and generating predictions with tools requires executable models of the design and environment that themselves require a complete test program. Therefore, evaluating the results of a large number of mission scenario tests poses special challenges. A good approach to address this problem is to automatically score the results based on a range of metrics. Although the specific means of scoring depends highly on the application, the use of formal scoring metrics has high value in identifying and prioritizing anomalies and in presenting an overall picture of the state of the test program. In this paper we present a case study based on automatic generation and assessment of faulted test runs for the Dawn mission, and discuss its role in optimizing the allocation of resources for completing the test program.
A Decision Analytic Approach to Exposure-Based Chemical Prioritization
Mitchell, Jade; Pabon, Nicolas; Collier, Zachary A.; Egeghy, Peter P.; Cohen-Hubal, Elaine; Linkov, Igor; Vallero, Daniel A.
2013-01-01
The manufacture of novel synthetic chemicals has increased in volume and variety, but often the environmental and health risks are not fully understood in terms of toxicity and, in particular, exposure. While efforts to assess risks have generally been effective when sufficient data are available, the hazard and exposure data necessary to assess risks adequately are unavailable for the vast majority of chemicals in commerce. The US Environmental Protection Agency has initiated the ExpoCast Program to develop tools for rapid chemical evaluation based on potential for exposure. In this context, a model is presented in which chemicals are evaluated based on inherent chemical properties and behaviorally-based usage characteristics over the chemical’s life cycle. These criteria are assessed and integrated within a decision analytic framework, facilitating rapid assessment and prioritization for future targeted testing and systems modeling. A case study outlines the prioritization process using 51 chemicals. The results show a preliminary relative ranking of chemicals based on exposure potential. The strength of this approach is the ability to integrate relevant statistical and mechanistic data with expert judgment, allowing for an initial tier assessment that can further inform targeted testing and risk management strategies. PMID:23940664
Assessing the Robustness of Chemical Prioritizations Based on ToxCast Chemical Profiling
A central goal of the U.S. EPA’s ToxCast™ program is to provide empirical, scientific evidence to aid in prioritizing the toxicity testing of thousands of chemicals. The agency has developed a prioritization approach, the Toxicological Prioritization Index (ToxPi™), that calculat...
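As the snippet notes, ToxPi combines evidence from multiple data domains into a single prioritization score, essentially by scaling each data "slice" to a common 0–1 range and taking a weighted combination. The sketch below reproduces only that arithmetic with made-up chemicals, slices, and weights; it is a simplified illustration, not the ToxPi software or the EPA's actual scoring.

```python
import numpy as np

# Rows = chemicals, columns = data slices (e.g. bioactivity, chemical properties, exposure).
raw = np.array([
    [0.2, 5.0, 10.0],
    [0.9, 2.0, 40.0],
    [0.5, 8.0,  5.0],
])
weights = np.array([2.0, 1.0, 1.0])   # illustrative slice weights

# Scale each slice to [0, 1] so slices with different native units become comparable.
mins, maxs = raw.min(axis=0), raw.max(axis=0)
scaled = (raw - mins) / np.where(maxs > mins, maxs - mins, 1.0)

# ToxPi-style score: weighted average of the scaled slice values.
scores = scaled @ weights / weights.sum()
ranking = np.argsort(-scores)   # chemicals from highest to lowest priority
print(scores, ranking)
```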
Severin, Franziska; Borry, Pascal; Cornel, Martina C; Daniels, Norman; Fellmann, Florence; Victoria Hodgson, Shirley; Howard, Heidi C; John, Jürgen; Kääriäinen, Helena; Kayserili, Hülya; Kent, Alastair; Koerber, Florian; Kristoffersson, Ulf; Kroese, Mark; Lewis, Celine; Marckmann, Georg; Meyer, Peter; Pfeufer, Arne; Schmidtke, Jörg; Skirton, Heather; Tranebjærg, Lisbeth; Rogowski, Wolf H
2015-01-01
Given the cost constraints of the European health-care systems, criteria are needed to decide which genetic services to fund from the public budgets, if not all can be covered. To ensure that high-priority services are available equitably within and across the European countries, a shared set of prioritization criteria would be desirable. A decision process following the accountability for reasonableness framework was undertaken, including a multidisciplinary EuroGentest/PPPC-ESHG workshop to develop shared prioritization criteria. Resources are currently too limited to fund all the beneficial genetic testing services available in the next decade. Ethically and economically reflected prioritization criteria are needed. Prioritization should be based on considerations of medical benefit, health need and costs. Medical benefit includes evidence of benefit in terms of clinical benefit, benefit of information for important life decisions, benefit for other people apart from the person tested and the patient-specific likelihood of being affected by the condition tested for. It may be subject to a finite time window. Health need includes the severity of the condition tested for and its progression at the time of testing. Further discussion and better evidence are needed before clearly defined recommendations can be made or a prioritization algorithm proposed. To our knowledge, this is the first time a clinical society has initiated a decision process about health-care prioritization on a European level, following the principles of accountability for reasonableness. We provide points to consider to stimulate this debate across the EU and to serve as a reference for improving patient management. PMID:25248395
NASA Astrophysics Data System (ADS)
Yuen, Kevin Kam Fung
2009-10-01
The choice of the most appropriate prioritization method is still one of the unsettled issues of the Analytic Hierarchy Process (AHP), although many studies have been conducted and applied. Interestingly, many AHP applications use only Saaty's Eigenvector method, even though many studies have found that this method may produce rank reversals and have proposed various prioritization methods as alternatives. Some of these methods have been shown to perform better than the Eigenvector method, yet they seem not to attract the attention of researchers. In this paper, eight important prioritization methods are reviewed. A Mixed Prioritization Operators Strategy (MPOS) is developed to select a vector that is prioritized by the most appropriate prioritization operator. To verify this new method, a case study of high school selection is revisited using the proposed method. The contribution is that MPOS is useful for solving prioritization problems in the AHP.
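For readers unfamiliar with the operators being compared, the two most commonly discussed ones are straightforward to state: Saaty's Eigenvector method takes the principal eigenvector of the pairwise comparison matrix as the priority vector, while the geometric-mean (logarithmic least squares) method takes the normalized row geometric means. The sketch below computes both for a small illustrative comparison matrix; it is not an implementation of the paper's MPOS strategy.

```python
import numpy as np

# Illustrative 3x3 reciprocal pairwise comparison matrix (Saaty's 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Eigenvector method: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w_ev = principal / principal.sum()

# Geometric-mean method: normalized row geometric means.
gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
w_gm = gm / gm.sum()

print("eigenvector weights:   ", np.round(w_ev, 3))
print("geometric-mean weights:", np.round(w_gm, 3))
```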
Wu, Un-In; Wang, Jann-Tay; Chang, Shan-Chwen; Chuang, Yu-Chung; Lin, Wei-Ru; Lu, Min-Chi; Lu, Po-Liang; Hu, Fu-Chang; Chuang, Jen-Hsiang; Chen, Yee-Chun
2014-06-01
A multicenter, hospital-wide, clinical and epidemiological study was conducted to assess the effectiveness of the mass influenza vaccination program during the 2009 H1N1 influenza pandemic, and the impact of the prioritization strategy among people at different levels of risk. Among the 34 359 medically attended patients who displayed an influenza-like illness and had a rapid influenza diagnostic test (RIDT) at one of the three participating hospitals, 21.0% tested positive for influenza A. The highest daily number of RIDT-positive cases in each hospital ranged from 33 to 56. A well-fitted multiple linear regression time-series model (R(2)=0.89) showed that the establishment of special community flu clinics averted an average of nine cases daily (p=0.005), and an increment of 10% in daily mean level of population immunity against pH1N1 through vaccination prevented five cases daily (p<0.001). Moreover, the regression model predicted five-fold or more RIDT-positive cases if the mass influenza vaccination program had not been implemented, and 39.1% more RIDT-positive cases if older adults had been prioritized for vaccination above school-aged children. Mass influenza vaccination was an effective control measure, and school-aged children should be assigned a higher priority for vaccination than older adults during an influenza pandemic. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Van Bossuyt, Melissa; Van Hoeck, Els; Raitano, Giuseppa; Manganelli, Serena; Braeken, Els; Ates, Gamze; Vanhaecke, Tamara; Van Miert, Sabine; Benfenati, Emilio; Mertens, Birgit; Rogiers, Vera
2017-04-01
In recent years, more stringent safety requirements for an increasing number of chemicals across many regulatory fields (e.g. industrial chemicals, pharmaceuticals, food, cosmetics, …) have triggered the need for an efficient screening strategy to prioritize the substances of highest concern. In this context, alternative methods such as in silico (i.e. computational) techniques gain more and more importance. In the current study, a new prioritization strategy for identifying potentially mutagenic substances was developed based on the combination of multiple (quantitative) structure-activity relationship ((Q)SAR) tools. Non-evaluated substances used in printed paper and board food contact materials (FCM) were selected for a case study. By applying our strategy, 106 out of the 1723 substances were assigned 'high priority' as they were predicted mutagenic by 4 different (Q)SAR models. Information provided within the models allowed the identification of 53 substances for which in vitro Ames test results were already available to support the mutagenicity prediction. For further prioritization, additional support could be obtained by applying local, i.e. specific, models, as demonstrated here for aromatic azo compounds, which are typically found in printed paper and board FCM. The strategy developed here can easily be applied to other groups of chemicals facing the same need for priority ranking. Copyright © 2017 Elsevier Ltd. All rights reserved.
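The 'high priority' rule described above amounts to consensus voting across several (Q)SAR predictions: a substance is flagged when all four models predict it to be mutagenic. A minimal sketch of that rule follows; the model names, predictions, and the lower-tier cutoffs are placeholders, not output of the actual (Q)SAR tools used in the study.

```python
# Placeholder predictions: True = predicted mutagenic by that (Q)SAR model.
predictions = {
    "substance_1": {"model_A": True,  "model_B": True,  "model_C": True,  "model_D": True},
    "substance_2": {"model_A": True,  "model_B": False, "model_C": True,  "model_D": True},
    "substance_3": {"model_A": False, "model_B": False, "model_C": False, "model_D": False},
}

def priority(votes):
    """Consensus rule: unanimous positive predictions = 'high' priority.
    The 'medium'/'low' cutoffs below are illustrative only."""
    positives = sum(votes.values())
    return "high" if positives == len(votes) else "medium" if positives >= 2 else "low"

for substance, votes in predictions.items():
    print(substance, sum(votes.values()), priority(votes))
```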
EPA is developing methods for utilizing computational chemistry, high-throughput screening (HTS) and genomic technologies to predict potential toxicity and prioritize the use of limited testing resources.
PERCH: A Unified Framework for Disease Gene Prioritization.
Feng, Bing-Jian
2017-03-01
To interpret genetic variants discovered from next-generation sequencing, integration of heterogeneous information is vital for success. This article describes a framework named PERCH (Polymorphism Evaluation, Ranking, and Classification for a Heritable trait), available at http://BJFengLab.org/. It can prioritize disease genes by quantitatively unifying a new deleteriousness measure called BayesDel, an improved assessment of the biological relevance of genes to the disease, a modified linkage analysis, a novel rare-variant association test, and a converted variant call quality score. It supports data that contain various combinations of extended pedigrees, trios, and case-controls, and allows for a reduced penetrance, an elevated phenocopy rate, liability classes, and covariates. BayesDel is more accurate than PolyPhen2, SIFT, FATHMM, LRT, Mutation Taster, Mutation Assessor, PhyloP, GERP++, SiPhy, CADD, MetaLR, and MetaSVM. The overall approach is faster and more powerful than the existing quantitative method pVAAST, as shown by the simulations of challenging situations in finding the missing heritability of a complex disease. This framework can also classify variants of uncertain significance by quantitatively integrating allele frequencies, deleteriousness, association, and co-segregation. PERCH is a versatile tool for gene prioritization in gene discovery research and variant classification in clinical genetic testing. © 2016 The Authors. Human Mutation published by Wiley Periodicals, Inc.
Exposure Science for Chemical Prioritization and Toxicity Testing
Currently, a significant research effort is underway to apply new technologies to screen and prioritize chemicals for toxicity testing as well as to improve understanding of toxicity pathways (Dix et al. 2007, Toxicol Sci; NRC, 2007, Toxicity Testing in the 21st Century; Collins ...
ExpoCast: Exposure Science for Prioritization and Toxicity Testing (S)
The US EPA is completing the Phase I pilot for a chemical prioritization research program, called ToxCast. Here EPA is developing methods for using computational chemistry, high-throughput screening, and toxicogenomic technologies to predict potential toxicity and prioritize limi...
ExpoCast: Exposure Science for Prioritization and Toxicity Testing
The US EPA is completing the Phase I pilot for a chemical prioritization research program, called ToxCastTM. Here EPA is developing methods for using computational chemistry, high-throughput screening, and toxicogenomic technologies to predict potential toxicity and prioritize l...
THE TOXCAST PROGRAM FOR PRIORITIZING TOXICITY TESTING OF ENVIRONMENTAL CHEMICALS
The United States Environmental Protection Agency (EPA) is developing methods for utilizing computational chemistry, high-throughput screening (HTS) and various toxicogenomic technologies to predict potential for toxicity and prioritize limited testing resources towards chemicals...
One use of alternative methods is to target animal use at only those chemicals and tests that are absolutely necessary. We discuss prioritization of testing based on high-throughput screening assays (HTS), QSAR modeling, high-throughput toxicokinetics (HTTK), and exposure modelin...
Most of the over 2800 nanomaterials (NMs) in commerce lack hazard data. Efficient NM testing requires suitable toxicity tests for prioritization of NMs to be tested. The EPA’s ToxCast program is evaluating HTS assays to prioritize NMs for targeted testing. Au, Ag, CeO2, Cu(O2), T...
Prioritizing ToxCast Chemicals Across Multiple Sectors of Toxicity Using ToxPi
The Toxicological Prioritization Index (ToxPi™) framework was developed as a decision-support tool to aid in the rational prioritization of chemicals for integrated toxicity testing. ToxPi consolidates information from multiple domains—including ToxCast™ in vitro bioactivity prof...
EPA's National Center for Computational Toxicology is developing methods that apply computational chemistry, high-throughput screening (HTS) and genomic technologies to predict potential toxicity and prioritize the use of limited testing resources.
EPA is developing methods for utilizing computational chemistry, high-throughput screening (HTS) and various toxicogenomic technologies to predict potential for toxicity and prioritize limited testing resources towards chemicals that likely represent the greatest hazard to human ...
The Toxicological Prioritization Index (ToxPi™) framework was developed as a decision-support tool to aid in the prioritization of chemicals for integrated toxicity testing. ToxPi consolidates information from multiple domains - including ToxCast™ in vitro bioactivity profiles (a...
One of the strategic objectives of the Computational Toxicology Program is to develop approaches for prioritizing chemicals for subsequent screening and testing. Approaches currently available for this process require extensive resources. Therefore, less costly and time-extensi...
A model for prioritizing landfills for remediation and closure: A case study in Serbia.
Ubavin, Dejan; Agarski, Boris; Maodus, Nikola; Stanisavljevic, Nemanja; Budak, Igor
2018-01-01
The existence of large numbers of landfills that do not fulfill sanitary prerequisites presents a serious hazard for the environment in lower income countries. One of the main hazards is landfill leachate that contains various pollutants and presents a threat to groundwater. Groundwater pollution from landfills depends on various mutually interconnected factors such as the waste type and amount, the amount of precipitation, the landfill location characteristics, and operational measures, among others. Considering these factors, lower income countries face a selection problem where landfills urgently requiring remediation and closure must be identified from among a large number of sites. The present paper proposes a model for prioritizing landfills for closure and remediation based on multicriteria decision making, in which the hazards of landfill groundwater pollution are evaluated. The parameters for the prioritization of landfills are the amount of waste disposed, the amount of precipitation, the vulnerability index, and the rate of increase of the amount of waste in the landfill. Verification was performed using a case study in Serbia where all municipal landfills were included and 128 landfills were selected for prioritization. The results of the evaluation of Serbian landfills, prioritizing sites for closure and remediation, are presented for the first time. Critical landfills are identified, and prioritization ranks for the selected landfills are provided. Integr Environ Assess Manag 2018;14:105-119. © 2017 SETAC.
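The abstract names four prioritization parameters but not the exact multicriteria aggregation, so the sketch below uses a common baseline for such problems: min–max normalization of each criterion followed by a weighted sum. The landfill data and equal weights are invented, and the snippet should be read as a generic illustration of multicriteria ranking rather than the authors' model.

```python
import numpy as np

# Invented data: rows = landfills; columns = waste amount (t), annual precipitation (mm),
# groundwater vulnerability index, rate of increase of disposed waste.
criteria = np.array([
    [120_000, 650, 0.7, 0.04],
    [ 45_000, 900, 0.9, 0.10],
    [300_000, 500, 0.4, 0.02],
])
weights = np.full(4, 0.25)   # equal weights as a placeholder

# Min-max normalize each criterion (all treated here as "higher = more hazardous").
mins, maxs = criteria.min(axis=0), criteria.max(axis=0)
norm = (criteria - mins) / (maxs - mins)

scores = norm @ weights
order = np.argsort(-scores)   # highest score = most urgent for remediation and closure
print(scores, order)
```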
Kaplan, Barbara; Ura, Darla
2010-07-01
The student clinical experience is rich, yet challenges arise in providing experiences where leadership skills can be developed and used in nursing practice. To increase student confidence and enhance student ability to safely and effectively prioritize, delegate, and implement care for numerous patients, a simulation-based learning (SBL) experience was developed. The SBL experience involves multiple patient simulators, case study analysis, and a debriefing session. Ninety-seven senior nursing students participated in this program. In Likert surveys, students either "agreed" or "strongly agreed" that the SBL was well organized (87%, n = 84), that it prompted realistic expectations (59%, n = 57), that the scenarios were believable (73%, n = 71), that the case studies increased understanding (66%, n = 64), and that the SBL experience increased understanding of prioritizing and delegating care (69%, n = 67). Seventy-eight percent (n = 76) reported "more confidence in ability to work as a team" and 55% (n = 52) reported "more confidence in prioritizing and delegating care." Copyright 2010, SLACK Incorporated.
Settling the score: variant prioritization and Mendelian disease
Eilbeck, Karen; Quinlan, Aaron; Yandell, Mark
2018-01-01
When investigating Mendelian disease using exome or genome sequencing, distinguishing disease-causing genetic variants from the multitude of candidate variants is a complex, multidimensional task. Many prioritization tools and online interpretation resources exist, and professional organizations have offered clinical guidelines for review and return of prioritization results. In this Review, we describe the strengths and weaknesses of widely used computational approaches, explain their roles in the diagnostic and discovery process and discuss how they can inform (and misinform) expert reviewers. We place variant prioritization in the wider context of gene prioritization, burden testing and genotype–phenotype association, and we discuss opportunities and challenges introduced by whole-genome sequencing. PMID:28804138
DOT National Transportation Integrated Search
2011-12-01
Several agencies are applying asset management principles as a business tool and paradigm to help them define goals and prioritize agency resources in decision making. Previously, transportation asset management (TAM) has focused more on big ticke...
High Throughput Prioritization for Integrated Toxicity Testing Based on ToxCast Chemical Profiling
The rational prioritization of chemicals for integrated toxicity testing is a central goal of the U.S. EPA’s ToxCast™ program (http://epa.gov/ncct/toxcast/). ToxCast includes a wide-ranging battery of over 500 in vitro high-throughput screening assays which in Phase I was used to...
High-throughput, lower-cost, in vitro toxicity testing is currently being evaluated for use in prioritization and eventually for predicting in vivo toxicity. Interpreting in vitro data in the context of in vivo human relevance remains a formidable challenge. A key component in us...
Hmielowski, Tracy L; Carter, Sarah K; Spaul, Hannah; Helmers, David; Radeloff, Volker C; Zedler, Paul
2016-06-01
One challenge in the effort to conserve biodiversity is identifying where to prioritize resources for active land management. Cost-benefit analyses have been used successfully as a conservation tool to identify sites that provide the greatest conservation benefit per unit cost. Our goal was to apply cost-benefit analysis to the question of how to prioritize land management efforts, in our case the application of prescribed fire to natural landscapes in Wisconsin, USA. We quantified and mapped frequently burned communities and prioritized management units based on a suite of indices that captured ecological benefits, management effort, and the feasibility of successful long-term management actions. Data for these indices came from LANDFIRE, Wisconsin's Wildlife Action Plan, and a nationwide wildland-urban interface assessment. We found that the majority of frequently burned vegetation types occurred in the southern portion of the state. However, the highest priority areas for applying prescribed fire occurred in the central, northwest, and northeast portion of the state where frequently burned vegetation patches were larger and where identified areas of high biological importance occurred. Although our focus was on the use of prescribed fire in Wisconsin, our methods can be adapted to prioritize other land management activities. Such prioritization is necessary to achieve the greatest possible benefits from limited funding for land management actions, and our results show that it is feasible at scales that are relevant for land management decisions.
Using needs-based frameworks for evaluating new technologies: an application to genetic tests.
Rogowski, Wolf H; Schleidgen, Sebastian
2015-02-01
Given the multitude of newly available genetic tests in the face of limited healthcare budgets, the European Society of Human Genetics assessed how genetic services can be prioritized fairly. Using (health) benefit maximizing frameworks for this purpose has been criticized on the grounds that rather than maximization, fairness requires meeting claims (e.g. based on medical need) equitably. This study develops a prioritization score for genetic tests to facilitate equitable allocation based on need-based claims. It includes attributes representing health need associated with hereditary conditions (severity and progression), a genetic service's suitability to alleviate need (evidence of benefit and likelihood of positive result) and costs to meet the needs. A case study for measuring the attributes is provided and a suggestion is made how need-based claims can be quantified in a priority function. Attribute weights can be informed by data from discrete-choice experiments. Further work is needed to measure the attributes across the multitude of genetic tests and to determine appropriate weights. The priority score is most likely to be considered acceptable if developed within a decision process which meets criteria of procedural fairness and if the priority score is interpreted as "strength of recommendation" rather than a fixed cut-off value. Copyright © 2014. Published by Elsevier Ireland Ltd.
Harries, Priscilla; Tomlinson, Christopher; Notley, Elizabeth; Davies, Miranda; Gilhooly, Kenneth
2012-01-01
In the community mental health field, occupational therapy students lack the capacity to prioritize referrals effectively. The purpose of this study was to test the effectiveness of a clinical decision-training aid on referral prioritization capacity. A double-blind, parallel-group, randomized controlled trial was conducted using a judgment analysis approach. Each participant used the World Wide Web to prioritize referral sets at baseline, immediate posttest, and 2-wk follow-up. The intervention group was provided with training after baseline testing; the control group was simply given instructions to continue with the task. One hundred sixty-five students were randomly allocated to intervention (n = 87) or control (n = 81). The intervention consisted of written and graphical descriptions of an expert consensus standard explaining how referral information should be used to prioritize referrals. Participants' prioritization ratings were correlated with the experts' ratings of the same referrals at each stage of testing, and the effects on mean group scores, regression weights, and the lens model indices were examined. At baseline, no differences were found between control and intervention on rating capacity or demographic characteristics. The difference in mean correlation scores between the control and intervention groups from baseline to immediate posttest was statistically significant and was maintained at 2-wk follow-up. The effect size was classified as large. At immediate posttest and follow-up, the intervention group improved rating capacity, whereas the control group's capacity remained poor. The results of this study indicate that the decision-training aid has a positive effect on referral prioritization capacity. This freely available, Web-based decision-training aid will be a valuable adjunct to the education of these novice health professionals internationally.
Sutton, N J; Armsworth, P R
2014-12-01
Facing tight resource constraints, conservation organizations must allocate funds available for habitat protection as effectively as possible. Often, they combine spatially referenced economic and biodiversity data to prioritize land for protection. We tested how sensitive these prioritizations could be to differences in the spatial grain of these data by demonstrating how the conclusion of a classic debate in conservation planning between cost and benefit targeting was altered based on the available information. As a case study, we determined parcel-level acquisition costs and biodiversity benefits of land transactions recently undertaken by a nonprofit conservation organization that seeks to protect forests in the eastern United States. Then, we used hypothetical conservation plans to simulate the types of ex ante priorities that an organization could use to prioritize areas for protection. We found the apparent effectiveness of cost and benefit targeting depended on the spatial grain of the data used when prioritizing parcels based on local species richness. However, when accounting for complementarity, benefit targeting consistently was more efficient than a cost targeting strategy regardless of the spatial grain of the data involved. More pertinently for other studies, we found that combining data collected over different spatial grains inflated the apparent effectiveness of a cost targeting strategy and led to overestimation of the efficiency gain offered by adopting a more integrative return-on-investment approach. © 2014 Society for Conservation Biology.
Case study: Prioritization strategies for reforestation of minelands to benefit Cerulean Warblers
McDermott, Molly E.; Shumar, Matthew B.; Wood, Petra Bohall
2013-01-01
The central Appalachian landscape is being heavily altered by surface coal mining. The practice of Mountaintop Removal/Valley Fill (MTRVF) mining has transformed large areas of mature forest to non-forest and created much forest edge, affecting habitat quality for mature forest wildlife. The Appalachian Regional Reforestation Initiative is working to restore mined areas to native hardwood forest conditions, and strategies are needed to prioritize restoration efforts for wildlife. We present mineland reforestation guidelines for the imperiled Cerulean Warbler, considered a useful umbrella species, in its breeding range. In 2009, we surveyed forest predicted to have Cerulean Warblers near mined areas in the MTRVF region of West Virginia and Kentucky. We visited 36 transect routes and completed songbird surveys on 151 points along these routes. Cerulean Warblers were present at points with fewer large-scale canopy disturbances and more mature oak-hickory forest. We tested the accuracy of a predictive map for this species and demonstrated that it can be useful to guide reforestation efforts. We then developed a map of hot spot locations that can be used to determine potential habitat suitability. Restoration efforts would have greatest benefit for Cerulean Warblers and other mature forest birds if concentrated near a relative-abundance hot spot, on north- and east-facing ridgetops surrounded by mature deciduous forest, and prioritized to reduce edges and connect isolated forest patches. Our multi-scale approach for prioritizing restoration efforts using an umbrella species may be applied to restore habitat impacted by a variety of landscape disturbances.
Care in post-traumatic syndrome due to gender violence: a case report.
Sánchez-Herrero, Héctor; Duarte-Clíments, Gonzalo; González-Pérez, Teodoro; Sánchez-Gómez, María Begoña; Gomariz-Bolarín, David
This article describes a clinical case of a patient attended at a continuous care point for a generalized anxiety disorder, principally due to abuse suffered from her ex-partner. The patient was followed up at a family nursing clinic, and the appropriate nursing interventions were developed to cover a series of needs prioritized by nurses using the AREA method, taking into account the prioritization of the user herself. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.
Component Prioritization Schema for Achieving Maximum Time and Cost Benefits from Software Testing
NASA Astrophysics Data System (ADS)
Srivastava, Praveen Ranjan; Pareek, Deepak
Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results. Defining the end of software testing is a crucial feature of any software development project. A premature release will involve risks like undetected bugs, cost of fixing faults later, and discontented customers. Any software organization would want to achieve maximum possible benefits from software testing with minimum resources. Testing time and cost need to be optimized for achieving a competitive edge in the market. In this paper, we propose a schema, called the Component Prioritization Schema (CPS), to achieve an effective and uniform prioritization of the software components. This schema serves as an extension to the Non-Homogeneous Poisson Process-based Cumulative Priority Model. We also introduce an approach for handling time-intensive versus cost-intensive projects.
Kundrick, Avery; Huang, Zhuojie; Carran, Spencer; Kagoli, Matthew; Grais, Rebecca Freeman; Hurtado, Northan; Ferrari, Matthew
2018-06-15
Despite progress towards increasing global vaccination coverage, measles continues to be one of the leading, preventable causes of death among children worldwide. Whether and how to target sub-national areas for vaccination campaigns remains an open question. We analyzed three metrics for prioritizing target areas: vaccination coverage, susceptible birth cohort, and the effective reproductive ratio (R_E) in the context of the 2010 measles epidemic in Malawi. Using case-based surveillance data from the 2010 measles outbreak in Malawi, we estimated vaccination coverage from the proportion of cases reporting a history of prior vaccination at the district and health facility catchment scale. Health facility catchments were defined as the set of locations closer to a given health facility than to any other. We combined these estimates with regional birth rates to estimate the size of the annual susceptible birth cohort. We also estimated the effective reproductive ratio, R_E, at the health facility polygon scale based on the observed rate of exponential increase of the epidemic. We combined these estimates to identify spatial regions that would be of high priority for supplemental vaccination activities. The estimated vaccination coverage across all districts was 84%, but ranged from 61 to 99%. We found that 8 districts and 354 health facility catchments had estimated vaccination coverage below 80%. Areas with the highest birth cohort size were frequently large urban centers that had high vaccination coverage. The estimated R_E ranged between 1 and 2.56. The ranking of districts and health facility catchments as priority areas varied depending on the measure used. Each metric for prioritization may result in discrete target areas for vaccination campaigns; thus, there are tradeoffs to choosing one metric over another. However, in some cases, certain areas may be prioritized by all three metrics. These areas should be treated with particular concern. Furthermore, the spatial scale at which each metric is calculated impacts the resulting prioritization and should also be considered when prioritizing areas for vaccination campaigns. These methods may be used to allocate effort for prophylactic campaigns or to prioritize outbreak response vaccination.
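Two of the metrics above can be back-calculated with standard epidemiological approximations: population vaccination coverage from the share of cases reporting prior vaccination (the "screening method", which requires an assumed vaccine effectiveness), and the effective reproductive ratio from the observed exponential growth rate (here via the simple R_E ≈ 1 + rG approximation for mean generation time G). These are textbook formulas offered for orientation with invented inputs, not necessarily the exact estimators used in the study.

```python
def coverage_from_cases(prop_cases_vaccinated, vaccine_effectiveness):
    """Screening method: back out population coverage from the share of
    vaccinated cases, assuming a known vaccine effectiveness."""
    ppv, ve = prop_cases_vaccinated, vaccine_effectiveness
    return ppv / (ppv + (1 - ve) * (1 - ppv))

def reproductive_ratio(growth_rate_per_day, generation_time_days):
    """Simple linear approximation R_E ~ 1 + r*G for an exponentially growing outbreak."""
    return 1.0 + growth_rate_per_day * generation_time_days

# Invented example values.
print(round(coverage_from_cases(prop_cases_vaccinated=0.40, vaccine_effectiveness=0.85), 2))
print(round(reproductive_ratio(growth_rate_per_day=0.05, generation_time_days=12), 2))
```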
Li, Zhongshan; Liu, Zhenwei; Jiang, Yi; Chen, Denghui; Ran, Xia; Sun, Zhong Sheng; Wu, Jinyu
2017-01-01
Exome sequencing has been widely used to identify the genetic variants underlying human genetic disorders for clinical diagnoses, but identifying pathogenic sequence variants among the huge number of benign ones is complicated and challenging. Here, we describe a new Web server named mirVAFC for pathogenic sequence variant prioritization from clinical exome sequencing (CES) variant data of a single individual or family. The mirVAFC is able to comprehensively annotate sequence variants, filter out most irrelevant variants using custom criteria, classify variants into different categories according to estimated pathogenicity, and lastly provide pathogenic variant prioritizations based on the classifications and mutation effects. Case studies using different types of datasets for different diseases, drawn from publications and our in-house data, have revealed that mirVAFC can efficiently identify the correct pathogenic candidates reported in the original work in each case. Overall, the Web server mirVAFC is specifically developed for pathogenic sequence variant identification from family-based CES variants using classification-based prioritization. The mirVAFC Web server is freely accessible at https://www.wzgenomics.cn/mirVAFC/. © 2016 WILEY PERIODICALS, INC.
ToxCast: Developing Predictive Signatures of Chemically Induced Toxicity (S)
ToxCast, the United States Environmental Protection Agency’s chemical prioritization research program, is developing methods for utilizing computational chemistry, bioactivity profiling and toxicogenomic data to predict potential for toxicity and prioritize limited testing resour...
BUTIMBA: Intensifying the Hunt for Child TB in Swaziland through Household Contact Tracing
Alonso Ustero, Pilar; Golin, Rachel; Anabwani, Florence; Mzileni, Bulisile; Sikhondze, Welile; Stevens, Robert
2017-01-01
Background: Limited data exist to inform contact tracing guidelines in children and HIV-affected populations. We evaluated the yield and additionality of household contact and source case investigations in Swaziland, a TB/HIV high-burden setting, while prioritizing identification of childhood TB. Methods: In partnership with 7 local TB clinics, we implemented standardized contact tracing of index cases (IC) receiving TB treatment. Prioritizing child contacts and HIV-affected households, screening officers screened contacts for TB symptoms and for risk factors associated with TB. We ascertained factors moderating the yield of contact tracing and measured the impact of our program by additional notifications. Results: From March 2013 to November 2015, 3,258 ICs (54% bacteriologically confirmed; 70% HIV-infected; 85% adults) were enrolled, leading to evaluation of 12,175 contacts (median age 18 years, IQR 24–42; 45% children; 9% HIV-infected). Among contacts, 196 TB cases (56% bacteriologically confirmed) were diagnosed, resulting in a program yield of 1.6% for all forms of TB. The number needed to screen (NNS) to identify a bacteriologically confirmed TB case or an all-forms TB case traced from a child IC <5 years was respectively 62% and 40% greater than the NNS for tracing from an adult IC. In year one, we demonstrated a 32% increase in detection of bacteriologically confirmed child TB. Contacts were more likely to have TB if <5 years (OR = 2.0), HIV-infected (OR = 4.9), reporting ≥1 TB symptoms (OR = 7.7), and sharing a bed (OR = 1.7) or home (OR = 1.4) with the IC. There was a 1.4-fold increased chance of detecting a TB case in households known to be HIV-affected. Conclusion: Contact tracing prioritizing children is not only feasible in a TB/HIV high-burden setting but contributes to overall case detection. Our findings support WHO guidelines prioritizing contact tracing among children and HIV-infected populations while highlighting the potential to integrate TB and HIV case finding. PMID:28107473
Gee, Melanie; Bhanbhro, Sadiq; Cook, Sarah; Killaspy, Helen
2017-08-01
The aim of this study was to identify the factors contributing to lasting change in practice following a recovery-based training intervention for inpatient mental health rehabilitation staff. Staff training may help nurses and other staff groups in inpatient mental health rehabilitative settings to increase their recovery-oriented practice. There are no published reviews on the effectiveness of such training and few long-term evaluations. This review informed a realist evaluation of a specific intervention (GetREAL). Rapid realist review methodology was used to generate and prioritize programme theories. ASSIA, CINAHL, Cochrane Library, Medline, PsycINFO, Scopus, Web of Science and grey literature searches were performed between September 2014 and March 2015 with no date restrictions. Stakeholders suggested further documents. GetREAL project documentation was consulted. Programme theory development took place iteratively with literature identification. Stakeholders validated and prioritized emerging programme theories and the prioritized theories were refined using literature case studies. Fifty-one relevant documents fed into 49 programme theories articulating seven mechanisms for lasting change. Prioritized mechanisms were: staff receptiveness to change; and staff feeling encouraged, motivated and supported by colleagues and management to change. Seven programme theories were prioritized and refined using data from four case studies. Lasting change can be facilitated by collaborative action planning, regular collaborative meetings, appointing a change agent, explicit management endorsement and prioritization and modifying organizational structures. Conversely, a challenging organizational climate, or a prevalence of 'change fatigue', may block change. Pre-intervention exploration may help identify any potential barriers to embedding recovery in the organizational culture. © 2016 John Wiley & Sons Ltd.
Prioritized List of Research Needs to support MRWFD Case Study Flowsheet Advancement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Law, Jack Douglas; Soelberg, Nicholas Ray
In FY-13, a case study evaluation was performed of full recycle technologies for both the processing of light-water reactor (LWR) used nuclear fuels as well as fast reactor (FR) fuel in the full recycle option. This effort focused on the identification of the case study processes and the initial preparation of material balance flowsheets for the identified technologies. In identifying the case study flowsheets, it was decided that two cases would be developed: one which identifies the flowsheet as currently developed and another near-term target flowsheet which identifies the flowsheet as envisioned within two years, pending the results of ongoing research. The case study focus is on homogeneous aqueous recycle of the U/TRU resulting from the processing of LWR fuel as feed for metal fuel fabrication. The metal fuel is utilized in a sodium-cooled fast reactor, and the used fast reactor fuel is processed using electrochemical separations. The recovered U/TRU from electrochemical separations is recycled to fuel fabrication and the fast reactor. Waste streams from the aqueous and electrochemical processing are treated and prepared for disposition. Off-gas from the separations and waste processing is also treated. As part of the FY-13 effort, preliminary process unknowns and research needs to advance the near-term target flowsheets were identified. In FY-14, these research needs were updated, expanded and prioritized. This report again updates the prioritized list of research needs based upon results to date in FY-15. The research needs are listed for each of the main portions of the flowsheet: 1) Aqueous headend, 2) Headend tritium pretreatment off-gas, 3) Aqueous U/Pu/Np recovery, 4) Aqueous TRU product solidification, 5) Aqueous actinide/lanthanide separation, 6) Aqueous off-gas treatment, 7) Aqueous HLW management, 8) Treatment of aqueous process wastes, 9) E-chem actinide separations, 10) E-chem off-gas, 11) E-chem HLW management. The identified research needs were prioritized within each of these areas. No effort was made to perform an overall prioritization. This information will be used by the MRWFD Campaign leadership in research planning for FY-16. Additionally, this information will be incorporated into the next version of the Case Study Report scheduled to be issued September 2015.
ToxCast: Using high throughput screening to identify profiles of biological activity
ToxCast, the United States Environmental Protection Agency’s chemical prioritization research program, is developing methods for utilizing computational chemistry and bioactivity profiling to predict potential for toxicity and prioritize limited testing resources (www.epa.gov/toc...
Applications of high throughput screening to identify profiles of biological activity
ToxCast, the United States Environmental Protection Agency’s chemical prioritization research program, is developing methods for utilizing computational chemistry and bioactivity profiling to predict potential for toxicity and prioritize limited testing resources (www.epa.gov/toc...
Predictive In Vitro Screening of Environmental Chemicals – The ToxCast Project
ToxCast, the United States Environmental Protection Agency’s chemical prioritization research program, is developing methods for utilizing computational chemistry and bioactivity profiling to predict potential for toxicity and prioritize limited testing resources (www.epa.gov/toc...
Endocrine Profiling and Prioritization Using ToxCast Assays
The U.S. EPA's Endocrine Disruptor Screening Program (EDSP) is charged with screening pesticide chemicals and environmental contaminants for their potential to affect the endocrine systems of humans and wildlife (http://www.epa.gov/endo/). The prioritization of chemicals for test...
ExpoCast: Exposure Science for Prioritization and Toxicity Testing (T)
The US EPA National Center for Computational Toxicology (NCCT) has a mission to integrate modern computing and information technology with molecular biology to improve Agency prioritization of data requirements and risk assessment of chemicals. Recognizing the critical need for ...
A predictive data-driven framework for endocrine prioritization: a triazole fungicide case study.
Paul Friedman, Katie; Papineni, Sabitha; Marty, M Sue; Yi, Kun Don; Goetz, Amber K; Rasoulpour, Reza J; Kwiatkowski, Pat; Wolf, Douglas C; Blacker, Ann M; Peffer, Richard C
2016-10-01
The US Environmental Protection Agency Endocrine Disruptor Screening Program (EDSP) is a tiered screening approach to determine the potential for a chemical to interact with estrogen, androgen, or thyroid hormone systems and/or perturb steroidogenesis. Use of high-throughput screening (HTS) to predict hazard and exposure is shifting the EDSP approach to (1) prioritization of chemicals for further screening; and (2) targeted use of EDSP Tier 1 assays to inform specific data needs. In this work, toxicology data for three triazole fungicides (triadimefon, propiconazole, and myclobutanil) were evaluated, including HTS results, EDSP Tier 1 screening (and other scientifically relevant information), and EPA guideline mammalian toxicology study data. The endocrine-related bioactivity predictions from HTS and information that satisfied the EDSP Tier 1 requirements were qualitatively concordant. Current limitations in the available HTS battery for thyroid and steroidogenesis pathways were mitigated by inclusion of guideline toxicology studies in this analysis. Similar margins (3-5 orders of magnitude) were observed between HTS-predicted human bioactivity and exposure values and between in vivo mammalian bioactivity and EPA chronic human exposure estimates for these products' registered uses. Combined HTS hazard and human exposure predictions suggest low priority for higher-tiered endocrine testing of these triazoles. Comparison with the mammalian toxicology database indicated that this HTS-based prioritization would have been protective for any potential in vivo effects that form the basis of current risk assessment for these chemicals. This example demonstrates an effective, human health protective roadmap for EDSP evaluation of pesticide active ingredients via prioritization using HTS and guideline toxicology information.
A predictive data-driven framework for endocrine prioritization: a triazole fungicide case study
Paul Friedman, Katie; Papineni, Sabitha; Marty, M. Sue; Yi, Kun Don; Goetz, Amber K.; Rasoulpour, Reza J.; Kwiatkowski, Pat; Wolf, Douglas C.; Blacker, Ann M.; Peffer, Richard C.
2016-01-01
Abstract The US Environmental Protection Agency Endocrine Disruptor Screening Program (EDSP) is a tiered screening approach to determine the potential for a chemical to interact with estrogen, androgen, or thyroid hormone systems and/or perturb steroidogenesis. Use of high-throughput screening (HTS) to predict hazard and exposure is shifting the EDSP approach to (1) prioritization of chemicals for further screening; and (2) targeted use of EDSP Tier 1 assays to inform specific data needs. In this work, toxicology data for three triazole fungicides (triadimefon, propiconazole, and myclobutanil) were evaluated, including HTS results, EDSP Tier 1 screening (and other scientifically relevant information), and EPA guideline mammalian toxicology study data. The endocrine-related bioactivity predictions from HTS and information that satisfied the EDSP Tier 1 requirements were qualitatively concordant. Current limitations in the available HTS battery for thyroid and steroidogenesis pathways were mitigated by inclusion of guideline toxicology studies in this analysis. Similar margins (3–5 orders of magnitude) were observed between HTS-predicted human bioactivity and exposure values and between in vivo mammalian bioactivity and EPA chronic human exposure estimates for these products’ registered uses. Combined HTS hazard and human exposure predictions suggest low priority for higher-tiered endocrine testing of these triazoles. Comparison with the mammalian toxicology database indicated that this HTS-based prioritization would have been protective for any potential in vivo effects that form the basis of current risk assessment for these chemicals. This example demonstrates an effective, human health protective roadmap for EDSP evaluation of pesticide active ingredients via prioritization using HTS and guideline toxicology information. PMID:27347635
Cost-Effective Fuel Treatment Planning
NASA Astrophysics Data System (ADS)
Kreitler, J.; Thompson, M.; Vaillant, N.
2014-12-01
The cost of fighting large wildland fires in the western United States has grown dramatically over the past decade. This trend will likely continue with growth of the WUI into fire prone ecosystems, dangerous fuel conditions from decades of fire suppression, and a potentially increasing effect from prolonged drought and climate change. Fuel treatments are often considered the primary pre-fire mechanism to reduce the exposure of values at risk to wildland fire, and a growing suite of fire models and tools is employed to prioritize where treatments could mitigate wildland fire damages. Assessments using the likelihood and consequence of fire are critical because funds are insufficient to reduce risk on all lands needing treatment, therefore prioritization is required to maximize the effectiveness of fuel treatment budgets. Cost-effectiveness, doing the most good per dollar, would seem to be an important fuel treatment metric, yet studies or plans that prioritize fuel treatments using costs or cost-effectiveness measures are absent from the literature. Therefore, to explore the effect of using costs in fuel treatment planning we test four prioritization algorithms designed to reduce risk in a case study examining fuel treatments on the Sisters Ranger District of central Oregon. For benefits we model sediment retention and standing biomass, and measure the effectiveness of each algorithm by comparing the differences between treatment and no-treatment alternative scenarios. Our objective is to maximize the averted loss of net benefits subject to a representative fuel treatment budget. We model costs across the study landscape using the My Fuel Treatment Planner software, tree list data, local mill prices, and GIS-measured site characteristics. We use fire simulations to generate burn probabilities, and estimate fire intensity as conditional flame length at each pixel. Two prioritization algorithms target treatments based on cost-effectiveness and show improvements over those that use only benefits. Variations across the heterogeneous surfaces of costs and benefits create opportunities for fuel treatments to maximize the expected averted loss of benefits. By targeting these opportunities we demonstrate how incorporating costs in fuel treatment prioritization can improve the outcome of fuel treatment planning.
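A minimal sketch of the cost-effectiveness ranking idea described above, assuming hypothetical treatment units, expected averted losses, costs, and budget; the study's actual sediment, biomass, and burn-probability modeling is not reproduced:

```python
# Rank candidate fuel-treatment units by expected averted loss per dollar,
# then select units greedily until a representative budget is exhausted.
units = [
    # (unit id, expected averted loss of benefits, treatment cost) -- hypothetical values
    ("A", 120_000, 40_000),
    ("B", 90_000, 20_000),
    ("C", 50_000, 35_000),
    ("D", 30_000, 5_000),
]
budget = 60_000

# Cost-effectiveness = expected averted loss per dollar spent.
ranked = sorted(units, key=lambda u: u[1] / u[2], reverse=True)

selected, spent = [], 0
for uid, benefit, cost in ranked:
    if spent + cost <= budget:
        selected.append(uid)
        spent += cost

print("Treat units:", selected, "total cost:", spent)
```

Ranking by benefit alone would pick the most expensive unit first; dividing by cost is what lets a limited budget avert more total loss.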
Current environmental monitoring approaches focus primarily on chemical occurrence. However, based on chemical concentration alone, it can be difficult to identify which compounds may be of toxicological concern for prioritization for further monitoring or management. This can be...
ToxCast, the United States Environmental Protection Agency’s chemical prioritization research program, is developing methods for utilizing computational chemistry and bioactivity profiling to predict potential for toxicity and prioritize limited testing resources (www.epa.gov/toc...
Computational Approaches and Tools for Exposure Prioritization and Biomonitoring Data Interpretation
The ability to describe the source-environment-exposure-dose-response continuum is essential for identifying exposures of greater concern to prioritize chemicals for toxicity testing or risk assessment, as well as for interpreting biomarker data for better assessment of exposure ...
Andersen, Melvin E.; Clewell, Harvey J.; Carmichael, Paul L.; Boekelheide, Kim
2013-01-01
The 2007 report “Toxicity Testing in the 21st Century: A Vision and A Strategy” argued for a change in toxicity testing for environmental agents and discussed federal funding mechanisms that could be used to support this transformation within the USA. The new approach would test for in vitro perturbations of toxicity pathways using human cells with high throughput testing platforms. The NRC report proposed a deliberate timeline, spanning about 20 years, to implement a wholesale replacement of current in-life toxicity test approaches focused on apical responses with in vitro assays. One approach to accelerating implementation is to focus on well-studied prototype compounds with known toxicity pathway targets. Through a series of carefully executed case studies with four or five pathway prototypes, the various steps required for implementation of an in vitro toxicity pathway approach to risk assessment could be developed and refined. In this article, we discuss alternative approaches for implementation and also outline advantages of a case study approach and the manner in which the cases studies could be pursued using current methodologies. A case study approach would be complementary to recently proposed efforts to map the human toxome, while representing a significant extension toward more formal risk assessment compared to the profiling and prioritization approaches offered by programs such as the EPA’s ToxCast effort. PMID:21993955
NASA Astrophysics Data System (ADS)
Tarigan, A. P. M.; Rahmad, D.; Sembiring, R. A.; Iskandar, R.
2018-02-01
This paper illustrates an application of the Analytical Hierarchy Process (AHP) as a potential decision-making method in water resource management related to drainage rehabilitation. The prioritization problem of urban drainage rehabilitation in Medan City under a limited budget is used as a case study. A hierarchical structure is formed for the prioritization criteria and the alternative drainages to be rehabilitated. Based on the AHP, the prioritization criteria are ranked and a descending-order list of drainages is made in order to select the most favorable drainages for rehabilitation. A sensitivity analysis is then conducted to check the consistency of the final decisions in case of minor changes in judgements. The results of the AHP computed manually are compared with those obtained using the software Expert Choice. It is observed that the top three ranked drainages are consistent, and both sets of AHP results, calculated manually and computed with Expert Choice, are in agreement. It is hoped that the application of the AHP will support the city government's decision-making process in the problem of urban drainage rehabilitation.
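A minimal sketch of the AHP priority-vector and consistency calculation referred to above; the pairwise comparison matrix is hypothetical, not the one used for Medan City:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three prioritization criteria
# (Saaty's 1-9 scale); A[i, j] is the judged importance of criterion i over j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority vector: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = 0.58                      # Saaty's random index for n = 3
cr = ci / ri
print("weights:", np.round(w, 3), "CR:", round(cr, 3))  # CR < 0.1 => judgements acceptably consistent
```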
These novel modeling approaches for screening, evaluating and classifying chemicals based on the potential for biologically-relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. The new modeling approach is derived from the Stocha...
Exposure Considerations for Chemical Prioritization and Toxicity Testing
Globally there is a need to characterize potential risk to human health and the environment that arises from the manufacture and use of tens of thousands of chemicals. Currently, a significant research effort is underway to apply new technologies to screen and prioritize chemica...
ToxCast, the United States Environmental Protection Agency’s chemical prioritization research program, is developing methods for utilizing computational chemistry, bioactivity profiling and toxicogenomic data to predict potential for toxicity and prioritize limited testing resour...
A Survey of UML Based Regression Testing
NASA Astrophysics Data System (ADS)
Fahad, Muhammad; Nadeem, Aamer
Regression testing is the process of ensuring software quality by analyzing whether changed parts behave as intended and unchanged parts are not affected by the modifications. Since it is a costly process, many techniques have been proposed in the research literature to guide testers in building a regression test suite from an existing test suite at minimum cost. In this paper, we discuss the advantages and drawbacks of using UML diagrams for regression testing and show that UML models help in identifying changes for regression test selection effectively. We survey the existing UML-based regression testing techniques and provide an analysis matrix to give a quick insight into the prominent features of the surveyed work. We also discuss open research issues that remain to be addressed for UML-based regression testing, such as managing and reducing the size of the regression test suite and prioritizing test cases, which is helpful under tight schedules and limited resources.
Testing the idea of privileged awareness of self-relevant information.
Stein, Timo; Siebold, Alisha; van Zoest, Wieske
2016-03-01
Self-relevant information is prioritized in processing. Some have suggested the mechanism driving this advantage is akin to the automatic prioritization of physically salient stimuli in information processing (Humphreys & Sui, 2015). Here we investigate whether self-relevant information is prioritized for awareness under continuous flash suppression (CFS), as has been found for physical salience. Gabor patches with different orientations were first associated with the labels You or Other. Participants were more accurate in matching the self-relevant association, replicating previous findings of self-prioritization. However, breakthrough into awareness from CFS did not differ between self- and other-associated Gabors. These findings demonstrate that self-relevant information has no privileged access to awareness. Rather than modulating the initial visual processes that precede and lead to awareness, the advantage of self-relevant information may better be characterized as prioritization at later processing stages. (c) 2016 APA, all rights reserved.
An ensemble rank learning approach for gene prioritization.
Lee, Po-Feng; Soo, Von-Wun
2013-01-01
Several different computational approaches have been developed to solve the gene prioritization problem. We use ensemble boosting learning techniques to combine various computational approaches for gene prioritization in order to improve overall performance. In particular, we add a heuristic weighting function to the Rankboost algorithm according to: 1) the absolute ranks generated by the adopted methods for a certain gene, and 2) the ranking relationship between all gene pairs from each prioritization result. We select 13 known prostate cancer genes in the OMIM database as the training set and protein-coding gene data in the HGNC database as the test set. We adopt a leave-one-out strategy for the ensemble rank boosting learning. The experimental results show that our ensemble learning approach outperforms the four gene-prioritization methods in the ToppGene suite in ranking the 13 known genes, in terms of mean average precision, ROC and AUC measures.
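A minimal sketch of the general rank-aggregation idea, combining candidate-gene ranks from several prioritization methods under per-method weights; the genes, ranks, and weights are hypothetical, and this is not the authors' exact Rankboost weighting function:

```python
# Combine candidate-gene ranks from several prioritization methods into one
# consensus ranking. In an ensemble setting, the per-method weights could come
# from boosting rounds or from validation performance.
ranks = {
    # gene: ranks assigned by three hypothetical methods (1 = best)
    "GENE1": [2, 1, 4],
    "GENE2": [1, 3, 1],
    "GENE3": [3, 2, 2],
    "GENE4": [4, 4, 3],
}
method_weights = [0.5, 0.3, 0.2]   # hypothetical per-method weights

def consensus_score(gene_ranks):
    # Lower weighted average rank = higher priority.
    return sum(w * r for w, r in zip(method_weights, gene_ranks))

ordered = sorted(ranks, key=lambda g: consensus_score(ranks[g]))
print(ordered)   # consensus priority order, best candidate first
```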
An alternative is to perform a set of relatively inexpensive and rapid high throughput screening (HTS) assays, derive signatures predictive of effects or modes of chemical toxicity from the HTS data, then use these predictions to prioritize chemicals for more detailed analysis. T...
LaLone, Carlie A.; Berninger, Jason P.; Villeneuve, Daniel L.; Ankley, Gerald T.
2014-01-01
Medicinal innovation has led to the discovery and use of thousands of human and veterinary drugs. With this comes the potential for unintended effects on non-target organisms exposed to pharmaceuticals inevitably entering the environment. The impracticality of generating whole-organism chronic toxicity data representative of all species in the environment has necessitated prioritization of drugs for focused empirical testing as well as field monitoring. Current prioritization strategies typically emphasize likelihood for exposure (i.e. predicted/measured environmental concentrations), while incorporating only rather limited consideration of potential effects of the drug to non-target organisms. However, substantial mammalian pharmacokinetic and mechanism/mode of action (MOA) data are produced during drug development to understand drug target specificity and efficacy for intended consumers. An integrated prioritization strategy for assessing risks of human and veterinary drugs would leverage available pharmacokinetic and toxicokinetic data for evaluation of the potential for adverse effects to non-target organisms. In this review, we demonstrate the utility of read-across approaches to leverage mammalian absorption, distribution, metabolism and elimination data; analyse cross-species molecular target conservation and translate therapeutic MOA to adverse outcome pathway(s) relevant to aquatic organisms as a means to inform prioritization of drugs for focused toxicity testing and environmental monitoring. PMID:25405975
NASA Astrophysics Data System (ADS)
Kim, Y.; Chung, E. S.
2014-12-01
This study suggests a robust prioritization framework for climate change adaptation strategies under multiple climate change scenarios, with a case study of selecting sites for reusing treated wastewater (TWW) in a Korean urban watershed. The framework utilizes various multi-criteria decision making techniques, including the VIKOR method and Shannon entropy-based weights. In this case study, the sustainability of TWW use is quantified using indicator-based approaches within the DPSIR framework, which considers both hydro-environmental and socio-economic aspects of watershed management. Under the various climate change scenarios, the hydro-environmental responses to reusing TWW in potential alternative sub-watersheds are determined using the Hydrologic Simulation Program in Fortran (HSPF). The socio-economic indicators are obtained from statistical databases. Sustainability scores for multiple scenarios are estimated individually and then integrated with the proposed approach. Finally, the suggested framework allows adaptation strategies to be prioritized in a robust manner with varying levels of compromise between utility-based and regret-based strategies.
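A minimal sketch of the Shannon entropy-based criterion weighting mentioned above, computed from a hypothetical decision matrix of alternatives and sustainability indicators; the VIKOR compromise ranking that would follow is not reproduced:

```python
import numpy as np

# Decision matrix: rows = candidate sub-watersheds, columns = sustainability
# indicators (hypothetical positive values; larger = better in this sketch).
X = np.array([
    [0.62, 0.40, 0.75],
    [0.55, 0.80, 0.60],
    [0.70, 0.55, 0.50],
    [0.45, 0.65, 0.85],
])

# Shannon entropy weights: criteria whose values vary more across alternatives
# carry more information and therefore receive larger weights.
P = X / X.sum(axis=0)                        # column-wise proportions
m = X.shape[0]
entropy = -(P * np.log(P)).sum(axis=0) / np.log(m)
divergence = 1.0 - entropy
weights = divergence / divergence.sum()
print("entropy-based criterion weights:", np.round(weights, 3))
```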
Dennis, Jessica; Medina-Rivera, Alejandra; Truong, Vinh; Antounians, Lina; Zwingerman, Nora; Carrasco, Giovana; Strug, Lisa; Wells, Phil; Trégouët, David-Alexandre; Morange, Pierre-Emmanuel; Wilson, Michael D; Gagnon, France
2017-07-01
Tissue factor pathway inhibitor (TFPI) regulates the formation of intravascular blood clots, which manifest clinically as ischemic heart disease, ischemic stroke, and venous thromboembolism (VTE). TFPI plasma levels are heritable, but the genetics underlying TFPI plasma level variability are poorly understood. Herein we report the first genome-wide association scan (GWAS) of TFPI plasma levels, conducted in 251 individuals from five extended French-Canadian families ascertained on VTE. To improve discovery, we also applied a hypothesis-driven (HD) GWAS approach that prioritized single nucleotide polymorphisms (SNPs) in (1) hemostasis pathway genes, and (2) vascular endothelial cell (EC) regulatory regions, which are among the highest expressers of TFPI. Our GWAS identified 131 SNPs with suggestive evidence of association (P-value < 5 × 10⁻⁸), but no SNPs reached the genome-wide threshold for statistical significance. Hemostasis pathway genes were not enriched for TFPI plasma level associated SNPs (global hypothesis test P-value = 0.147), but EC regulatory regions contained more TFPI plasma level associated SNPs than expected by chance (global hypothesis test P-value = 0.046). We therefore stratified our genome-wide SNPs, prioritizing those in EC regulatory regions via stratified false discovery rate (sFDR) control, and reranked the SNPs by q-value. The minimum q-value was 0.27, and the top-ranked SNPs did not show association evidence in the MARTHA replication sample of 1,033 unrelated VTE cases. Although this study did not result in new loci for TFPI, our work lays out a strategy to utilize epigenomic data in prioritization schemes for future GWAS studies. © 2017 WILEY PERIODICALS, INC.
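A minimal sketch of the stratified FDR idea described above: control the false discovery rate separately within the prioritized stratum (SNPs in EC regulatory regions) and the remaining SNPs, then re-rank everything by q-value. The p-values and region flags are hypothetical:

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

# Hypothetical GWAS p-values and a flag marking SNPs in prioritized
# endothelial-cell regulatory regions.
pvals = np.array([1e-6, 3e-4, 0.02, 0.5, 5e-5, 0.08, 0.001, 0.3])
in_ec_region = np.array([True, False, True, False, True, True, False, False])

qvals = np.empty_like(pvals)
for stratum in (True, False):
    mask = in_ec_region == stratum
    # Benjamini-Hochberg correction applied separately within each stratum.
    qvals[mask] = multipletests(pvals[mask], method="fdr_bh")[1]

order = np.argsort(qvals)          # re-rank all SNPs by stratified q-value
print(list(zip(order.tolist(), np.round(qvals[order], 4).tolist())))
```

Because the prioritized stratum contains fewer tests, true signals located there face a lighter multiple-testing burden than under a single genome-wide correction.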
Elms, Heather; Berman, Shawn; Wicks, Andrew C
2002-10-01
This paper utilizes a qualitative case study of the health care industry and a recent legal case to demonstrate that stakeholder theory's focus on ethics, without recognition of the effects of incentives, severely limits the theory's ability to provide managerial direction and explain managerial behavior. While ethics provide a basis for stakeholder prioritization, incentives influence whether managerial action is consistent with that prioritization. Our health care examples highlight this and other limitations of stakeholder theory and demonstrate the explanatory and directive power added by the inclusion of the interactive effects of ethics and incentives in stakeholder ordering.
Environmental risk analysis and prioritization of pharmaceuticals in a developing world context.
Mansour, Fatima; Al-Hindi, Mahmoud; Saad, Walid; Salam, Darine
2016-07-01
The impact of residual pharmaceuticals on the aquatic environment has gained widespread attention over the past years. Various studies have established the occurrence of pharmaceutical compounds in different water bodies throughout the world. In view of the absence of occurrence data in a number of developing world countries, and given the limited availability of analytical resources in these countries, it is prudent to devise site-specific methodologies to prioritize pharmaceuticals for environmental monitoring. In this work, several prioritization approaches are used to rank the 88 most commonly consumed pharmaceuticals in Lebanon. A simultaneous multi-criteria decision analysis method utilizing the exposure, persistence, bioaccumulation, and toxicity (EPBT) approach is applied to a smaller subset of the original list (69 pharmaceuticals). Several base cases are investigated and sensitivity analysis is applied to one of these base case runs. The similarities and differences in the overall ranking of individual, and classes of, pharmaceuticals for the base cases and the sensitivity runs are elucidated. An environmental risk assessment (ERA), where predicted environmental concentrations (PEC) and risk quotients (RQ) are determined at different dilution factors, is performed as an alternative method of prioritization for a total of 84 pharmaceuticals. The ERA results indicate that metformin and amoxicillin have the highest PECs while 17β-estradiol, naftidrofuryl and dimenhydrinate have the highest RQs. The two approaches, EPBT prioritization and ERA, are compared and a priority list consisting of 26 pharmaceuticals of various classes is developed. Nervous system and alimentary tract and metabolism pharmaceuticals (9/26 and 5/26 respectively) constitute more than half of the compounds on the priority list, with the balance consisting of anti-infective (4/26), musculo-skeletal (3/26), genito-urinary (2/26), respiratory (2/26) and cardiovascular (1/26) pharmaceuticals. This list will serve as a basis for the selection of candidate compounds to focus on for future monitoring campaigns. Copyright © 2016 Elsevier B.V. All rights reserved.
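A minimal sketch of a consumption-based PEC and risk quotient screen of the kind used in such ERAs; all inputs are hypothetical and the formula follows a commonly used EMA-style approach, which may differ in detail from the authors' implementation:

```python
# Risk quotient screening for a single pharmaceutical (hypothetical inputs).
consumption_kg_per_year = 1200.0     # national consumption
excreted_fraction = 0.6              # fraction excreted unchanged
population = 6.0e6
wastewater_l_per_capita_day = 200.0
dilution_factor = 10.0               # receiving-water dilution
pnec_ug_per_l = 1.0                  # predicted no-effect concentration

# PEC (ug/L) = mass reaching surface water per day / diluted wastewater volume per day.
mass_ug_per_day = consumption_kg_per_year * 1e9 * excreted_fraction / 365.0
volume_l_per_day = population * wastewater_l_per_capita_day * dilution_factor
pec = mass_ug_per_day / volume_l_per_day

rq = pec / pnec_ug_per_l
print(f"PEC = {pec:.3f} ug/L, RQ = {rq:.3f}")   # RQ >= 1 flags potential risk
```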
Endocrine Profiling and Prioritization of Environmental Chemicals Using ToxCast Data
The prioritization of chemicals for toxicity testing is a primary goal of the U.S. EPA’s ToxCast™ program. Phase I of ToxCast utilized a battery of 467 in vitro, high-throughput screening assays to assess 309 environmental chemicals. One important mode of action leading to toxici...
Across several EPA Program Offices (e.g., OPPTS, OW, OAR), there is a clear need to develop strategies and methods to screen large numbers of chemicals for potential toxicity, and to use the resulting information to prioritize the use of testing resources towards those entities a...
Park, Peter Y; Young, Jason
2012-03-01
An important potential benefit of a jurisdiction developing an upper-level traffic safety policy statement, such as a strategic highway safety plan (SHSP) or a traffic safety action plan, is the creation of a manageable number of focus areas, known as emphasis areas. The responsible agencies in the jurisdiction can then direct their finite resources in a systematic and strategic way designed to maximize the effort to reduce the number and severity of roadway collisions. In the United States, the federal government through AASHTO has suggested 22 potential emphasis areas. In Canada, CCMTA's 10 potential emphasis areas have been listed for consideration. This study reviewed the SHSP and traffic safety action plan of 53 jurisdictions in North America, and conducted descriptive data analyses to clarify the issues that currently affect the selection and prioritization process of jurisdiction-specific emphasis areas. We found that the current process relies heavily on high-level collision data analysis and communication among the SHSP stakeholders, but may not be the most efficient and effective way of selecting and prioritizing the emphasis areas and allocating safety improvement resources. This study then formulated a formal collision diagnosis test, known as the beta-binomial test, to clarify and illuminate the selection and the prioritization of jurisdiction-specific emphasis areas. We developed numerical examples to demonstrate how engineers can apply the proposed diagnosis test to improve the selection and prioritization of individual jurisdictions' emphasis areas. Copyright © 2011 Elsevier Ltd. All rights reserved.
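A minimal sketch of a beta-binomial screen for flagging a candidate emphasis area: the peer-group share of collisions in that area is modeled with a Beta prior, and a jurisdiction is flagged when its observed count is improbably high under that prior. The parameters are hypothetical and this is not necessarily the authors' exact formulation:

```python
from scipy.stats import betabinom

# Peer jurisdictions' share of collisions in a candidate emphasis area is
# summarized by a Beta(a, b) prior; a jurisdiction with n total collisions and
# k in that area is assessed with a one-sided beta-binomial tail probability.
a, b = 8.0, 72.0          # hypothetical prior, roughly centered on a 10% share
n, k = 400, 60            # hypothetical jurisdiction: 60 of 400 collisions in this area

p_value = betabinom(n, a, b).sf(k - 1)   # P(X >= k) under the peer-group prior
print(f"P(X >= {k}) = {p_value:.4f}")    # a small value suggests prioritizing this emphasis area
```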
Integrated rare variant-based risk gene prioritization in disease case-control sequencing studies.
Lin, Jhih-Rong; Zhang, Quanwei; Cai, Ying; Morrow, Bernice E; Zhang, Zhengdong D
2017-12-01
Rare variants of major effect play an important role in human complex diseases and can be discovered by sequencing-based genome-wide association studies. Here, we introduce an integrated approach that combines the rare variant association test with gene network and phenotype information to identify risk genes implicated by rare variants for human complex diseases. Our data integration method follows a 'discovery-driven' strategy without relying on prior knowledge about the disease and thus maintains the unbiased character of genome-wide association studies. Simulations reveal that our method can outperform a widely-used rare variant association test method by 2 to 3 times. In a case study of a small disease cohort, we uncovered putative risk genes and the corresponding rare variants that may act as genetic modifiers of congenital heart disease in 22q11.2 deletion syndrome patients. These variants were missed by a conventional approach that relied on the rare variant association test alone.
Identifying multidrug resistant tuberculosis transmission hotspots using routinely collected data
Manjourides, Justin; Lin, Hsien-Ho; Shin, Sonya; Jeffery, Caroline; Contreras, Carmen; Cruz, Janeth Santa; Jave, Oswaldo; Yagui, Martin; Asencios, Luis; Pagano, Marcello; Cohen, Ted
2012-01-01
SUMMARY In most countries with large drug resistant tuberculosis epidemics, only those cases that are at highest risk of having MDRTB receive a drug sensitivity test (DST) at the time of diagnosis. Because of this prioritized testing, identification of MDRTB transmission hotspots in communities where TB cases do not receive DST is challenging, as any observed aggregation of MDRTB may reflect systematic differences in how testing is distributed in communities. We introduce a new disease mapping method, which estimates this missing information through probability–weighted locations, to identify geographic areas of increased risk of MDRTB transmission. We apply this method to routinely collected data from two districts in Lima, Peru over three consecutive years. This method identifies an area in the eastern part of Lima where previously untreated cases have increased risk of MDRTB. This may indicate an area of increased transmission of drug resistant disease, a finding that may otherwise have been missed by routine analysis of programmatic data. The risk of MDR among retreatment cases is also highest in these probable transmission hotspots, though a high level of MDR among retreatment cases is present throughout the study area. Identifying potential multidrug resistant tuberculosis (MDRTB) transmission hotspots may allow for targeted investigation and deployment of resources. PMID:22401962
Stark, Zornitza; Dashnow, Harriet; Lunke, Sebastian; Tan, Tiong Y; Yeung, Alison; Sadedin, Simon; Thorne, Natalie; Macciocca, Ivan; Gaff, Clara; Oshlack, Alicia; White, Susan M; James, Paul A
2017-11-01
Rapid identification of clinically significant variants is key to the successful application of next generation sequencing technologies in clinical practice. The Melbourne Genomics Health Alliance (MGHA) variant prioritization framework employs a gene prioritization index based on clinician-generated a priori gene lists, and a variant prioritization index (VPI) based on rarity, conservation and protein effect. We used data from 80 patients who underwent singleton whole exome sequencing (WES) to test the ability of the framework to rank causative variants highly, and compared it against the performance of other gene and variant prioritization tools. Causative variants were identified in 59 of the patients. Using the MGHA prioritization framework the average rank of the causative variant was 2.24, with 76% ranked as the top priority variant, and 90% ranked within the top five. Using clinician-generated gene lists resulted in ranking causative variants an average of 8.2 positions higher than prioritization based on variant properties alone. This clinically driven prioritization approach significantly outperformed purely computational tools, placing a greater proportion of causative variants top or in the top 5 (permutation P-value=0.001). Clinicians included 40 of the 49 WES diagnoses in their a priori list of differential diagnoses (81%). The lists generated by PhenoTips and Phenomizer contained 14 (29%) and 18 (37%) of these diagnoses respectively. These results highlight the benefits of clinically led variant prioritization in increasing the efficiency of singleton WES data analysis and have important implications for developing models for the funding and delivery of genomic services.
Chapter 15: Disease Gene Prioritization
Bromberg, Yana
2013-01-01
Disease-causing aberrations in the normal function of a gene define that gene as a disease gene. Proving a causal link between a gene and a disease experimentally is expensive and time-consuming. Comprehensive prioritization of candidate genes prior to experimental testing drastically reduces the associated costs. Computational gene prioritization is based on various pieces of correlative evidence that associate each gene with the given disease and suggest possible causal links. A fair amount of this evidence comes from high-throughput experimentation. Thus, well-developed methods are necessary to reliably deal with the quantity of information at hand. Existing gene prioritization techniques already significantly improve the outcomes of targeted experimental studies. Faster and more reliable techniques that account for novel data types are necessary for the development of new diagnostics, treatments, and cure for many diseases. PMID:23633938
Chughtai, Abrar Ahmad; MacIntyre, C. Raina
2017-01-01
Abstract The 2014 Ebola virus disease (EVD) outbreak affected several countries worldwide, including six West African countries. It was the largest Ebola epidemic in history and the first to affect multiple countries simultaneously. Significant national and international delay in response to the epidemic resulted in 28,652 cases and 11,325 deaths. The aim of this study was to develop a risk analysis framework to prioritize rapid response for situations of high risk. Based on findings from the literature, sociodemographic features of the affected countries, and documented epidemic data, a risk scoring framework using 18 criteria was developed. The framework includes measures of socioeconomics, health systems, geographical factors, cultural beliefs, and traditional practices. The three worst affected West African countries (Guinea, Sierra Leone, and Liberia) had the highest risk scores. The scores were much lower in developed countries that experienced Ebola compared to West African countries. A more complex risk analysis framework using 18 measures was compared with a simpler one with 10 measures, and both predicted risk equally well. A simple risk scoring system can incorporate measures of hazard and impact that may otherwise be neglected in prioritizing outbreak response. This framework can be used by public health personnel as a tool to prioritize outbreak investigation and flag outbreaks with potentially catastrophic outcomes for urgent response. Such a tool could mitigate costly delays in epidemic response. PMID:28810081
EPA sought advice from stakeholders regarding potential case studies; stakeholders were invited to provide suggestions and refinements to the prioritization of criteria and information listed in Table 1 of the document.
Attitudes of Germans towards distributive issues in the German health system.
Ahlert, Marlies; Pfarr, Christian
2016-05-01
Social health care systems are inevitably confronted with the scarcity of resources and the resulting distributional challenges. Since prioritization implies distributional effects, decisions regarding respective rules should take citizens' preferences into account. In this study we concentrate on two distributive issues in the German health system: firstly, we analyze the acceptance of prioritizing decisions concerning the treatment of certain patient groups, in this case patients who all need a heart operation. We focus on the patient criteria smoking behavior, age and whether the patient has or does not have young children. Secondly, we investigate Germans' opinions towards income-dependent health services. The results reveal the strong effects of individuals' attitudes regarding general aspects of the health system on priorities, e.g. that individuals with an unhealthy lifestyle should not be prioritized. In addition, experience of limited access to health services is found to have a strong influence on citizens' attitudes, too. Finally, decisions on different prioritization criteria are found to be not independent.
Carlson, Josh J; Thariani, Rahber; Roth, Josh; Gralow, Julie; Henry, N Lynn; Esmail, Laura; Deverka, Pat; Ramsey, Scott D; Baker, Laurence; Veenstra, David L
2013-05-01
The objective of this study was to evaluate the feasibility and outcomes of incorporating value-of-information (VOI) analysis into a stakeholder-driven research prioritization process in a US-based setting. Within a program to prioritize comparative effectiveness research areas in cancer genomics, over a period of 7 months, we developed decision-analytic models and calculated upper-bound VOI estimates for 3 previously selected genomic tests. Thirteen stakeholders representing patient advocates, payers, test developers, regulators, policy makers, and community-based oncologists ranked the tests before and after receiving VOI results. The stakeholders were surveyed about the usefulness and impact of the VOI findings. The estimated upper-bound VOI ranged from $33 million to $2.8 billion for the 3 research areas. Seven stakeholders indicated the results modified their rankings, 9 stated VOI data were useful, and all indicated they would support its use in future prioritization processes. Some stakeholders indicated expected value of sample information might be the preferred choice when evaluating specific research areas. Limitations: our study was limited by the size and the potential for selection bias in the composition of the external stakeholder group, lack of a randomized design to assess the effect of VOI data on rankings, and the use of expected value of perfect information v. expected value of sample information methods. Value-of-information analyses may have a meaningful role in research topic prioritization for comparative effectiveness research in the United States, particularly when large differences in VOI across topic areas are identified. Additional research is needed to facilitate the use of more complex value-of-information analyses in this setting.
Endocrine Profiling and Prioritization of Environmental Chemicals Using ToxCast Data
Reif, David M.; Martin, Matthew T.; Tan, Shirlee W.; Houck, Keith A.; Judson, Richard S.; Richard, Ann M.; Knudsen, Thomas B.; Dix, David J.; Kavlock, Robert J.
2010-01-01
Background The prioritization of chemicals for toxicity testing is a primary goal of the U.S. Environmental Protection Agency (EPA) ToxCast™ program. Phase I of ToxCast used a battery of 467 in vitro, high-throughput screening assays to assess 309 environmental chemicals. One important mode of action leading to toxicity is endocrine disruption, and the U.S. EPA’s Endocrine Disruptor Screening Program (EDSP) has been charged with screening pesticide chemicals and environmental contaminants for their potential to affect the endocrine systems of humans and wildlife. Objective The goal of this study was to develop a flexible method to facilitate the rational prioritization of chemicals for further evaluation and demonstrate its application as a candidate decision-support tool for EDSP. Methods Focusing on estrogen, androgen, and thyroid pathways, we defined putative endocrine profiles and derived a relative rank or score for the entire ToxCast library of 309 unique chemicals. Effects on other nuclear receptors and xenobiotic metabolizing enzymes were also considered, as were pertinent chemical descriptors and pathways relevant to endocrine-mediated signaling. Results Combining multiple data sources into an overall, weight-of-evidence Toxicological Priority Index (ToxPi) score for prioritizing further chemical testing resulted in more robust conclusions than any single data source taken alone. Conclusions Incorporating data from in vitro assays, chemical descriptors, and biological pathways in this prioritization schema provided a flexible, comprehensive visualization and ranking of each chemical’s potential endocrine activity. Importantly, ToxPi profiles provide a transparent visualization of the relative contribution of all information sources to an overall priority ranking. The method developed here is readily adaptable to diverse chemical prioritization tasks. PMID:20826373
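A minimal sketch of a ToxPi-style weight-of-evidence score as described above: scale each evidence "slice" to [0, 1] across the chemical library and combine slices with relative weights. The slice values and weights here are hypothetical, not ToxCast data:

```python
import numpy as np

# Rows = chemicals, columns = evidence slices (e.g., estrogen-pathway assays,
# androgen-pathway assays, chemical descriptors). Values are hypothetical.
slices = np.array([
    [0.8, 0.2, 0.5],
    [0.1, 0.9, 0.3],
    [0.4, 0.4, 0.9],
])
weights = np.array([0.5, 0.3, 0.2])      # relative slice weights, summing to 1

# Scale each slice to [0, 1] across the chemical library, then combine.
lo, hi = slices.min(axis=0), slices.max(axis=0)
scaled = (slices - lo) / (hi - lo)
toxpi_score = scaled @ weights

ranking = np.argsort(-toxpi_score)       # highest-priority chemical first
print("ToxPi-style scores:", np.round(toxpi_score, 3), "ranking:", ranking)
```

Keeping the per-slice contributions visible (rather than reporting only the final score) is what makes the relative influence of each information source transparent in the ranking.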
Priority Queuing Models for Hospital Intensive Care Units and Impacts to Severe Case Patients
Hagen, Matthew S.; Jopling, Jeffrey K; Buchman, Timothy G; Lee, Eva K.
2013-01-01
This paper examines several different queuing models for intensive care units (ICU) and the effects on wait times, utilization, return rates, mortalities, and number of patients served. Five separate intensive care units at an urban hospital are analyzed and distributions are fitted for arrivals and service durations. A system-based simulation model is built to capture all possible cases of patient flow after ICU admission. These include mortalities and returns before and after hospital exits. Patients are grouped into 9 different classes that are categorized by severity and length of stay (LOS). Each queuing model varies by the policies that are permitted and by the order the patients are admitted. The first set of models does not prioritize patients, but examines the advantages of smoothing the operating schedule for elective surgeries. The second set analyzes the differences between prioritizing admissions by expected LOS or patient severity. The last set permits early ICU discharges and conservative and aggressive bumping policies are contrasted. It was found that prioritizing patients by severity considerably reduced delays for critical cases, but also increased the average waiting time for all patients. Aggressive bumping significantly raised the return and mortality rates, but more conservative methods balance quality and efficiency with lowered wait times without serious consequences. PMID:24551379
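A minimal sketch of admission prioritization by severity using a heap-based priority queue; the patient data are hypothetical and the paper's full system simulation (returns, mortalities, bumping policies) is not reproduced:

```python
import heapq

# Waiting patients: (severity, arrival_time, patient_id); a lower severity
# number means a more critical case. Values are hypothetical.
arrivals = [
    (3, 0.0, "p1"),   # moderate
    (1, 0.5, "p2"),   # critical
    (2, 1.0, "p3"),   # severe
    (1, 1.2, "p4"),   # critical
]

waiting = []
for severity, t, pid in arrivals:
    # Ties in severity are broken by arrival time (first come, first served).
    heapq.heappush(waiting, (severity, t, pid))

# Admit patients as beds free up: critical cases jump the queue.
admission_order = [heapq.heappop(waiting)[2] for _ in range(len(arrivals))]
print(admission_order)   # ['p2', 'p4', 'p3', 'p1']
```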
Overcoming barriers to exercise among parents: A social cognitive theory perspective
Mailey, Emily L.; Phillips, Siobhan M.; Dlugonski, Deirdre; Conroy, David E.
2017-01-01
Parents face numerous barriers to exercise and exhibit high levels of inactivity. Examining theory-based determinants of exercise among parents may inform interventions for this population. The purpose of this study was to test a social-cognitive model of parental exercise participation over a 12-month period. Mothers (n=226) and fathers (n=70) of children <16 completed measures of exercise, barriers self-efficacy, perceived barriers, and exercise planning at baseline and one year later. Panel analyses were used to test the hypothesized relationships. Barriers self-efficacy was related to exercise directly and indirectly through perceived barriers and prioritization/planning. Prioritization and planning also mediated the relationship between perceived barriers and exercise. These paths remained significant at 12 months. These results suggest efforts to increase exercise in parents should focus on improving confidence to overcome exercise barriers, reducing perceptions of barriers, and helping parents make specific plans for prioritizing and engaging in exercise. PMID:27108160
Qi, Xiao-Wen; Zhang, Jun-Ling; Zhao, Shu-Ping; Liang, Chang-Yong
2017-10-02
In order to be prepared against potential balance-breaking risks affecting economic development, more and more countries have recognized emergency response solutions evaluation (ERSE) as an indispensable activity in their governance of sustainable development. Traditional multiple criteria group decision making (MCGDM) approaches to ERSE face the simultaneous challenges of decision hesitancy and prioritization relations among assessing criteria, owing to the complexity of practical ERSE problems. Therefore, aiming at the special type of ERSE problems that exhibit these two characteristics, we investigate effective MCGDM approaches by employing the interval-valued dual hesitant fuzzy set (IVDHFS) to comprehensively depict decision hesitancy. To exploit decision information embedded in prioritization relations among criteria, we first define a fuzzy entropy measure for IVDHFS so that its derivative decision models can avoid the potential information distortion of models based on classic IVDHFS distance measures with a subjective supplementing mechanism; further, based on the defined entropy measure, we develop two fundamental prioritized operators for IVDHFS by extending Yager's prioritized operators. Furthermore, building on the above methods, we construct two hesitant fuzzy MCGDM approaches to tackle complex scenarios with or without known weights for decision makers, respectively. Finally, case studies have been conducted to show the effectiveness and practicality of our proposed approaches.
Qi, Xiao-Wen; Zhang, Jun-Ling; Zhao, Shu-Ping; Liang, Chang-Yong
2017-01-01
In order to be prepared against potential balance-breaking risks affecting economic development, more and more countries have recognized emergency response solutions evaluation (ERSE) as an indispensable activity in their governance of sustainable development. Traditional multiple criteria group decision making (MCGDM) approaches to ERSE face the simultaneous challenges of decision hesitancy and prioritization relations among assessing criteria, owing to the complexity of practical ERSE problems. Therefore, aiming at the special type of ERSE problems that exhibit these two characteristics, we investigate effective MCGDM approaches by employing the interval-valued dual hesitant fuzzy set (IVDHFS) to comprehensively depict decision hesitancy. To exploit decision information embedded in prioritization relations among criteria, we first define a fuzzy entropy measure for IVDHFS so that its derivative decision models can avoid the potential information distortion of models based on classic IVDHFS distance measures with a subjective supplementing mechanism; further, based on the defined entropy measure, we develop two fundamental prioritized operators for IVDHFS by extending Yager's prioritized operators. Furthermore, building on the above methods, we construct two hesitant fuzzy MCGDM approaches to tackle complex scenarios with or without known weights for decision makers, respectively. Finally, case studies have been conducted to show the effectiveness and practicality of our proposed approaches. PMID:28974045
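A minimal sketch of Yager's prioritized weighted average on scalar satisfaction degrees, which the prioritized operators above extend to interval-valued dual hesitant fuzzy information; the satisfaction values are hypothetical and the IVDHFS machinery is not reproduced:

```python
# Yager-style prioritized weighted average for criteria ordered by priority
# (most important first). Satisfaction degrees lie in [0, 1] and are hypothetical.
satisfaction = [0.9, 0.6, 0.7, 0.4]

# T_1 = 1; T_i = product of the satisfaction degrees of all higher-priority criteria,
# so a poorly satisfied high-priority criterion suppresses the weight of those below it.
T = [1.0]
for s in satisfaction[:-1]:
    T.append(T[-1] * s)

weights = [t / sum(T) for t in T]      # prioritized weights
pwa = sum(w * s for w, s in zip(weights, satisfaction))
print("prioritized weights:", [round(w, 3) for w in weights], "PWA:", round(pwa, 3))
```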
Dead or alive: animal sampling during Ebola hemorrhagic fever outbreaks in humans
Olson, Sarah H.; Reed, Patricia; Cameron, Kenneth N.; Ssebide, Benard J.; Johnson, Christine K.; Morse, Stephen S.; Karesh, William B.; Mazet, Jonna A. K.; Joly, Damien O.
2012-01-01
There are currently no widely accepted animal surveillance guidelines for human Ebola hemorrhagic fever (EHF) outbreak investigations to identify potential sources of Ebolavirus (EBOV) spillover into humans and other animals. Animal field surveillance during and following an outbreak has several purposes, from helping identify the specific animal source of a human case to guiding control activities by describing the spatial and temporal distribution of wild circulating EBOV, informing public health efforts, and contributing to broader EHF research questions. Since 1976, researchers have sampled over 10,000 individual vertebrates from areas associated with human EHF outbreaks and tested for EBOV or antibodies. Using field surveillance data associated with EHF outbreaks, this review provides guidance on animal sampling for resource-limited outbreak situations, target species, and in some cases which diagnostics should be prioritized to rapidly assess the presence of EBOV in animal reservoirs. In brief, EBOV detection was 32.7% (18/55) for carcasses (animals found dead) and 0.2% (13/5309) for live captured animals. Our review indicates that for the purposes of identifying potential sources of transmission from animals to humans and isolating suspected virus in an animal in outbreak situations, (1) surveillance of free-ranging non-human primate mortality and morbidity should be a priority, (2) any wildlife morbidity or mortality events should be investigated and may hold the most promise for locating virus or viral genome sequences, (3) surveillance of some bat species is worthwhile to isolate and detect evidence of exposure, and (4) morbidity, mortality, and serology studies of domestic animals should prioritize dogs and pigs and include testing for virus and previous exposure. PMID:22558004
The effect of requirements prioritization on avionics system conceptual design
NASA Astrophysics Data System (ADS)
Lorentz, John
This dissertation will provide a detailed approach and analysis of a new collaborative requirements prioritization methodology that has been used successfully on four Coast Guard avionics acquisition and development programs valued at $400M+. A statistical representation of participant study results will be discussed and analyzed in detail. Many technically compliant projects fail to deliver levels of performance and capability that the customer desires. Some of these systems completely meet "threshold" levels of performance; however, the distribution of resources in the process devoted to the development and management of the requirements does not always represent the voice of the customer. This is especially true for technically complex projects such as modern avionics systems. A simplified facilitated process for prioritization of system requirements will be described. The collaborative prioritization process, and resulting artifacts, aids the systems engineer during early conceptual design. All requirements are not the same in terms of customer priority. While there is a tendency to have many thresholds inside of a system design, there is usually a subset of requirements and system performance that is of the utmost importance to the design. These critical capabilities and critical levels of performance typically represent the reason the system is being built. The systems engineer needs processes to identify these critical capabilities, the associated desired levels of performance, and the risks associated with the specific requirements that define the critical capability. The facilitated prioritization exercise is designed to collaboratively draw out these critical capabilities and levels of performance so they can be emphasized in system design. Developing the purpose, schedule, and process for prioritization events is a key element of systems engineering and modern project management. The benefits of early collaborative prioritization flow throughout the project schedule, resulting in greater success during system deployment and operational testing. This dissertation will discuss the data and findings from participant studies, present a literature review of systems engineering and design processes, and test the hypothesis that the prioritization process had no effect on stakeholder sentiment related to the conceptual design. In addition, the "Requirements Rationalization" process will be discussed in detail. Avionics, like many other systems, has transitioned from a discrete electronics engineering, hard engineering discipline to incorporate software engineering as a core process of the technology development cycle. As with other software-based systems, avionics now has significant soft system attributes that must be considered in the design process. The boundless opportunities that exist in software design demand prioritization to focus effort onto the critical functions that the software must provide. This has been a well-documented and understood phenomenon in the software development community for many years. This dissertation will attempt to link the effect of software-integrated avionics to the benefits of prioritization of requirements in the problem space and demonstrate the sociological and technical benefits of early prioritization practices.
Ajisegiri, Whenayon Simeon; Chughtai, Abrar Ahmad; MacIntyre, C Raina
2018-03-01
The 2014 Ebola virus disease (EVD) outbreak affected several countries worldwide, including six West African countries. It was the largest Ebola epidemic in history and the first to affect multiple countries simultaneously. Significant national and international delay in response to the epidemic resulted in 28,652 cases and 11,325 deaths. The aim of this study was to develop a risk analysis framework to prioritize rapid response for situations of high risk. Based on findings from the literature, sociodemographic features of the affected countries, and documented epidemic data, a risk scoring framework using 18 criteria was developed. The framework includes measures of socioeconomics, health systems, geographical factors, cultural beliefs, and traditional practices. The three worst affected West African countries (Guinea, Sierra Leone, and Liberia) had the highest risk scores. The scores were much lower in developed countries that experienced Ebola compared to West African countries. A more complex risk analysis framework using 18 measures was compared with a simpler one with 10 measures, and both predicted risk equally well. A simple risk scoring system can incorporate measures of hazard and impact that may otherwise be neglected in prioritizing outbreak response. This framework can be used by public health personnel as a tool to prioritize outbreak investigation and flag outbreaks with potentially catastrophic outcomes for urgent response. Such a tool could mitigate costly delays in epidemic response. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
Heinrichs, Julie; Aldridge, Cameron L.; O'Donnell, Michael; Schumaker, Nathan
2017-01-01
Prioritizing habitats for conservation is a challenging task, particularly for species with fluctuating populations and seasonally dynamic habitat needs. Although the use of resource selection models to identify and prioritize habitat for conservation is increasingly common, their ability to characterize important long-term habitats for dynamic populations is variable. To examine how habitats might be prioritized differently if resource selection was directly and dynamically linked with population fluctuations and movement limitations among seasonal habitats, we constructed a spatially explicit individual-based model for a dramatically fluctuating population requiring temporally varying resources. Using greater sage-grouse (Centrocercus urophasianus) in Wyoming as a case study, we used resource selection function (RSF) maps to guide seasonal movement and habitat selection, but emergent population dynamics and simulated movement limitations modified long-term habitat occupancy. We compared priority habitats in RSF maps to long-term simulated habitat use. We examined the circumstances under which the explicit consideration of movement limitations, in combination with population fluctuations and trends, are likely to alter predictions of important habitats. In doing so, we assessed the future occupancy of protected areas under alternative population and habitat conditions. Habitat prioritizations based on resource selection models alone predicted high use in isolated parcels of habitat and in areas with low connectivity among seasonal habitats. In contrast, results based on more biologically-informed simulations emphasized central and connected areas near high-density populations, sometimes predicted to have low selection value. Dynamic models of habitat use can provide additional biological realism that can extend, and in some cases, contradict habitat use predictions generated from short-term or static resource selection analyses. The explicit inclusion of population dynamics and movement propensities via spatial simulation modeling frameworks may provide an informative means of predicting long-term habitat use, particularly for fluctuating populations with complex seasonal habitat needs. Importantly, our results indicate the possible need to consider habitat selection models as a starting point rather than the common end point for refining and prioritizing habitats for protection for cyclic and highly variable populations.
Uses of research evidence among US state legislators who prioritize behavioral health issues
Purtle, Jonathan; Dodson, Elizabeth A.; Brownson, Ross C.
2016-01-01
Objective Disseminating behavioral health (BH) research to legislators (i.e., elected policy makers) is widely acknowledged as a priority, but little is known about how research evidence is used and sought by this audience. The primary aim of this exploratory study was to identify the research dissemination preferences and research seeking practices of legislators who prioritize BH issues and describe the role research plays in determining their policy priorities. The secondary aim was to assess if these legislators differ from legislators who do not prioritize BH issues. Methods A telephone-based survey was conducted with 862 US state legislators (response rate 50%). A validated survey instrument was used to assess legislators’ priorities and the factors that determine them, research dissemination preferences, and research seeking practices. Bivariate analyses were conducted to characterize the study population and compare legislators who prioritized BH issues to legislators who did not. Results Legislators who prioritized BH issues were significantly more likely to identify research evidence as a factor that determined policy priorities than legislators who did not prioritize these issues (odds ratio=1.91, 95% CI=1.25–2.90, p=.002). Legislators who prioritized BH issues also attributed more importance to 10-of-12 features of disseminated research (e.g., research being unbiased [p=.014], research telling a story [p=.033]) and engaged in 8-of-11 research seeking and utilization practices (e.g., attending research presentations [p=.012]) more often. Conclusions Legislators who prioritize BH issues actively seek, have distinct preferences for, and are particularly influenced by research evidence. Testing legislator-focused BH research dissemination strategies is an area for future research. PMID:27364817
International Technical Working Group Round Robin Tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dudder, Gordon B.; Hanlen, Richard C.; Herbillion, Georges M.
The goal of nuclear forensics is to develop a preferred approach to support illicit trafficking investigations. This approach must be widely understood and accepted as credible. The principal objectives of the Round Robin Tests are to prioritize forensic techniques and methods, evaluate attribution capabilities, and examine the utility of databases. The HEU (Highly Enriched Uranium) Round Robin, and the previous Plutonium Round Robin, have made tremendous contributions to fulfilling these goals through a collaborative learning experience that resulted from the outstanding efforts of the nine participating international laboratories. A prioritized list of techniques and methods has been developed based on this exercise. Current work is focused on the extent to which the techniques and methods can be generalized. The HEU Round Robin demonstrated a rather high level of capability to determine the important characteristics of the materials and processes using analytical methods. When this capability is combined with the appropriate knowledge/database, it results in a significant capability to attribute the source of the materials to a specific process or facility. A number of shortfalls were also identified in the current capabilities, including procedures for non-nuclear forensics and the lack of a comprehensive network of data/knowledge bases. The results of the Round Robin will be used to develop guidelines or a ''recommended protocol'' to be made available to the interested authorities and countries to use in real cases.
Joint Forces Command - Operation United Assistance Case Study: Lessons and Best Practices
2016-07-01
additional and prioritized computers and access in the operations center for these mission requirements are essential. ... this publication is welcomed and highly encouraged. ... Based on information drawn from various sources including after action reports, lessons learned, case studies, umbrella-week visits, and key-leader ...
Gagliano, Sarah A; Ravji, Reena; Barnes, Michael R; Weale, Michael E; Knight, Jo
2015-08-24
Although technology has triumphed in facilitating routine genome sequencing, new challenges have been created for the data-analyst. Genome-scale surveys of human variation generate volumes of data that far exceed capabilities for laboratory characterization. By incorporating functional annotations as predictors, statistical learning has been widely investigated for prioritizing genetic variants likely to be associated with complex disease. We compared three published prioritization procedures, which use different statistical learning algorithms and which differ in the quantity, type and coding of their predictors. We also explored different combinations of algorithm and annotation set. As an application, we tested which methodology performed best for prioritizing variants using data from a large schizophrenia meta-analysis by the Psychiatric Genomics Consortium. Results suggest that all methods have considerable (and similar) predictive accuracies (AUCs 0.64-0.71) in test set data, but there is more variability in the application to the schizophrenia GWAS. In conclusion, a variety of algorithms and annotations seem to have a similar potential to effectively enrich true risk variants in genome-scale datasets; however, none offers more than an incremental improvement in prediction. We discuss how methods might be evolved for risk variant prediction to address the impending bottleneck of the new generation of genome re-sequencing studies.
NCI Pediatric Preclinical Testing Consortium
NCI has awarded grants to five research teams to participate in its Pediatric Preclinical Testing Consortium, which is intended to help prioritize which agents to pursue in pediatric clinical trials.
Predictive Modeling of Developmental Toxicity
The use of alternative methods in conjunction with traditional in vivo developmental toxicity testing has the potential to (1) reduce cost and increase throughput of testing the chemical universe, (2) prioritize chemicals for further targeted toxicity testing and risk assessment,...
Sharafi, Seyedeh Mahdieh; Moilanen, Atte; White, Matt; Burgman, Mark
2012-12-15
Gap analysis is used to analyse reserve networks and their coverage of biodiversity, thus identifying gaps in biodiversity representation that may be filled by additional conservation measures. Gap analysis has been used to identify priorities for species and habitat types. When it is applied to identify gaps in the coverage of environmental variables, it embodies the assumption that combinations of environmental variables are effective surrogates for biodiversity attributes. The question remains of how to fill gaps in conservation systems efficiently. Conservation prioritization software can identify those areas outside existing conservation areas that contribute to the efficient covering of gaps in biodiversity features. We show how environmental gap analysis can be implemented using high-resolution information about environmental variables and ecosystem condition with the publicly available conservation prioritization software, Zonation. Our method is based on the conversion of combinations of environmental variables into biodiversity features. We also replicated the analysis by using Species Distribution Models (SDMs) as biodiversity features to evaluate the robustness and utility of our environment-based analysis. We apply the technique to a planning case study of the state of Victoria, Australia. Copyright © 2012 Elsevier Ltd. All rights reserved.
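A minimal sketch, under simplifying assumptions, of the conversion step described above: discretized environmental variables are combined into unique classes, and each class becomes a binary "biodiversity feature" layer. The toy rasters are invented and the output is not in Zonation's input format.

```python
# Sketch: convert combinations of discretized environmental variables into
# per-combination "feature" layers (one binary layer per unique combination).
# The toy rasters are invented and the output is not Zonation's input format.
import numpy as np

rng = np.random.default_rng(0)
elevation_class = rng.integers(0, 3, size=(4, 4))   # e.g. low / mid / high
rainfall_class = rng.integers(0, 2, size=(4, 4))    # e.g. dry / wet

combo = elevation_class * 2 + rainfall_class         # unique id per combination
features = {int(c): (combo == c).astype(float) for c in np.unique(combo)}

for c, layer in features.items():
    print(f"feature {c}: {int(layer.sum())} cells")
```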
West, Howard
2017-09-01
The current standard of care for molecular marker testing in patients with advanced non-small cell lung cancer (NSCLC) has been evolving over several years and is a product of the quality of the evidence supporting a targeted therapy for a specific molecular marker, the pre-test probability of that marker in the population, and the magnitude of benefit seen with that treatment. Among the markers that have one or more matched targeted therapies, only a few most clearly warrant prioritized detection in the first-line setting, such that their matched therapies supplant other first-line alternatives, and then only in a subset of patients, currently defined by NSCLC histology. Specifically, this currently includes testing for an activating epidermal growth factor receptor (EGFR) mutation or an anaplastic lymphoma kinase (ALK) or ROS1 rearrangement. This article reviews the history and data supporting the prioritization of these markers in patients with non-squamous NSCLC, a histologically selected population in whom the probability of these markers, combined with the anticipated efficacy of targeted therapies against them, is high enough to favor these treatments in the first-line setting. In reviewing the evidence supporting this very limited core subset of the most valuable molecular markers to detect in the initial workup of such patients, we can also see the criteria that other actionable markers need to meet in order to be widely recognized as reliably valuable enough to warrant prioritized detection in the initial workup of advanced NSCLC as well.
Overcoming barriers to exercise among parents: a social cognitive theory perspective.
Mailey, Emily L; Phillips, Siobhan M; Dlugonski, Deirdre; Conroy, David E
2016-08-01
Parents face numerous barriers to exercise and exhibit high levels of inactivity. Examining theory-based determinants of exercise among parents may inform interventions for this population. The purpose of this study was to test a social-cognitive model of parental exercise participation over a 12-month period. Mothers (n = 226) and fathers (n = 70) of children <16 years of age completed measures of exercise, barriers self-efficacy, perceived barriers, and exercise planning at baseline and 1 year later. Panel analyses were used to test the hypothesized relationships. Barriers self-efficacy was related to exercise directly and indirectly through perceived barriers and prioritization/planning. Prioritization and planning also mediated the relationship between perceived barriers and exercise. These paths remained significant at 12 months. These results suggest efforts to increase exercise in parents should focus on improving confidence to overcome exercise barriers, reducing perceptions of barriers, and helping parents make specific plans for prioritizing and engaging in exercise.
Klann, Jeffrey G; Anand, Vibha; Downs, Stephen M
2013-12-01
Over 8 years, we have developed an innovative computer decision support system that improves appropriate delivery of pediatric screening and care. This system employs a guidelines evaluation engine using data from the electronic health record (EHR) and input from patients and caregivers. Because guideline recommendations typically exceed the scope of one visit, the engine uses a static prioritization scheme to select recommendations. Here we extend an earlier idea to create patient-tailored prioritization. We used Bayesian structure learning to build networks of association among previously collected data from our decision support system. Using area under the receiver-operating characteristic curve (AUC) as a measure of discriminability (a sine qua non for expected value calculations needed for prioritization), we performed a structural analysis of variables with high AUC on a test set. Our source data included 177 variables for 29 402 patients. The method produced a network model containing 78 screening questions and anticipatory guidance (107 variables total). Average AUC was 0.65, which is sufficient for prioritization depending on factors such as population prevalence. Structure analysis of seven highly predictive variables reveals both face-validity (related nodes are connected) and non-intuitive relationships. We demonstrate the ability of a Bayesian structure learning method to 'phenotype the population' seen in our primary care pediatric clinics. The resulting network can be used to produce patient-tailored posterior probabilities that can be used to prioritize content based on the patient's current circumstances. This study demonstrates the feasibility of EHR-driven population phenotyping for patient-tailored prioritization of pediatric preventive care services.
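A minimal sketch of the AUC screening step described above, using scikit-learn's roc_auc_score on synthetic data; the 0.65 cut-off echoes the average reported here but is otherwise illustrative, not the study's rule.

```python
# Sketch: use AUC to decide whether a predicted probability is discriminative
# enough to drive prioritization. The data are synthetic placeholders and the
# 0.65 cut-off is illustrative.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, size=500)                 # observed outcome for a screening item
noise = rng.normal(0, 0.25, size=500)
posterior = np.clip(0.5 + 0.2 * (y_true - 0.5) + noise, 0, 1)  # model-predicted probability

auc = roc_auc_score(y_true, posterior)
print(f"AUC = {auc:.2f}; usable for prioritization: {auc >= 0.65}")
```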
Brenner, Darren R.; Amos, Christopher I.; Brhane, Yonathan; Timofeeva, Maria N.; Caporaso, Neil; Wang, Yufei; Christiani, David C.; Bickeböller, Heike; Yang, Ping; Albanes, Demetrius; Stevens, Victoria L.; Gapstur, Susan; McKay, James; Boffetta, Paolo; Zaridze, David; Szeszenia-Dabrowska, Neonilia; Lissowska, Jolanta; Rudnai, Peter; Fabianova, Eleonora; Mates, Dana; Bencko, Vladimir; Foretova, Lenka; Janout, Vladimir; Krokan, Hans E.; Skorpen, Frank; Gabrielsen, Maiken E.; Vatten, Lars; Njølstad, Inger; Chen, Chu; Goodman, Gary; Lathrop, Mark; Vooder, Tõnu; Välk, Kristjan; Nelis, Mari; Metspalu, Andres; Broderick, Peter; Eisen, Timothy; Wu, Xifeng; Zhang, Di; Chen, Wei; Spitz, Margaret R.; Wei, Yongyue; Su, Li; Xie, Dong; She, Jun; Matsuo, Keitaro; Matsuda, Fumihiko; Ito, Hidemi; Risch, Angela; Heinrich, Joachim; Rosenberger, Albert; Muley, Thomas; Dienemann, Hendrik; Field, John K.; Raji, Olaide; Chen, Ying; Gosney, John; Liloglou, Triantafillos; Davies, Michael P.A.; Marcus, Michael; McLaughlin, John; Orlow, Irene; Han, Younghun; Li, Yafang; Zong, Xuchen; Johansson, Mattias; Liu, Geoffrey; Tworoger, Shelley S.; Le Marchand, Loic; Henderson, Brian E.; Wilkens, Lynne R.; Dai, Juncheng; Shen, Hongbing; Houlston, Richard S.; Landi, Maria T.; Brennan, Paul; Hung, Rayjean J.
2015-01-01
Large-scale genome-wide association studies (GWAS) have likely uncovered all common variants at the GWAS significance level. Additional variants within the suggestive range (0.0001 > P > 5×10^-8) are, however, still of interest for identifying causal associations. This analysis aimed to apply novel variant prioritization approaches to identify additional lung cancer variants that may not reach the GWAS level. Effects were combined across studies with a total of 33456 controls and 6756 adenocarcinoma (AC; 13 studies), 5061 squamous cell carcinoma (SCC; 12 studies) and 2216 small cell lung cancer cases (9 studies). Based on prior information such as variant physical properties and functional significance, we applied stratified false discovery rates, hierarchical modeling and Bayesian false discovery probabilities for variant prioritization. We conducted a fine mapping analysis as validation of our methods by examining top-ranking novel variants in six independent populations with a total of 3128 cases and 2966 controls. Three novel loci in the suggestive range were identified based on our Bayesian framework analyses: KCNIP4 at 4p15.2 (rs6448050, P = 4.6×10^-7) and MTMR2 at 11q21 (rs10501831, P = 3.1×10^-6) with SCC, as well as GAREM at 18q12.1 (rs11662168, P = 3.4×10^-7) with AC. Use of our prioritization methods validated two of the top three loci associated with SCC (P = 1.05×10^-4 for KCNIP4, represented by rs9799795) and AC (P = 2.16×10^-4 for GAREM, represented by rs3786309) in the independent fine mapping populations. This study highlights the utility of using prior functional data for sequence variants in prioritization analyses to search for robust signals in the suggestive range. PMID:26363033
Mercer, Laina D; Safdar, Rana M; Ahmed, Jamal; Mahamud, Abdirahman; Khan, M Muzaffar; Gerber, Sue; O'Leary, Aiden; Ryan, Mike; Salet, Frank; Kroiss, Steve J; Lyons, Hil; Upfill-Brown, Alexander; Chabot-Couture, Guillaume
2017-10-11
Pakistan is one of only three countries where poliovirus circulation remains endemic. For the Pakistan Polio Eradication Program, identifying high risk districts is essential to target interventions and allocate limited resources. Using a hierarchical Bayesian framework we developed a spatial Poisson hurdle model to jointly model the probability of one or more paralytic polio cases, and the number of cases that would be detected in the event of an outbreak. Rates of underimmunization, routine immunization, and population immunity, as well as seasonality and a history of cases were used to project future risk of cases. The expected number of cases in each district in a 6-month period was predicted using indicators from the previous 6-months and the estimated coefficients from the model. The model achieves an average of 90% predictive accuracy as measured by area under the receiver operating characteristic (ROC) curve, for the past 3 years of cases. The risk of poliovirus has decreased dramatically in many of the key reservoir areas in Pakistan. The results of this model have been used to prioritize sub-national areas in Pakistan to receive additional immunization activities, additional monitoring, or other special interventions.
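A minimal sketch of the expected-count arithmetic behind a Poisson hurdle model of this kind: the probability of any cases is multiplied by the mean of a zero-truncated Poisson. The district probabilities and rates are invented, and the fitted hierarchical spatial model itself is not reproduced.

```python
# Sketch of the expected-count arithmetic behind a Poisson hurdle model:
# E[cases] = P(at least one case) * E[count | count >= 1], where the mean of a
# zero-truncated Poisson is lam / (1 - exp(-lam)). District values are invented.
import math

def expected_cases(p_any, lam):
    return p_any * lam / (1.0 - math.exp(-lam))

districts = {"district_1": (0.30, 2.0), "district_2": (0.05, 4.0), "district_3": (0.15, 1.0)}

for name, (p_any, lam) in sorted(districts.items(),
                                 key=lambda kv: expected_cases(*kv[1]), reverse=True):
    print(f"{name}: expected cases ~ {expected_cases(p_any, lam):.2f}")
```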
NDRC: A Disease-Causing Genes Prioritized Method Based on Network Diffusion and Rank Concordance.
Fang, Minghong; Hu, Xiaohua; Wang, Yan; Zhao, Junmin; Shen, Xianjun; He, Tingting
2015-07-01
Disease-causing gene prioritization is very important for understanding disease mechanisms and for biomedical applications such as drug design. Previous studies have shown that promising candidate genes are mostly ranked according to their relatedness to known disease genes or closely related disease genes. Therefore, a dangling gene (isolated gene) with no edges in the network cannot be effectively prioritized. These approaches tend to prioritize those genes that are highly connected in the protein-protein interaction (PPI) network but perform poorly when they are applied to loosely connected disease genes. To address these problems, we propose a new disease-causing gene prioritization method based on network diffusion and rank concordance (NDRC). The method is evaluated by leave-one-out cross validation on 1931 diseases in which at least one gene is known to be involved, and it is able to rank the true causal gene first in 849 of all 2542 cases. The experimental results suggest that NDRC significantly outperforms other existing methods such as RWR, VAVIEN, DADA and PRINCE at identifying loosely connected disease genes and successfully puts forward dangling genes as potential candidate disease genes. Furthermore, we apply the NDRC method to study three representative diseases, Meckel syndrome 1, Protein C deficiency and Peroxisome biogenesis disorder 1A (Zellweger). Our study has also found that certain complex disease-causing genes can be divided into several modules that are closely associated with different disease phenotypes.
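A minimal sketch of the network-diffusion component (a random-walk-with-restart style propagation from a known disease gene over a toy protein interaction network); NDRC's rank-concordance step is not shown, and the adjacency matrix is hypothetical.

```python
# Sketch of the network-diffusion step (random walk with restart) over a toy
# protein interaction network. Node 4 is a dangling gene with no edges; NDRC's
# rank-concordance step is not shown.
import numpy as np

A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 0],
              [0, 0, 0, 0, 0]], dtype=float)

col_sums = A.sum(axis=0)
W = np.divide(A, col_sums, out=np.zeros_like(A), where=col_sums > 0)  # column-normalized

p0 = np.array([1.0, 0, 0, 0, 0])        # restart vector: known disease gene is node 0
p, restart = p0.copy(), 0.3
for _ in range(100):                     # iterate to (approximate) convergence
    p = (1 - restart) * W @ p + restart * p0

print("diffusion scores:", np.round(p, 3))
print("candidate ranking:", np.argsort(-p))
```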
Demographic stability metrics for conservation prioritization of isolated populations.
Finn, Debra S; Bogan, Michael T; Lytle, David A
2009-10-01
Systems of geographically isolated habitat patches house species that occur naturally as small, disjunct populations. Many of these species are of conservation concern, particularly under the interacting influences of isolation and rapid global change. One potential conservation strategy is to prioritize the populations most likely to persist through change and act as sources for future recolonization of less stable localities. We propose an approach to classify long-term population stability (and, presumably, future persistence potential) with composite demographic metrics derived from standard population-genetic data. Stability metrics can be related to simple habitat measures for a straightforward method of classifying localities to inform conservation management. We tested these ideas in a system of isolated desert headwater streams with mitochondrial sequence data from 16 populations of a flightless aquatic insect. Populations exhibited a wide range of stability scores, which were significantly predicted by dry-season aquatic habitat size. This preliminary test suggests strong potential for our proposed method of classifying isolated populations according to persistence potential. The approach is complementary to existing methods for prioritizing local habitats according to diversity patterns and should be tested further in other systems and with additional loci to inform composite demographic stability scores.
Fisher, Carla L.; Nussbaum, Jon F.
2015-01-01
Interpersonal communication is a fundamental part of being and key to health. Interactions within family are especially critical to wellness across time. Family communication is a central means of adaptation to stress, coping, and successful aging. Still, no theoretical argument in the discipline exists that prioritizes kin communication in health. Theoretical advances can enhance interventions and policies that improve family life. This article explores socioemotional selectivity theory (SST), which highlights communication in our survival. Communication partner choice is based on one's time perspective, which affects our prioritization of goals to survive—goals sought socially. This is a first test of SST in a family communication study on women's health and aging. More than 300 women of varying ages and health status participated. Two time factors, later adulthood and late-stage breast cancer, lead women to prioritize family communication. Findings provide a theoretical basis for prioritizing family communication issues in health reform. PMID:26997920
CSMA Versus Prioritized CSMA for Air-Traffic-Control Improvement
NASA Technical Reports Server (NTRS)
Robinson, Daryl C.
2001-01-01
OPNET version 7.0 simulations are presented involving an important application of the Aeronautical Telecommunications Network (ATN), Controller Pilot Data Link Communications (CPDLC) over the Very High Frequency Data Link, Mode 2 (VDL-2). Communication is modeled for essentially all incoming and outgoing nonstop air-traffic for just three United States cities: Cleveland, Cincinnati, and Detroit. There are 32 airports in the simulation, 29 of which are either sources or destinations for the air-traffic of the aforementioned three airports. The simulation involves 111 Air Traffic Control (ATC) ground stations and 1,235 equally equipped aircraft taking off, flying realistic free-flight trajectories, and landing in a 24-hr period. Collisionless, Prioritized Carrier Sense Multiple Access (CSMA) is successfully tested and compared with the traditional CSMA typically associated with VDL-2. The performance measures include latency, throughput, and packet loss. As expected, Prioritized CSMA is much quicker and more efficient than traditional CSMA. These simulation results show the potency of Prioritized CSMA for implementing low latency, high throughput, and efficient connectivity.
2013-01-01
Background Differential gene expression (DGE) analysis is commonly used to reveal the deregulated molecular mechanisms of complex diseases. However, traditional DGE analysis (e.g., the t test or the rank sum test) tests each gene independently without considering interactions between them. Top-ranked differentially regulated genes prioritized by the analysis may not directly relate to the coherent molecular changes underlying complex diseases. Joint analyses of co-expression and DGE have been applied to reveal the deregulated molecular modules underlying complex diseases. Most of these methods consist of separate steps: first to identify gene-gene relationships under the studied phenotype, then to integrate them with gene expression changes for prioritizing signature genes, or vice versa. This warrants a method that can simultaneously consider gene-gene co-expression strength and corresponding expression level changes so that both types of information can be leveraged optimally. Results In this paper, we develop a gene module based method for differential gene expression analysis, named network-based differential gene expression (nDGE) analysis, a one-step integrative process for prioritizing deregulated genes and grouping them into gene modules. We demonstrate that nDGE outperforms existing methods in prioritizing deregulated genes and discovering deregulated gene modules using simulated data sets. When tested on a series of smoker and non-smoker lung adenocarcinoma data sets, we show that the top differentially regulated genes identified by the rank sum test in different sets are not consistent, whereas the top-ranked genes defined by nDGE in different data sets significantly overlap. nDGE results suggest that a differentially regulated gene module, which is enriched for cell cycle related genes and E2F1 targeted genes, plays a role in the molecular differences between smoker and non-smoker lung adenocarcinoma. Conclusions In this paper, we develop nDGE to prioritize deregulated genes and group them into gene modules by simultaneously considering gene expression level changes and gene-gene co-regulations. When applied to both simulated and empirical data, nDGE outperforms the traditional DGE method. More specifically, when applied to smoker and non-smoker lung cancer sets, nDGE results illustrate the molecular differences between smoker and non-smoker lung cancer. PMID:24341432
Arnaud, Mickael; Bégaud, Bernard; Thiessard, Frantz; Jarrion, Quentin; Bezin, Julien; Pariente, Antoine; Salvo, Francesco
2018-04-01
Signal detection from healthcare databases is possible, but is not yet used for routine surveillance of drug safety. One challenge is to develop methods for selecting signals that should be assessed with priority. The aim of this study was to develop an automated system combining safety signal detection and prioritization from healthcare databases and applicable to drugs used in chronic diseases. Patients present in the French EGB healthcare database for at least 1 year between 2005 and 2015 were considered. Noninsulin glucose-lowering drugs (NIGLDs) were selected as a case study, and hospitalization data were used to select important medical events (IME). Signal detection was performed quarterly from 2008 to 2015 using sequence symmetry analysis. NIGLD/IME associations were screened if one or more exposed cases were identified in the quarter, and three or more exposed cases were identified in the population at the date of screening. Detected signals were prioritized using the Longitudinal-SNIP (L-SNIP) algorithm based on strength (S), novelty (N), potential impact of signal (I), and pattern of drug use (P). Signals scored in the top 10% were identified as of high priority. A reference set was built based on NIGLD summaries of product characteristics (SPCs) to compute the performance of the developed system. A total of 815 associations were screened and 241 (29.6%) were detected as signals; among these, 58 (24.1%) were prioritized. The performance for signal detection was sensitivity = 47%; specificity = 80%; positive predictive value (PPV) = 33%; negative predictive value = 82%. The use of the L-SNIP algorithm increased the early identification of positive controls, restricted to those mentioned in the SPCs after 2008: PPV = 100% versus PPV = 14% with its non-use. The system revealed a strong new signal with dipeptidylpeptidase-4 inhibitors and venous thromboembolism. The developed system seems promising for the routine use of healthcare data for safety surveillance of drugs used in chronic diseases.
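A minimal sketch of the crude sequence ratio at the heart of sequence symmetry analysis: among patients exposed to both the drug and the event, compare how often the drug precedes the event with the reverse order. The patient records are invented, and the trend-adjusted null-effect ratio used in practice is omitted.

```python
# Sketch of the crude sequence ratio behind sequence symmetry analysis: among
# patients with both the drug and the event, compare how often the drug comes
# first versus the event coming first. Patient records are invented and the
# trend-adjusted null-effect ratio used in practice is omitted.
from datetime import date

patients = [
    {"drug": date(2014, 1, 10), "event": date(2014, 6, 2)},
    {"drug": date(2014, 3, 5),  "event": date(2014, 3, 30)},
    {"drug": date(2015, 8, 1),  "event": date(2015, 2, 11)},
    {"drug": date(2013, 5, 20), "event": date(2013, 9, 14)},
]

drug_first = sum(p["drug"] < p["event"] for p in patients)
event_first = sum(p["event"] < p["drug"] for p in patients)
crude_sr = drug_first / event_first if event_first else float("inf")
print(f"drug first: {drug_first}, event first: {event_first}, crude sequence ratio = {crude_sr:.2f}")
```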
Case series analysis of hindfoot injuries sustained by drivers in frontal motor vehicle crashes.
Ye, Xin; Funk, James; Forbes, Aaron; Hurwitz, Shepard; Shaw, Greg; Crandall, Jeff; Freeth, Rob; Michetti, Chris; Rudd, Rodney; Scarboro, Mark
2015-09-01
Improvements to vehicle frontal crashworthiness have led to reductions in toe pan and instrument panel intrusions as well as leg, foot, and ankle loadings in standardized crash tests. Current field data, however, suggest that the proportion of foot and ankle injuries sustained by drivers in frontal crashes has not decreased over the past two decades. To explain the inconsistency between crash test results and real-world lower limb injury prevalence, this study investigated the injury causation scenarios for the specific hindfoot injury patterns observed in frontal vehicle crashes. Thirty-four cases with leg, foot, and ankle injuries were selected from the Crash Injury Research and Engineering Network (CIREN) database. Talus fractures were present in 20 cases, representing the most frequent hindfoot skeletal injuries observed among the reviewed cases. While axial compression was the predominant loading mechanism, causing 18 injuries, 11 injured ankles involved inversion or eversion motion, and 5 involved dorsiflexion as the injury mechanism. Injured ankles of drivers were more biased towards the right aspect, with foot pedals contributing to injuries in 13 of the 34 cases. Combined, the results suggest that despite recent advancement of vehicle performance in crash tests, efforts to reduce axial forces sustained in the lower extremity should be prioritized. The analysis of injury mechanisms in this study could aid in crash reconstructions and the development of safety systems for vehicles. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Stakeholder participation in health impact assessment: A multicultural approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Negev, Maya, E-mail: mayane@tau.ac.il; Davidovitch, Nadav, E-mail: nadavd@bgu.ac.il; Garb, Yaakov, E-mail: ygarb@bgu.ac.il
2013-11-15
The literature on health impact assessment (HIA) registers the importance of stakeholder participation in the assessment process, but still lacks a model for engaging stakeholders of diverse ethnic, professional and sectorial backgrounds. This paper suggests that the multicultural approach can contribute to HIA through a revision of the generic 5-step HIA model, and its implementation in a metropolitan plan in Southern Israel. The health issue scoped by the stakeholders in the HIA is related to land uses in the vicinity of the national hazardous industry and hazardous waste site. The stakeholders were representatives of the diverse populations at stake, including rural Bedouins and Jewish city dwellers, as well as representatives from the public sector, private sector, non-governmental organizations and academia. The case study revealed that a multicultural stakeholder participation process helps to uncover health issues known to the community which were not addressed in the original plan, and provides local knowledge regarding health conditions that is especially valuable when scientific data is uncertain or absent. It enables diverse stakeholders to prioritize the health issues that will be assessed. The case study also reveals ways in which the model needs revisions and improvements, such as in recruitment of diverse participants. This paper presents a multicultural model of HIA and discusses some of the challenges that are faced when HIA is implemented in the context of current decision-making culture. -- Highlights: • We revised the generic HIA model in light of the multicultural approach. • We tested the model in a case study of zoning a hazardous industry site. • Multicultural stakeholder participation uncovers health issues known to communities. • It enables community prioritization of health issues. • We present a model for multicultural stakeholder participation in HIA.
Sleep restriction can attenuate prioritization benefits on declarative memory consolidation.
Lo, June C; Bennion, Kelly A; Chee, Michael W L
2016-12-01
As chronic sleep restriction is a widespread problem among adolescents, the present study investigated the effects of a 1-week sleep restriction (SR) versus control period on the consolidation of long-term memory for prose passages. We also determined whether the benefit of prioritization on memory is modulated by adequate sleep occurring during consolidation. Fifty-six healthy adolescents (25 male, aged 15-19 years) were instructed to remember a prose passage in which half of the content was highlighted (prioritized), and were told that they would receive an additional bonus for remembering highlighted content. Following an initial free recall test, participants underwent a 7-night period in which they received either a 5-h (SR) or 9-h (control) nightly sleep opportunity, monitored by polysomnography on selected nights. Free recall of the passage was tested at the end of the sleep manipulation period (1 week after encoding), and again 6 weeks after encoding. Recall of highlighted content was superior to that of non-highlighted content at all three time-points (initial, 1 week, 6 weeks). This beneficial effect of prioritization on memory was stronger 1 week relative to a few minutes after encoding for the control, but not the SR group. N3 duration was similar in the control and SR groups. Overall, the present study shows that the benefits of prioritization on memory are enhanced over time, requiring time and sleep to unfold fully. Partial sleep deprivation (i.e. 5-h nocturnal sleep opportunity) may attenuate such benefits, but this may be offset by preservation of N3 sleep duration. © 2016 The Authors. Journal of Sleep Research published by John Wiley & Sons Ltd on behalf of European Sleep Research Society.
Cheng, Yi-Ru; Martin, Thomas E.
2012-01-01
Different body components are thought to trade off in their growth and development rates, but the causes of relative prioritization of any trait remain a critical question. Offspring of species at higher risk of predation might prioritize development of locomotor traits that facilitate escaping risky environments over growth of mass. We tested this possibility in 12 altricial passerine species that differed in their risk of nest predation. We found that rates of growth and development of mass, wings, and endothermy increased with nest predation risk across species. In particular, species with higher nest predation risk exhibited relatively faster growth of wings than of mass, fledged with relatively larger wing sizes and smaller mass, and developed endothermy earlier at relatively smaller mass. This differential development can facilitate both escape from predators and survival outside of the nest environment. Tarsus growth was not differentially prioritized with respect to nest predation risk, and instead all species achieved adult tarsus size by age of fledging. We also tested whether different foraging modes (aerial, arboreal, and ground foragers) might explain the variation of differential growth of locomotor modules, but we found that little residual variation was explained. Our results suggest that differences in nest predation risk among species are associated with relative prioritization of body components to facilitate escape from the risky nest environment.
Padula, William V; Millis, M Andrew; Worku, Aelaf D; Pronovost, Peter J; Bridges, John F P; Meltzer, David O
2017-03-01
To develop cases of preference-sensitive care and analyze the individualized cost-effectiveness of respecting patient preference compared to guidelines. Four cases were analyzed comparing patient preference to guidelines: (a) high-risk cancer patient preferring to forgo colonoscopy; (b) decubitus patient preferring to forgo air-fluidized bed use; (c) anemic patient preferring to forgo transfusion; (d) end-of-life patient requesting all resuscitative measures. Decision trees were modeled to analyze cost-effectiveness of alternative treatments that respect preference compared to guidelines in USD per quality-adjusted life year (QALY) at a $100,000/QALY willingness-to-pay threshold from patient, provider and societal perspectives. Forgoing colonoscopy dominates colonoscopy from patient, provider, and societal perspectives. Forgoing transfusion and air-fluidized bed are cost-effective from all three perspectives. Palliative care is cost-effective from provider and societal perspectives, but not from the patient perspective. Prioritizing incorporation of patient preferences within guidelines holds good value and should be prioritized when developing new guidelines.
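A minimal sketch of the incremental cost-effectiveness comparison against the $100,000/QALY willingness-to-pay threshold used above; the costs and QALYs are invented numbers, not values from the study's decision trees.

```python
# Sketch of the incremental cost-effectiveness comparison against a
# willingness-to-pay threshold. Costs and QALYs are invented, not study values.
WTP = 100_000  # USD per QALY

def icer(cost_pref, qaly_pref, cost_guideline, qaly_guideline):
    """Incremental cost per QALY of respecting preference vs. following the guideline."""
    d_cost = cost_pref - cost_guideline
    d_qaly = qaly_pref - qaly_guideline
    if d_qaly > 0 and d_cost <= 0:
        return "dominant"        # cheaper and at least as effective
    if d_qaly <= 0 and d_cost >= 0:
        return "dominated"       # costlier and no more effective
    return d_cost / d_qaly       # USD per QALY gained

result = icer(cost_pref=2_000, qaly_pref=9.80, cost_guideline=3_500, qaly_guideline=9.78)
if result == "dominant" or (isinstance(result, float) and result < WTP):
    print(result, "-> cost-effective at the threshold")
else:
    print(result, "-> not cost-effective at the threshold")
```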
Ram-Tiktin, Efrat
2017-07-01
Natural disasters in populated areas may result in massive casualties and extensive destruction of infrastructure. Humanitarian aid delegations may have to cope with the complicated issue of patient prioritization under conditions of severe resource scarcity. A triage model, consisting of five principles, is proposed for the prioritization of patients, and it is argued that rational and reasonable agents would agree upon them. The Israel Defense Force's humanitarian mission to Haiti following the 2010 earthquake serves as a case study for the various considerations taken into account when designing the ethical-clinical policy of field hospitals. The discussion focuses on three applications: the decision to include an intensive care unit, the decision to include obstetrics and neonatal units, and the treatment policy for compound fractures. © 2017 John Wiley & Sons Ltd.
Townsend, Kristen; Corry, James M; Quigley, Beth Hogan; George, Maureen
2012-02-01
Allergic asthma is common in urban minority children and evidence suggests that remediation tailored to the child's allergic profile is the most effective management strategy. The purpose of this pilot study therefore was to examine the caregiver's recall of their child's skin test results and the accuracy of planned remediation ∼4 months after testing. Caregivers were asked to recall their child's skin test results ∼4 months after their skin testing but before any follow-up visit. A Q-sort was then used to determine the knowledge of the recommended remediation. In this Q-sort, caregivers placed 52 cards, each representing one intervention for an indoor allergen, on a response board that prioritized the interventions. At the conclusion of the Q-sort, caregivers received feedback on the accuracy of their recall and prioritization. African American caregivers (5 females; mean age 33.6) of 5 children (4 males; mean age 7.8) were enrolled. No caregiver's recall of skin test results was concordant with the actual results for type or number of allergens. Caregiver's accuracy in prioritizing strategies was 33-100% for cat dander, 40-70% for molds, 70-87% for dust mite allergens, and 100% for the one dog allergic child. Subjects preferred Q-sort to traditional methods of receiving remediation education. Caregivers do not accurately recall skin test results and this may, in part, impede their ability to implement appropriate interventions. A low-literacy game-style approach is a novel strategy to provide complex teaching that warrants further investigation.
Developmental neurotoxicity testing (DNT) is perceived by many stakeholders to be an area in critical need of alternative methods to current animal testing protocols and guidelines. An immediate goal is to develop test methods that are capable of screening large numbers of chemic...
Wangdell, Johanna; Fridén, Jan
2011-06-01
To investigate the correlation between perceived performance in prioritized activities and physical conditions related to grip reconstruction. Retrospective clinical outcome study. Forty-seven individuals with tetraplegia were included in the study. Each participant underwent tendon transfer surgery in the hand between November 2002 and April 2009 and had a complete 1-year follow-up. Functional characteristics and performance data were collected from our database and medical records. Patients' perceived performance in prioritized activities was recorded using the Canadian Occupational Performance Measure. Preoperative data included age at surgery, time since injury, severity of injury, sensibility and hand dominance. At 1-year follow-up, grip strength, key pinch strength, finger pulp-to-palm distance, distance between thumb and index finger and wrist flexion were measured. A rank correlation coefficient was used to test the possible relationship between physical data and activity performance. There were improvements in both functional factors and in rated performance of prioritized activities after surgery. There was no correlation between performance change and any of the physical functions, the factors known before surgery, or the functional outcome factors. No correlation exists between a single functional outcome parameter and the patients' perceived performance of their prioritized goals in reconstructive hand surgery in tetraplegia.
Case Study 3: Species vulnerability assessment for the Middle Rio Grande, New Mexico
Deborah M. Finch; Megan Friggens; Karen Bagne
2011-01-01
This case study describes a method for scoring terrestrial species that have potential to be vulnerable to climate change. The assessment tool seeks to synthesize complex information related to projected climate changes into a predictive tool for species conservation. The tool was designed to aid managers in prioritizing species management actions in response to...
Andrews, Tessa C.; Lemons, Paula P.
2015-01-01
Despite many calls for undergraduate biology instructors to incorporate active learning into lecture courses, few studies have focused on what it takes for instructors to make this change. We sought to investigate the process of adopting and sustaining active-learning instruction. As a framework for our research, we used the innovation-decision model, a generalized model of how individuals adopt innovations. We interviewed 17 biology instructors who were attempting to implement case study teaching and conducted qualitative text analysis on interview data. The overarching theme that emerged from our analysis was that instructors prioritized personal experience—rather than empirical evidence—in decisions regarding case study teaching. We identified personal experiences that promote case study teaching, such as anecdotal observations of student outcomes, and those that hinder case study teaching, such as insufficient teaching skills. By analyzing the differences between experienced and new case study instructors, we discovered that new case study instructors need support to deal with unsupportive colleagues and to develop the skill set needed for an active-learning classroom. We generated hypotheses that are grounded in our data about effectively supporting instructors in adopting and sustaining active-learning strategies. We also synthesized our findings with existing literature to tailor the innovation-decision model. PMID:25713092
Modeling a Civil Event Case Study for Consequence Management Using the IMPRINT Forces Module
NASA Technical Reports Server (NTRS)
Gacy, Marc; Gosakan, Mala; Eckdahl, Angela; Miller, Jeffrey R.
2012-01-01
A critical challenge in the Consequence Management (CM) domain is the appropriate allocation of necessary and skilled military and civilian personnel and materiel resources in unexpected emergencies. To aid this process we used the Forces module in the Improved Performance Research Integration Tool (IMPRINT). This module enables analysts to enter personnel and equipment capabilities, prioritized schedules and numbers available, along with unexpected emergency requirements in order to assess force response requirements. Using a suspected terrorist threat on a college campus, we developed a test case model which exercised the capabilities of the module, including the scope and scale of operations. The model incorporates data from multiple sources, including daily schedules and frequency of events such as fire calls. Our preliminary results indicate that the model can predict potential decreases in civilian emergency response coverage due to an involved unplanned incident requiring significant portions of police, fire and civil responses teams.
A support system for assessing local vulnerability to weather and climate
Coletti, Alex; Howe, Peter D.; Yarnal, Brent; Wood, Nathan J.
2013-01-01
The changing number and nature of weather- and climate-related natural hazards is causing more communities to need to assess their vulnerabilities. Vulnerability assessments, however, often require considerable expertise and resources that are not available or too expensive for many communities. To meet the need for an easy-to-use, cost-effective vulnerability assessment tool for communities, a prototype online vulnerability assessment support system was built and tested. This prototype tool guides users through a stakeholder-based vulnerability assessment that breaks the process into four easy-to-implement steps. Data sources are integrated in the online environment so that perceived risks—defined and prioritized qualitatively by users—can be compared and discussed against the impacts that past events have had on the community. The support system is limited in scope, and the locations of the case studies do not provide a sufficiently broad range of sample cases. The addition of more publically available hazard databases combined with future improvements in the support system architecture and software will expand opportunities for testing and fully implementing the support system.
Non-linear assessment and deficiency of linear relationship for healthcare industry
NASA Astrophysics Data System (ADS)
Nordin, N.; Abdullah, M. M. A. B.; Razak, R. C.
2017-09-01
This paper presents the development of a non-linear service satisfaction model that assumes patients are not necessarily satisfied or dissatisfied with good or poor service delivery. Accordingly, compliment and complaint assessments are considered simultaneously. Non-linear service satisfaction instruments, called Kano-Q and Kano-SS, are developed based on the Kano model and the Theory of Quality Attributes (TQA) to map unexpected, hidden and unspoken patient satisfaction and dissatisfaction onto service quality attributes. A new Kano-Q and Kano-SS algorithm for quality attribute assessment is developed based on satisfaction impact theories and is found to satisfy reliability and validity tests. The results were also validated using the standard Kano model procedure before the Kano model and Quality Function Deployment (QFD) were integrated for patient attribute and service attribute prioritization. An algorithm for the Kano-QFD matrix operation is developed to compose the prioritized complaint and compliment indexes. Finally, the prioritized service attributes are mapped to service delivery categories to determine the most prioritized service delivery that needs to be improved first by the healthcare service provider.
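For orientation, a minimal sketch of the textbook Kano classification that instruments of this kind build on, mapping a functional/dysfunctional answer pair to a quality attribute category; this is not the Kano-Q or Kano-SS algorithm itself, which the abstract does not specify.

```python
# Sketch of the textbook Kano classification from a functional answer ("How do
# you feel if the attribute is present?") and a dysfunctional answer ("... if it
# is absent?"). This is the standard Kano table, not the Kano-Q / Kano-SS
# algorithm developed in the paper.
KANO_TABLE = {  # (functional, dysfunctional) -> quality attribute category
    ("like", "dislike"): "One-dimensional",
    ("like", "must-be"): "Attractive",
    ("like", "neutral"): "Attractive",
    ("like", "live-with"): "Attractive",
    ("must-be", "dislike"): "Must-be",
    ("neutral", "dislike"): "Must-be",
    ("live-with", "dislike"): "Must-be",
    ("like", "like"): "Questionable",
    ("dislike", "dislike"): "Questionable",
}

def classify(functional, dysfunctional):
    if functional == dysfunctional and functional not in ("like", "dislike"):
        return "Indifferent"
    return KANO_TABLE.get((functional, dysfunctional), "Indifferent/Reverse")

print(classify("like", "dislike"))     # -> One-dimensional
print(classify("neutral", "dislike"))  # -> Must-be
print(classify("neutral", "neutral"))  # -> Indifferent
```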
Safety first: Instrumentality for reaching safety determines attention allocation under threat.
Vogt, Julia; Koster, Ernst H W; De Houwer, Jan
2017-04-01
Theories of attention to emotional information suggest that attentional processes prioritize threatening information. In this article, we suggest that attention will prioritize the events that are most instrumental to a goal in any given context, which in threatening situations is typically reaching safety. To test our hypotheses, we used an attentional cueing paradigm that contained cues signaling imminent threat (i.e., aversive noises) as well as cues that allowed participants to avoid threat (instrumental safety signals). Correct reactions to instrumental safety signals seemingly allowed participants to lower the presentation rate of the threat. Experiment 1 demonstrates that attention prioritizes instrumental safety signals over threat signals. Experiment 2 replicates this finding and additionally compares instrumental safety signals to other action-relevant signals, controlling for action relevance as the cause of the effects. Experiment 3 demonstrates that when actions toward threat signals permit avoidance of the threat, attention prioritizes threat signals. Taken together, these results support the view that instrumentality for reaching safety determines the allocation of attention under threat. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
In Vitro Testing of Engineered Nanomaterials in the EPA’s ToxCast Program (WC9)
High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...
Most nanomaterials (NMs) in commerce lack hazard data. Efficient NM testing requires suitable toxicity tests for prioritization of NMs to be tested. The EPA’s ToxCast program is screening NM bioactivities and ranking NMs by their bioactivities to inform targeted testing planning....
Initial Technology Assessment for the Large UV-Optical-Infrared (LUVOIR) Mission Concept Study
NASA Technical Reports Server (NTRS)
Bolcar, Matthew R.; Feinberg, Lee D.; France, Kevin; Rauscher, Bernard J.; Redding, David; Schiminovich, David
2016-01-01
The NASA Astrophysics Division's 30-Year Roadmap prioritized a future large-aperture space telescope operating in the ultraviolet/optical/infrared wavelength regime. The Association of Universities for Research in Astronomy envisioned a similar observatory, the High Definition Space Telescope. And a multi-institution group also studied the Advanced Technology Large Aperture Space Telescope. In all three cases, a broad science case is outlined, combining general astrophysics with the search for biosignatures via direct imaging and spectroscopic characterization of habitable exoplanets. We present an initial technology assessment that enables such an observatory that is currently being studied for the 2020 Decadal Survey by the Large UV-Optical-Infrared (LUVOIR) surveyor Science and Technology Definition Team. We present here the technology prioritization for the 2016 technology cycle and define the required technology capabilities and current state-of-the-art performance. Current, planned, and recommended technology development efforts are also reported.
Miller, Thaddeus L; Hilsenrath, Peter; Lykens, Kristine; McNabb, Scott J N; Moonan, Patrick K; Weis, Stephen E
2006-04-01
Evaluation improves efficiency and effectiveness. Current U.S. tuberculosis (TB) control policies emphasize the treatment of latent TB infection (LTBI). However, this policy, if not targeted, may be inefficient. We determined the efficiency of a state-law mandated TB screening program and a non state-law mandated one in terms of cost, morbidity, treatment, and disease averted. We evaluated two publicly funded metropolitan TB prevention and control programs through retrospective analyses and modeling. Main outcomes measured were TB incidence and prevalence, TB cases averted, and cost. A non state-law mandated TB program for homeless persons in Tarrant County screened 4.5 persons to identify one with LTBI and 82 persons to identify one with TB. A state-law mandated TB program for jail inmates screened 109 persons to identify one with LTBI and 3274 persons to identify one with TB. The number of patients with LTBI treated to prevent one TB case was 12.1 and 15.3 for the homeless and jail inmate TB programs, respectively. Treatment of LTBI by the homeless and jail inmate TB screening programs will avert 11.9 and 7.9 TB cases at a cost of 14,350 US dollars and 34,761 US dollars per TB case, respectively. Mandated TB screening programs should be risk-based, not population-based. Non mandated targeted testing for TB in congregate settings for the homeless was more efficient than state-law mandated targeted testing for TB among jailed inmates.
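A minimal sketch of the screening-yield arithmetic reported above; the per-LTBI and per-case figures are taken from the abstract, while the program cost is a hypothetical placeholder and the calculation assumes every identified LTBI case is treated, which the abstract does not state.

```python
# Illustrative arithmetic linking screening yield, treatment, and cost per case
# averted. The per-LTBI and per-case figures come from the abstract; the program
# cost is a hypothetical placeholder, and the calculation assumes every
# identified LTBI case is treated, which the abstract does not state.
def screened_per_case_averted(screened_per_ltbi, ltbi_treated_per_case_averted):
    # people screened to find one LTBI, times LTBI treatments needed to avert one TB case
    return screened_per_ltbi * ltbi_treated_per_case_averted

def cost_per_case_averted(total_program_cost, cases_averted):
    return total_program_cost / cases_averted

homeless = screened_per_case_averted(4.5, 12.1)   # figures reported in the abstract
jail = screened_per_case_averted(109, 15.3)
print(f"homeless program: ~{homeless:.0f} screened per TB case averted")
print(f"jail program: ~{jail:.0f} screened per TB case averted")
print(f"hypothetical $170,000 program averting 11.9 cases: "
      f"${cost_per_case_averted(170_000, 11.9):,.0f} per case averted")
```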
An integrative formal model of motivation and decision making: The MGPM*.
Ballard, Timothy; Yeo, Gillian; Loft, Shayne; Vancouver, Jeffrey B; Neal, Andrew
2016-09-01
We develop and test an integrative formal model of motivation and decision making. The model, referred to as the extended multiple-goal pursuit model (MGPM*), is an integration of the multiple-goal pursuit model (Vancouver, Weinhardt, & Schmidt, 2010) and decision field theory (Busemeyer & Townsend, 1993). Simulations of the model generated predictions regarding the effects of goal type (approach vs. avoidance), risk, and time sensitivity on prioritization. We tested these predictions in an experiment in which participants pursued different combinations of approach and avoidance goals under different levels of risk. The empirical results were consistent with the predictions of the MGPM*. Specifically, participants pursuing 1 approach and 1 avoidance goal shifted priority from the approach to the avoidance goal over time. Among participants pursuing 2 approach goals, those with low time sensitivity prioritized the goal with the larger discrepancy, whereas those with high time sensitivity prioritized the goal with the smaller discrepancy. Participants pursuing 2 avoidance goals generally prioritized the goal with the smaller discrepancy. Finally, all of these effects became weaker as the level of risk increased. We used quantitative model comparison to show that the MGPM* explained the data better than the original multiple-goal pursuit model, and that the major extensions from the original model were justified. The MGPM* represents a step forward in the development of a general theory of decision making during multiple-goal pursuit. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Bennion, Kelly A; Payne, Jessica D; Kensinger, Elizabeth A
2016-06-01
Prior research has demonstrated that sleep enhances memory for future-relevant information, including memory for information that is salient due to emotion, reward, or knowledge of a later memory test. Although sleep has been shown to prioritize information with any of these characteristics, the present study investigates the novel question of how sleep prioritizes information when multiple salience cues exist. Participants encoded scenes that were future-relevant based on emotion (emotional vs. neutral), reward (rewarded vs. unrewarded), and instructed learning (intentionally vs. incidentally encoded), preceding a delay consisting of a nap, an equivalent time period spent awake, or a nap followed by wakefulness (to control for effects of interference). Recognition testing revealed that when multiple dimensions of future relevance co-occur, sleep prioritizes top-down, goal-directed cues (instructed learning, and to a lesser degree, reward) over bottom-up, stimulus-driven characteristics (emotion). Further, results showed that these factors interact; the effect of a nap on intentionally encoded information was especially strong for neutral (relative to emotional) information, suggesting that once one cue for future relevance is present, there are diminishing returns with additional cues. Sleep may binarize information based on whether it is future-relevant or not, preferentially consolidating memory for the former category. Potential neural mechanisms underlying these selective effects and the implications of this research for educational and vocational domains are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Qaradaghi, Mohammed
Complexity of the capital-intensive oil and gas portfolio investments is continuously growing. It is manifested in the constant increase in the type, number and degree of risks and uncertainties, which consequently lead to more challenging decision making problems. A typical complex decision making problem in petroleum exploration and production (E&P) is the selection and prioritization of oilfields/projects in a portfolio investment. Prioritizing oilfields may be required for different purposes, including the achievement of a targeted production and allocation of limited available development resources. These resources cannot be distributed evenly nor can they be allocated based on the oilfield size or production capacity alone since various other factors need to be considered simultaneously. These factors may include subsurface complexity, size of reservoir, plateau production and needed infrastructure in addition to other issues of strategic concern, such as socio-economic, environmental and fiscal policies, particularly when the decision making involves governments or national oil companies. Therefore, it would be imperative to employ decision aiding tools that not only address these factors, but also incorporate the decision makers' preferences clearly and accurately. However, the tools commonly used in project portfolio selection and optimization, including intuitive approaches, vary in their focus and strength in addressing the different criteria involved in such decision problems. They are also disadvantaged by a number of drawbacks, which may include lacking the capacity to address multiple and interrelated criteria, uncertainty and risk, project relationship with regard to value contribution and optimum resource utilization, non-monetary attributes, and decision makers' knowledge and expertise, in addition to varying levels of ease of use and other practical and theoretical drawbacks. These drawbacks have motivated researchers to investigate other tools and techniques that can provide more flexibility and inclusiveness in the decision making process, such as Multi-Criteria Decision Making (MCDM) methods. However, it can be observed that the MCDM literature: 1) is primarily focused on suggesting certain MCDM techniques for specific problems without providing sufficient evidence for their selection, 2) is inadequate in addressing MCDM in E&P portfolio selection and prioritization compared with other fields, and 3) does not address prioritizing brownfields (i.e., developed oilfields). This research study aims at addressing the above drawbacks by combining three MCDM methods (i.e., AHP, PROMETHEE and TOPSIS) into a single decision making tool that can support optimal oilfield portfolio investment decisions by helping determine the share of each oilfield of the total development resources allocated. Selecting these methods is reinforced by a pre-deployment and post-deployment validation framework. In addition, this study proposes a two-dimensional consistency test to verify the output coherence or prioritization stability of the MCDM methods in comparison with an intuitive approach. Nine scenarios representing all possible outcomes of the internal and external consistency tests are further proposed to reach a conclusion. The methodology is applied to a case study of six major oilfields in Iraq to generate percentage shares of each oilfield of a total production target that is in line with Iraq's aspiration to increase oil production. However, the methodology is intended to be applicable to other E&P portfolio investment prioritization scenarios by taking the specific contextual characteristics into consideration.
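Of the three MCDM methods combined in this tool, TOPSIS is the most compact to sketch. The following Python sketch implements a generic TOPSIS ranking over a hypothetical decision matrix of oilfields; the criteria, weights, and scores are placeholders, and the study's AHP-derived weights and PROMETHEE outranking flows are not reproduced here.

```python
import numpy as np

def topsis(matrix: np.ndarray, weights: np.ndarray, benefit: np.ndarray) -> np.ndarray:
    """Return TOPSIS closeness coefficients (higher = better).

    matrix  : alternatives x criteria performance scores
    weights : criterion weights summing to 1 (e.g., taken from an AHP step)
    benefit : True where larger values are better, False for cost criteria
    """
    # Vector-normalize each criterion column, then apply weights.
    norm = matrix / np.linalg.norm(matrix, axis=0)
    v = norm * weights
    # Ideal and anti-ideal points depend on criterion direction.
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Hypothetical oilfields scored on plateau production (benefit), subsurface
# complexity (cost), and infrastructure need (cost).
scores = np.array([[400.0, 3.0, 2.0],
                   [250.0, 1.0, 4.0],
                   [600.0, 5.0, 5.0]])
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, False, False])
print(topsis(scores, weights, benefit))  # relative priority of each oilfield
```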
Testing of environmental and industrial chemicals for toxicity potential is a daunting task because of the wide range of possible toxicity mechanisms. Although animal testing is one means of achieving broad toxicity coverage, evaluation of large numbers of chemicals is challengin...
CHEK2, MGMT, SULT1E1 and SULT1A1 polymorphisms and endometrial cancer risk.
O'Mara, Tracy A; Ferguson, Kaltin; Fahey, Paul; Marquart, Louise; Yang, Hannah P; Lissowska, Jolanta; Chanock, Stephen; Garcia-Closas, Montserrat; Thompson, Deborah J; Healey, Catherine S; Dunning, Alison M; Easton, Douglas F; Webb, Penelope M; Spurdle, Amanda B
2011-08-01
Several single nucleotide polymorphisms (SNPs) in candidate genes of DNA repair and hormone pathways have been reported to be associated with endometrial cancer risk. We sought to confirm these associations in two endometrial cancer case-control sample sets and used additional data from an existing genome-wide association study to prioritize an additional SNP for further study. Five SNPs from the CHEK2, MGMT, SULT1E1 and SULT1A1 genes, genotyped in a total of 1597 cases and 1507 controls from two case-control studies, the Australian National Endometrial Cancer Study and the Polish Endometrial Cancer Study, were assessed for association with endometrial cancer risk using logistic regression analysis. Imputed data were drawn for CHEK2 rs8135424 for 666 cases from the Study of Epidemiology and Risk factors in Cancer Heredity study and 5190 controls from the Wellcome Trust Case Control Consortium. We observed no association between SNPs in the MGMT, SULT1E1 and SULT1A1 genes and endometrial cancer risk. The A allele of the rs8135424 CHEK2 SNP was associated with decreased risk of endometrial cancer (adjusted per-allele OR 0.83; 95% CI 0.70-0.98; p = 0.03); however, this finding was opposite to that previously published. Imputed data for CHEK2 rs8135424 supported the direction of effect reported in this study (OR 0.85; 95% CI 0.65-1.10). Previously reported endometrial cancer risk associations with SNPs in genes involved in estrogen metabolism and DNA repair were not replicated in our larger study population. This study highlights the need for replication of candidate gene SNP studies using large sample groups, to confirm risk associations and better prioritize downstream studies to assess the causal relationship between genetic variants and cancer risk. Our findings suggest that the CHEK2 SNP rs8135424 be prioritized for further study as a genetic factor associated with risk of endometrial cancer.
The price of conserving avian phylogenetic diversity: a global prioritization approach
Nunes, Laura A.; Turvey, Samuel T.; Rosindell, James
2015-01-01
The combination of rapid biodiversity loss and limited funds available for conservation represents a major global concern. While there are many approaches for conservation prioritization, few are framed as financial optimization problems. We use recently published avian data to conduct a global analysis of the financial resources required to conserve different quantities of phylogenetic diversity (PD). We introduce a new prioritization metric (ADEPD) that After Downlisting a species gives the Expected Phylogenetic Diversity at some future time. Unlike other metrics, ADEPD considers the benefits to future PD associated with downlisting a species (e.g. moving from Endangered to Vulnerable in the International Union for Conservation of Nature Red List). Combining ADEPD scores with data on the financial cost of downlisting different species provides a cost–benefit prioritization approach for conservation. We find that under worst-case spending $3915 can save 1 year of PD, while under optimal spending $1 can preserve over 16.7 years of PD. We find that current conservation spending patterns are only expected to preserve one quarter of the PD that optimal spending could achieve with the same total budget. Maximizing PD is only one approach within the wider goal of biodiversity conservation, but our analysis highlights more generally the danger involved in uninformed spending of limited resources. PMID:25561665
The price of conserving avian phylogenetic diversity: a global prioritization approach.
Nunes, Laura A; Turvey, Samuel T; Rosindell, James
2015-02-19
The combination of rapid biodiversity loss and limited funds available for conservation represents a major global concern. While there are many approaches for conservation prioritization, few are framed as financial optimization problems. We use recently published avian data to conduct a global analysis of the financial resources required to conserve different quantities of phylogenetic diversity (PD). We introduce a new prioritization metric (ADEPD) that After Downlisting a species gives the Expected Phylogenetic Diversity at some future time. Unlike other metrics, ADEPD considers the benefits to future PD associated with downlisting a species (e.g. moving from Endangered to Vulnerable in the International Union for Conservation of Nature Red List). Combining ADEPD scores with data on the financial cost of downlisting different species provides a cost-benefit prioritization approach for conservation. We find that under worst-case spending $3915 can save 1 year of PD, while under optimal spending $1 can preserve over 16.7 years of PD. We find that current conservation spending patterns are only expected to preserve one quarter of the PD that optimal spending could achieve with the same total budget. Maximizing PD is only one approach within the wider goal of biodiversity conservation, but our analysis highlights more generally the danger involved in uninformed spending of limited resources.
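Computing ADEPD itself requires a dated phylogeny and extinction-probability data, which are beyond the scope of a short sketch. The Python snippet below only illustrates the final cost-benefit step described above: ranking species by the expected phylogenetic diversity preserved per dollar of downlisting cost, using invented figures.

```python
# Minimal sketch of a cost-benefit ranking step, assuming each species already
# has an expected-PD benefit (years of PD preserved) and a downlisting cost.
# All values below are invented for illustration.
species = {
    "species_a": {"pd_benefit_years": 12.0, "downlisting_cost": 40_000.0},
    "species_b": {"pd_benefit_years": 3.5,  "downlisting_cost": 2_500.0},
    "species_c": {"pd_benefit_years": 20.0, "downlisting_cost": 900_000.0},
}

ranked = sorted(
    species.items(),
    key=lambda kv: kv[1]["pd_benefit_years"] / kv[1]["downlisting_cost"],
    reverse=True,
)
for name, info in ranked:
    per_dollar = info["pd_benefit_years"] / info["downlisting_cost"]
    print(f"{name}: {per_dollar:.6f} years of PD per dollar")
```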
An "EAR" on environmental surveillance and monitoring: A ...
Current environmental monitoring approaches focus primarily on chemical occurrence. However, based on chemical concentration alone, it can be difficult to identify which compounds may be of toxicological concern for prioritization for further monitoring or management. This can be problematic because toxicological characterization is lacking for many emerging contaminants. New sources of high throughput screening data like the ToxCast™ database, which contains data for over 9,000 compounds screened through up to 1,100 assays, are now available. Integrated analysis of chemical occurrence data with HTS data offers new opportunities to prioritize chemicals, sites, or biological effects for further investigation based on concentrations detected in the environment linked to relative potencies in pathway-based bioassays. As a case study, chemical occurrence data from a 2012 study in the Great Lakes Basin along with the ToxCast™ effects database were used to calculate exposure-activity ratios (EARs) as a prioritization tool. Technical considerations of data processing and use of the ToxCast™ database are presented and discussed. EAR prioritization identified multiple sites, biological pathways, and chemicals that warrant further investigation. Biological pathways were then linked to adverse outcome pathways to identify potential adverse outcomes and biomarkers for use in subsequent monitoring efforts.
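At its core, an exposure-activity ratio is a measured environmental concentration divided by a bioactivity concentration (e.g., an AC50 or ACC from a ToxCast assay) expressed in the same units. The Python sketch below shows that calculation with invented concentrations; unit conversions and the handling of inactive assays, which the abstract flags as technical considerations, are omitted.

```python
# Minimal EAR sketch: EAR = environmental concentration / bioactivity concentration,
# both in the same units (here micromolar). All data below are invented placeholders.
measured_conc_uM = {"chem_x": 0.004, "chem_y": 0.12}
bioassay_ac50_uM = {
    "chem_x": {"assay_er_agonist": 1.5, "assay_ar_antagonist": 30.0},
    "chem_y": {"assay_er_agonist": 0.4},
}

for chem, conc in measured_conc_uM.items():
    for assay, ac50 in bioassay_ac50_uM.get(chem, {}).items():
        ear = conc / ac50
        flag = "  <-- prioritize" if ear > 0.1 else ""   # illustrative screening cutoff
        print(f"{chem} / {assay}: EAR = {ear:.4f}{flag}")
```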
Grewal, Nivit; Singh, Shailendra; Chand, Trilok
2017-01-01
Owing to the innate noise in the biological data sources, a single source or a single measure do not suffice for an effective disease gene prioritization. So, the integration of multiple data sources or aggregation of multiple measures is the need of the hour. The aggregation operators combine multiple related data values to a single value such that the combined value has the effect of all the individual values. In this paper, an attempt has been made for applying the fuzzy aggregation on the network-based disease gene prioritization and investigate its effect under noise conditions. This study has been conducted for a set of 15 blood disorders by fusing four different network measures, computed from the protein interaction network, using a selected set of aggregation operators and ranking the genes on the basis of the aggregated value. The aggregation operator-based rankings have been compared with the "Random walk with restart" gene prioritization method. The impact of noise has also been investigated by adding varying proportions of noise to the seed set. The results reveal that for all the selected blood disorders, the Mean of Maximal operator has relatively outperformed the other aggregation operators for noisy as well as non-noisy data.
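The specific operators evaluated in the study (e.g., Mean of Maximal) are not reproduced here; the short Python sketch below only illustrates the general idea of fusing several normalized per-gene network measures with different aggregation operators and ranking genes on the aggregated value.

```python
from statistics import mean

# Hypothetical normalized network measures (0-1) for a few candidate genes,
# e.g., degree, closeness, betweenness, and proximity to seed genes.
measures = {
    "GENE1": [0.9, 0.4, 0.7, 0.8],
    "GENE2": [0.5, 0.6, 0.5, 0.4],
    "GENE3": [0.2, 0.9, 0.1, 0.3],
}

aggregators = {
    "min": min,    # pessimistic (t-norm-like) fusion
    "max": max,    # optimistic (t-conorm-like) fusion
    "mean": mean,  # compensatory fusion
}

for name, agg in aggregators.items():
    ranking = sorted(measures, key=lambda g: agg(measures[g]), reverse=True)
    print(name, ranking)
```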
Immunotoxicant screening and prioritization in the 21st century
Current immunotoxicity testing guidance for drugs, high production volume chemicals and pesticides specifies the use of animal models that measure immune function or evaluation of general indicators of immune system health generated in routine toxicity testing. The assays are ...
Burns, Emily E.; Thomas-Oates, Jane; Kolpin, Dana W.; Furlong, Edward T.; Boxall, Alistair B.A.
2017-01-01
Prioritization methodologies are often used for identifying those pharmaceuticals that pose the greatest risk to the natural environment and to focus laboratory testing or environmental monitoring toward pharmaceuticals of greatest concern. Risk-based prioritization approaches, employing models to derive exposure concentrations, are commonly used, but the reliability of these models is unclear. The present study evaluated the accuracy of exposure models commonly used for pharmaceutical prioritization. Targeted monitoring was conducted for 95 pharmaceuticals in the Rivers Foss and Ouse in the City of York (UK). Predicted environmental concentration (PEC) ranges were estimated based on localized prescription, hydrological data, reported metabolism, and wastewater treatment plant (WWTP) removal rates, and were compared with measured environmental concentrations (MECs). For the River Foss, PECs, obtained using highest metabolism and lowest WWTP removal, were similar to MECs. In contrast, this trend was not observed for the River Ouse, possibly because of pharmaceutical inputs unaccounted for by our modeling. Pharmaceuticals were ranked by risk based on either MECs or PECs. With 2 exceptions (dextromethorphan and diphenhydramine), risk ranking based on both MECs and PECs produced similar results in the River Foss. Overall, these findings indicate that PECs may well be appropriate for prioritization of pharmaceuticals in the environment when robust and local data on the system of interest are available and reflective of most source inputs.
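The localized PEC ranges described above are built from prescription volumes, excretion fractions, WWTP removal, and river flow. The Python sketch below shows a deliberately simplified surface-water PEC of that general form, with invented inputs; it is not the parameterization used in the study.

```python
def pec_surface_water(mass_prescribed_kg_per_year: float,
                      fraction_excreted: float,
                      wwtp_removal: float,
                      river_flow_m3_per_day: float) -> float:
    """Very simplified predicted environmental concentration in ng/L.

    Assumes all prescribed mass is used, the stated fraction is excreted
    unchanged to sewer, and the WWTP removes the stated fraction before
    discharge to the receiving river.
    """
    load_ng_per_day = (mass_prescribed_kg_per_year * 1e12 / 365.0   # kg/yr -> ng/day
                       * fraction_excreted * (1.0 - wwtp_removal))
    river_flow_l_per_day = river_flow_m3_per_day * 1000.0
    return load_ng_per_day / river_flow_l_per_day

# Invented example: 500 kg/yr prescribed in the catchment, 30% excreted
# unchanged, 60% WWTP removal, and a river flow of 8e5 m3/day.
print(f"{pec_surface_water(500.0, 0.30, 0.60, 8.0e5):.1f} ng/L")
```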
Monteiro, João Filipe G; Galea, Sandro; Flanigan, Timothy; Monteiro, Maria de Lourdes; Friedman, Samuel R; Marshall, Brandon D L
2015-05-01
We used an individual-based model to evaluate the effects of hypothetical prevention interventions on HIV incidence trajectories in a concentrated, mixed epidemic setting from 2011 to 2021, using Cabo Verde as an example. Simulations were conducted to evaluate the extent to which early HIV treatment and optimization of care, HIV testing, condom distribution, and substance abuse treatment could eliminate new infections (i.e., reduce incidence to less than 10 cases per 10,000 person-years) among non-drug users, female sex workers (FSW), and people who use drugs (PWUD). Scaling up all four interventions resulted in the largest decreases in HIV, with estimates ranging from 1.4 (95 % CI 1.36-1.44) per 10,000 person-years among non-drug users to 8.2 (95 % CI 7.8-8.6) per 10,000 person-years among PWUD in 2021. Intervention scenarios prioritizing FSW and PWUD also resulted in HIV incidence estimates at or below 10 per 10,000 person-years by 2021 for all population sub-groups. Our results suggest that scaling up multiple interventions among the entire population is necessary to achieve elimination. However, prioritizing key populations with this combination prevention strategy may also result in a substantial decrease in total incidence.
External Corrosion Direct Assessment for Unique Threats to Underground Pipelines
DOT National Transportation Integrated Search
2007-11-01
External corrosion direct assessment process (ECDA) implemented in accordance with the NACE Recommended Practice RP0502-02 relies on above ground DA techniques to prioritize locations at risk for corrosion. Two special cases warrant special considera...
FIELD-DRIVEN APPROACHES TO SUBSURFACE CONTAMINANT TRANSPORT MODELING.
Observations from field sites provide a means for prioritizing research activities. In the case of petroleum releases, observations may include spiking of concentration distributions that may be related to water table fluctuation, co-location of contaminant plumes with geochemi...
Understanding the effects of different social data on selecting priority conservation areas.
Karimi, Azadeh; Tulloch, Ayesha I T; Brown, Greg; Hockings, Marc
2017-12-01
Conservation success is contingent on assessing social and environmental factors so that cost-effective implementation of strategies and actions can be placed in a broad social-ecological context. Until now, the focus has been on how to include spatially explicit social data in conservation planning, whereas the value of different kinds of social data has received limited attention. In a regional systematic conservation planning case study in Australia, we examined the spatial concurrence of a range of spatially explicit social values and land-use preferences collected using a public participation geographic information system and biological data. We used Zonation to integrate the social data with the biological data in a series of spatial-prioritization scenarios to determine the effect of the different types of social data on spatial prioritization compared with biological data alone. The type of social data (i.e., conservation opportunities or constraints) significantly affected spatial prioritization outcomes. The integration of social values and land-use preferences under different scenarios was highly variable and generated spatial prioritizations 1.2-51% different from those based on biological data alone. The inclusion of conservation-compatible values and preferences added relatively few new areas to conservation priorities, whereas including noncompatible economic values and development preferences as costs significantly changed conservation priority areas (48.2% and 47.4%, respectively). Based on our results, a multifaceted conservation prioritization approach that combines spatially explicit social data with biological data can help conservation planners identify the type of social data to collect for more effective and feasible conservation actions. © 2017 Society for Conservation Biology.
The challenges of detecting and responding to a Lassa fever outbreak in an Ebola-affected setting.
Hamblion, E L; Raftery, P; Wendland, A; Dweh, E; Williams, G S; George, R N C; Soro, L; Katawera, V; Clement, P; Gasasira, A N; Musa, E; Nagbe, T K
2018-01-01
Lassa fever (LF), a priority emerging pathogen likely to cause major epidemics, is endemic in much of West Africa and is difficult to distinguish from other viral hemorrhagic fevers, including Ebola virus disease (EVD). Definitive diagnosis requires laboratory confirmation, which is not widely available in affected settings. The public health action to contain a LF outbreak and the challenges encountered in an EVD-affected setting are reported herein. In February 2016, a rapid response team was deployed in Liberia in response to a cluster of LF cases. Active case finding, case investigation, contact tracing, laboratory testing, environmental investigation, risk communication, and community awareness raising were undertaken. From January to June 2016, 53 suspected LF cases were reported through the Integrated Disease Surveillance and Response system (IDSR). Fourteen cases (26%) were confirmed for LF, 14 (26%) did not have a sample tested, and 25 (47%) were classified as not a case following laboratory analysis. The case fatality rate in the confirmed cases was 29%. One case of international exportation was reported from Sweden. Difficulties were identified in timely specimen collection, packaging, and transportation (in confirmed cases, the time from sample collection to sample result ranged from 2 to 64 days) and a lack of response interventions for early cases. The delay in response to this outbreak could have been related to a number of challenges in this EVD-affected setting: a need to strengthen the IDSR system, develop preparedness plans, train rapid response teams, and build laboratory capacity. Prioritizing these actions will aid in the timely response to future outbreaks. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Valentini, Giorgio; Paccanaro, Alberto; Caniza, Horacio; Romero, Alfonso E; Re, Matteo
2014-06-01
In the context of "network medicine", gene prioritization methods represent one of the main tools to discover candidate disease genes by exploiting the large amount of data covering different types of functional relationships between genes. Several works proposed to integrate multiple sources of data to improve disease gene prioritization, but to our knowledge no systematic studies focused on the quantitative evaluation of the impact of network integration on gene prioritization. In this paper, we aim at providing an extensive analysis of gene-disease associations not limited to genetic disorders, and a systematic comparison of different network integration methods for gene prioritization. We collected nine different functional networks representing different functional relationships between genes, and we combined them through both unweighted and weighted network integration methods. We then prioritized genes with respect to each of the considered 708 medical subject headings (MeSH) diseases by applying classical guilt-by-association, random walk and random walk with restart algorithms, and the recently proposed kernelized score functions. The results obtained with classical random walk algorithms and the best single network achieved an average area under the curve (AUC) across the 708 MeSH diseases of about 0.82, while kernelized score functions and network integration boosted the average AUC to about 0.89. Weighted integration, by exploiting the different "informativeness" embedded in different functional networks, outperforms unweighted integration at 0.01 significance level, according to the Wilcoxon signed rank sum test. For each MeSH disease we provide the top-ranked unannotated candidate genes, available for further bio-medical investigation. Network integration is necessary to boost the performances of gene prioritization methods. Moreover the methods based on kernelized score functions can further enhance disease gene ranking results, by adopting both local and global learning strategies, able to exploit the overall topology of the network. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
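Random walk with restart, one of the baseline algorithms named above, has a compact iterative form: the probability vector is repeatedly pushed through a column-normalized adjacency matrix and mixed back toward the seed (disease-associated) genes with a restart probability. The Python sketch below runs that iteration on a toy network; the kernelized score functions and weighted network integration evaluated in the paper are not reproduced.

```python
import numpy as np

def random_walk_with_restart(adj, seeds, restart=0.3, tol=1e-8, max_iter=1000):
    """Steady-state visiting probabilities; higher = stronger association with the seeds."""
    # Column-normalize the adjacency matrix so each column sums to 1.
    col_sums = adj.sum(axis=0)
    w = adj / np.where(col_sums == 0, 1.0, col_sums)
    p0 = np.zeros(adj.shape[0])
    p0[seeds] = 1.0 / len(seeds)
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1.0 - restart) * (w @ p) + restart * p0
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next
    return p

# Toy 5-gene network (symmetric, unweighted) with gene 0 as the known disease gene.
adj = np.array([[0, 1, 1, 0, 0],
                [1, 0, 1, 0, 0],
                [1, 1, 0, 1, 0],
                [0, 0, 1, 0, 1],
                [0, 0, 0, 1, 0]], dtype=float)
scores = random_walk_with_restart(adj, seeds=[0])
print(np.argsort(-scores))  # candidate genes ranked by proximity to the seed
```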
High-Throughput/High-Content Screening Assays with Engineered Nanomaterials in ToxCast
High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...
Current protocols for developmental neurotoxicity testing are insufficient to test thousands of commercial chemicals. Thus, development of highthroughput screens (HTS) to detect and prioritize chemicals that may cause developmental neurotoxicity is needed to improve protection of...
Zhou, Xiaoying; Schoenung, Julie M
2009-12-15
There are two quantitative indicators that are most widely used to assess the extent of compliance of industrial facilities with environmental regulations: the quantity of hazardous waste generated and the amount of toxics released. These indicators, albeit useful in terms of some environmental monitoring, fail to account for direct or indirect effects on human and environmental health, especially when aggregating total quantity of releases for a facility or industry sector. Thus, there is a need for a more comprehensive approach that can prioritize a particular chemical (or industry sector) on the basis of its relevant environmental performance and impact on human health. Accordingly, the objective of the present study is to formulate an aggregation of tools that can simultaneously capture multiple effects and several environmental impact categories. This approach allows us to compare and combine results generated with the aid of select U.S.-based quantitative impact assessment tools, thereby supplementing compliance-based metrics such as data from the U.S. Toxic Release Inventory. A case study, which presents findings for the U.S. chemical manufacturing industry, is presented to illustrate the aggregation of these tools. Environmental impacts due to both upstream and manufacturing activities are also evaluated for each industry sector. The proposed combinatorial analysis allows for a more robust evaluation for rating and prioritizing the environmental impacts of industrial waste.
Hammond, Davyda; Conlon, Kathryn; Barzyk, Timothy; Chahine, Teresa; Zartarian, Valerie; Schultz, Brad
2011-03-01
Communities are concerned over pollution levels and seek methods to systematically identify and prioritize the environmental stressors in their communities. Geographic information system (GIS) maps of environmental information can be useful tools for communities in their assessment of environmental-pollution-related risks. Databases and mapping tools that supply community-level estimates of ambient concentrations of hazardous pollutants, risk, and potential health impacts can provide relevant information for communities to understand, identify, and prioritize potential exposures and risk from multiple sources. An assessment of existing databases and mapping tools was conducted as part of this study to explore the utility of publicly available databases, and three of these databases were selected for use in a community-level GIS mapping application. Queried data from the U.S. EPA's National-Scale Air Toxics Assessment, Air Quality System, and National Emissions Inventory were mapped at the appropriate spatial and temporal resolutions for identifying risks of exposure to air pollutants in two communities. The maps combine monitored and model-simulated pollutant and health risk estimates, along with local survey results, to assist communities with the identification of potential exposure sources and pollution hot spots. Findings from this case study analysis will provide information to advance the development of new tools to assist communities with environmental risk assessments and hazard prioritization. © 2010 Society for Risk Analysis.
Stuart, Robyn M; Kerr, Cliff C; Haghparast-Bidgoli, Hassan; Estill, Janne; Grobicki, Laura; Baranczuk, Zofia; Prieto, Lorena; Montañez, Vilma; Reporter, Iyanoosh; Gray, Richard T; Skordis-Worrall, Jolene; Keiser, Olivia; Cheikh, Nejma; Boonto, Krittayawan; Osornprasop, Sutayut; Lavadenz, Fernando; Benedikt, Clemens J; Martin-Hughes, Rowan; Hussain, S Azfar; Kelly, Sherrie L; Kedziora, David J; Wilson, David P
2017-01-01
Prioritizing investments across health interventions is complicated by the nonlinear relationship between intervention coverage and epidemiological outcomes. It can be difficult for countries to know which interventions to prioritize for greatest epidemiological impact, particularly when budgets are uncertain. We examined four case studies of HIV epidemics in diverse settings, each with different characteristics. These case studies were based on public data available for Belarus, Peru, Togo, and Myanmar. The Optima HIV model and software package was used to estimate the optimal distribution of resources across interventions associated with a range of budget envelopes. We constructed "investment staircases", a useful tool for understanding investment priorities. These were used to estimate the best attainable cost-effectiveness of the response at each investment level. We find that when budgets are very limited, the optimal HIV response consists of a smaller number of 'core' interventions. As budgets increase, those core interventions should first be scaled up, and then new interventions introduced. We estimate that the cost-effectiveness of HIV programming decreases as investment levels increase, but that the overall cost-effectiveness remains below GDP per capita. It is important for HIV programming to respond effectively to the overall level of funding availability. The analytic tools presented here can help program planners understand the most cost-effective HIV responses and plan for an uncertain future.
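An "investment staircase" of the kind described above can be read as asking, at each budget level, which mix of interventions the optimizer selects. The Python sketch below uses a much cruder stand-in, greedy allocation by cost-effectiveness with fixed per-intervention costs and impacts, purely to illustrate how core interventions enter first as the budget grows; the Optima HIV model's nonlinear cost-coverage-outcome relationships are not reproduced, and all numbers are invented.

```python
# Greedy illustration: at each budget level, fund interventions in order of
# cost-effectiveness (infections averted per dollar) until the budget is spent.
interventions = [
    ("condom_distribution", 200_000.0, 900.0),   # (name, full cost, infections averted)
    ("hiv_testing", 350_000.0, 1_100.0),
    ("early_treatment", 900_000.0, 2_400.0),
    ("prep_key_populations", 500_000.0, 800.0),
]
ranked = sorted(interventions, key=lambda x: x[2] / x[1], reverse=True)

for budget in (250_000.0, 750_000.0, 1_500_000.0, 2_000_000.0):
    remaining, funded = budget, []
    for name, cost, _ in ranked:
        if cost <= remaining:
            funded.append(name)
            remaining -= cost
    print(f"budget {budget:>11,.0f}: {funded}")
```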
Heiger-Bernays, Wendy J; Wegner, Susanna; Dix, David J
2018-01-16
The presence of industrial chemicals, consumer product chemicals, and pharmaceuticals is well documented in waters in the U.S. and globally. Most of these chemicals lack health-protective guidelines and many have been shown to have endocrine bioactivity. There is currently no systematic or national prioritization for monitoring waters for chemicals with endocrine disrupting activity. We propose ambient water bioactivity concentrations (AWBCs) generated from high throughput data as a health-based screen for endocrine bioactivity of chemicals in water. The U.S. EPA ToxCast program has screened over 1800 chemicals for estrogen receptor (ER) and androgen receptor (AR) pathway bioactivity. AWBCs are calculated for 110 ER and 212 AR bioactive chemicals using high throughput ToxCast data from in vitro screening assays and predictive pathway models, high-throughput toxicokinetic data, and data-driven assumptions about consumption of water. Chemical-specific AWBCs are compared with measured water concentrations in data sets from the greater Denver area, Minnesota lakes, and Oregon waters, demonstrating a framework for identifying endocrine bioactive chemicals. This approach can be used to screen potential cumulative endocrine activity in drinking water and to inform prioritization of future monitoring, chemical testing and pollution prevention efforts.
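The exact AWBC derivation is not given in this abstract; the Python sketch below is only a plausible reconstruction of the chain it describes, reverse toxicokinetics from an in vitro bioactive concentration to an oral-equivalent dose, spread over assumed daily drinking-water consumption. The function, parameter names, and default values are assumptions for illustration, not the authors' published formula.

```python
def ambient_water_bioactivity_conc(bioactive_conc_uM: float,
                                   css_uM_per_mg_kg_day: float,
                                   body_weight_kg: float = 80.0,
                                   water_intake_l_per_day: float = 2.0) -> float:
    """Hypothetical AWBC-style screening value in mg/L.

    bioactive_conc_uM     : lowest in vitro bioactive concentration (e.g., from ToxCast)
    css_uM_per_mg_kg_day  : steady-state plasma concentration per unit oral dose
                            (from high-throughput toxicokinetics)
    Reverse dosimetry converts the in vitro concentration to an oral-equivalent
    dose, which is then spread over the assumed daily drinking-water intake.
    """
    oral_equivalent_mg_kg_day = bioactive_conc_uM / css_uM_per_mg_kg_day
    return oral_equivalent_mg_kg_day * body_weight_kg / water_intake_l_per_day

# Invented example values for a single chemical.
print(f"{ambient_water_bioactivity_conc(0.5, 2.0):.3f} mg/L")
```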
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernández, Alberto; Rallo, Robert; Giralt, Francesc
2015-10-15
Ready biodegradability is a key property for evaluating the long-term effects of chemicals on the environment and human health. As such, it is used as a screening test for the assessment of persistent, bioaccumulative and toxic substances. Regulators encourage the use of non-testing methods, such as in silico models, to save money and time. A dataset of 757 chemicals was collected to assess the performance of four freely available in silico models that predict ready biodegradability. They were applied to develop a new consensus method that prioritizes the use of each individual model according to its performance on chemical subsets driven by the presence or absence of different molecular descriptors. This consensus method was capable of almost eliminating unpredictable chemicals, while the performance of combined models was substantially improved with respect to that of the individual models. - Highlights: • Consensus method to predict ready biodegradability by prioritizing multiple QSARs. • Consensus reduced the amount of unpredictable chemicals to less than 2%. • Performance increased with the number of QSAR models considered. • The absence of 2D atom pairs contributed significantly to the consensus model.
Cameron, David S; Bertenshaw, Emma J; Sheeran, Paschal
2018-02-01
The present research tested whether incidental positive affect promotes pursuit of physical activity goals. Four key features of goal pursuit were examined - setting physical activity goals (Study 1), goal activation (Study 2), and goal prioritization and goal attainment (Study 3). Participants (Ns = 80, 81, and 59 in Studies 1-3, respectively) were randomized to positive affect (joy, hope) or neutral affect (control) conditions in each study. Questionnaire measures assessed goal level, goal commitment, and means selection (Study 1); a lexical decision task indexed goal activation (Study 2); and a choice task captured goal prioritization while MET minutes quantified goal attainment (Study 3). Study 1 showed that positive affect led to a greater number of intended physical activities, and that joy engendered greater willingness to try activities. In Study 2, a positive affect induction led to heightened activation of the physical activity goal compared to the control condition. The joy induction in Study 3 led to greater physical activity, and a trend towards greater goal prioritization. These findings suggest that positive affect enhances the pursuit of physical activity goals. Implications for health behavior theories and interventions are outlined.
The effects of resistance training prioritization in NCAA Division I Football summer training.
Smith, Robert A; Martin, Gerard J; Szivak, Tunde K; Comstock, Brett A; Dunn-Lewis, Courtenay; Hooper, David R; Flanagan, Shawn D; Looney, David P; Volek, Jeff S; Maresh, Carl M; Kraemer, William J
2014-01-01
Resistance training (RT) is an integral part of National Collegiate Athletic Association (NCAA) Division I Football performance programs. In the sport of football, there are several components that a strength and conditioning coach must be aware of. These include body mass, size, strength, power, speed, conditioning, and injury prevention, among others. The purpose of this study was to investigate if the RT component of a performance program could be prioritized for specific results using a nonlinear training model, grouping athletes by eligibility year. The NCAA Division I football student athletes were placed into 3 separate groups based on the playing year. All subjects participated in a 10-week, 4 days·week-1 off-season summer resistance training program. The training of group 1 (n = 20, age: 18.95 ± 0.76 years, height: 186.63 ± 7.21 cm, body mass: 97.66 ± 18.17 kg, playing year: 1.05 ± 0.22 years) prioritized hypertrophy-based RT to gain body mass. The training of group 2 (n = 20, age: 20.05 ± 1.05 years, height: 189.42 ± 5.49 cm, body mass: 106.99 ± 13.53 kg, and playing year: 2.35 ± 0.75 years) prioritized strength-based RT to gain strength. The training of group 3 (n = 20, age: 21.05 ± 1.10 years, height: 186.56 ± 6.73 cm, body mass: 109.8 ± 19.96 kg, playing year: 4.4 ± 0.50 years) prioritized power-based RT to gain power. Performance tests were evaluated during the first weeks of March (Spring) and August (Fall). The test measures included body mass (kilograms), 1-repetition maximum (1RM) bench press (kilograms), 1RM back squat (kilograms), 1RM power clean (kilograms), and countermovement vertical jump (CMVJ) height (centimeters). The primary findings of this investigation were as follows: group 1 saw significant increases in bench press maximum, back squat maximum, and power clean maximum (p ≤ 0.05). Group 2 saw significant increases in bench press maximum, back squat maximum, and power clean maximum (p ≤ 0.05). Group 3 saw a significant increase in power clean maximum (p ≤ 0.05). Group 1's significant increases were expected because of their low training age (relatively shorter training history) when compared with Groups 2 and 3. Group 1 did not see significant increases in body mass, with 7 out of 20 subjects being nonresponders. Group 2 and 3's significant increases were expected. Unexpectedly, no group saw significant increases in maximum CMVJ height. With so many factors contributing to a football performance program, it seems difficult to prioritize 1 RT goal over another without neglecting others during a 10-week summer training program. Prioritization of strength appears to have the best overall effect on the RT portion of an off-season football performance program. Nonlinear periodization allows for the prioritization of 1 training goal without disregarding others, with a smaller risk of neglecting other important components. This investigation showed that a performance program with a nonlinear model and prioritization of strength produced the most desirable results.
Incorporating Human Dosimetry and Exposure into High-Throughput In Vitro Toxicity Screening
Many chemicals in commerce today have undergone limited or no safety testing. To reduce the number of untested chemicals and prioritize limited testing resources, several governmental programs are using high-throughput in vitro screens for assessing chemical effects across multip...
Caldwell, Daniel J; Mastrocco, Frank; Margiotta-Casaluci, Luigi; Brooks, Bryan W
2014-11-01
Numerous active pharmaceutical ingredients (APIs), approved prior to enactment of detailed environmental risk assessment (ERA) guidance in the EU in 2006, have been detected in surface waters as a result of advancements in analytical technologies. Without adequate knowledge of the potential hazards these APIs may pose, assessing their environmental risk is challenging. As it would be impractical to commence hazard characterization and ERA en masse, several approaches to prioritizing substances for further attention have been published. Here, through the combination of three presentations given at a recent conference, "Pharmaceuticals in the Environment, Is there a problem?" (Nîmes, France, June 2013) we review several of these approaches, identify salient components, and present available techniques and tools that could facilitate a pragmatic, scientifically sound approach to prioritizing APIs for advanced study or ERA and, where warranted, fill critical data gaps through targeted, intelligent testing. We further present a modest proposal to facilitate future prioritization efforts and advanced research studies that incorporates mammalian pharmacology data (e.g., adverse outcomes pathways and the fish plasma model) and modeled exposure data based on pharmaceutical use. Copyright © 2014 Elsevier Ltd. All rights reserved.
A case study evaluation of the use of video technology in concrete pavement evaluation.
DOT National Transportation Integrated Search
2000-01-01
This report presents the results of an evaluation of video technology as a possible solution to the problem of safely collecting objective condition data for prioritizing concrete pavement rehabilitation needs in Virginia. The study involved the eval...
Emergency nursing management of the multiple trauma patient.
Kosmos, C A
1989-01-01
This case study reinforces key principles in caring for multiply injured trauma victims. The Primary Survey is a tool developed to allow those caring for trauma patients to prioritize injuries. Those injuries identified in the Primary Survey will be the most life threatening.
Hawkins, Kenneth R; Cantera, Jason L; Storey, Helen L; Leader, Brandon T; de Los Santos, Tala
2016-12-01
Global efforts to address schistosomiasis and soil-transmitted helminthiases (STH) include deworming programs for school-aged children that are made possible by large-scale drug donations. Decisions on these mass drug administration (MDA) programs currently rely on microscopic examination of clinical specimens to determine the presence of parasite eggs. However, microscopy-based methods are not sensitive to the low-intensity infections that characterize populations that have undergone MDA. Thus, there has been increasing recognition within the schistosomiasis and STH communities of the need for improved diagnostic tools to support late-stage control program decisions, such as when to stop or reduce MDA. Failure to adequately address the need for new diagnostics could jeopardize achievement of the 2020 London Declaration goals. In this report, we assess diagnostic needs and landscape potential solutions and determine appropriate strategies to improve diagnostic testing to support control and elimination programs. Based upon literature reviews and previous input from experts in the schistosomiasis and STH communities, we prioritized two diagnostic use cases for further exploration: to inform MDA-stopping decisions and post-MDA surveillance. To this end, PATH has refined target product profiles (TPPs) for schistosomiasis and STH diagnostics that are applicable to these use cases. We evaluated the limitations of current diagnostic methods with regards to these use cases and identified candidate biomarkers and diagnostics with potential application as new tools. Based on this analysis, there is a need to develop antigen-detecting rapid diagnostic tests (RDTs) with simplified, field-deployable sample preparation for schistosomiasis. Additionally, there is a need for diagnostic tests that are more sensitive than the current methods for STH, which may include either a field-deployable molecular test or a simple, low-cost, rapid antigen-detecting test.
[Strategy for molecular testing in pulmonary carcinoma].
Penault-Llorca, Frédérique; Tixier, Lucie; Perrot, Loïc; Cayre, Anne
2016-01-01
Nowadays, the analysis of theranostic molecular markers is central in the management of lung cancer. As these tumors are diagnosed at an advanced stage in two thirds of cases, molecular screening is frequently performed on "small samples". The screening strategy starts with an accurate histopathological characterization, including on biopsies or cytological specimens. WHO 2015 provided a new classification for small biopsies and cytology, defining categories such as non-small cell carcinoma (NSCC), favor adenocarcinoma (TTF1 positive), or favor squamous cell carcinoma (p40 positive). Only NSCC tumors, non-squamous, are eligible for molecular testing. A strategy aiming at tissue sparing for small biopsies has to be organized. Tests corresponding to available drugs are prioritized. Blank slides will be prepared for immunohistochemistry and in situ hybridization based tests such as ALK. DNA will then be extracted for the other tests, EGFR mutation screening first, with or without KRAS. Then, the emerging biomarkers (HER2, ROS1, RET, BRAF…), as well as potentially other markers in the case of clinical trials, can be tested. The spread of next generation sequencing technologies, with a very sensitive all-in-one approach, will allow the identification of minority clones. Eventually, the development of liquid biopsies will provide the opportunity to monitor the emergence of resistance clones during treatment. This non-invasive approach allows patients with a contraindication to biopsy, or with non-informative biopsies, to access molecular screening. Copyright © 2016. Published by Elsevier Masson SAS.
Burns, Emily E; Thomas-Oates, Jane; Kolpin, Dana W; Furlong, Edward T; Boxall, Alistair B A
2017-10-01
Prioritization methodologies are often used for identifying those pharmaceuticals that pose the greatest risk to the natural environment and to focus laboratory testing or environmental monitoring toward pharmaceuticals of greatest concern. Risk-based prioritization approaches, employing models to derive exposure concentrations, are commonly used, but the reliability of these models is unclear. The present study evaluated the accuracy of exposure models commonly used for pharmaceutical prioritization. Targeted monitoring was conducted for 95 pharmaceuticals in the Rivers Foss and Ouse in the City of York (UK). Predicted environmental concentration (PEC) ranges were estimated based on localized prescription, hydrological data, reported metabolism, and wastewater treatment plant (WWTP) removal rates, and were compared with measured environmental concentrations (MECs). For the River Foss, PECs, obtained using highest metabolism and lowest WWTP removal, were similar to MECs. In contrast, this trend was not observed for the River Ouse, possibly because of pharmaceutical inputs unaccounted for by our modeling. Pharmaceuticals were ranked by risk based on either MECs or PECs. With 2 exceptions (dextromethorphan and diphenhydramine), risk ranking based on both MECs and PECs produced similar results in the River Foss. Overall, these findings indicate that PECs may well be appropriate for prioritization of pharmaceuticals in the environment when robust and local data on the system of interest are available and reflective of most source inputs. Environ Toxicol Chem 2017;36:2823-2832. © 2017 SETAC.
Nanomaterial (NM) bioactivity profiling by ToxCast high-throughput screening (HTS)
Rapidly increasing numbers of new NMs and their uses demand efficient tests of NM bioactivity for safety assessment. The EPA’s ToxCast program uses HTS assays to prioritize for targeted testing, identify biological pathways affected, and aid in linking NM properties and potential...
In vitro models may be useful for the rapid toxicological screening of large numbers of chemicals for their potential to produce toxicity. Such screening could facilitate prioritization of resources needed for in vivo toxicity testing towards those chemicals most likely to resul...
Modeling Reproductive Toxicity for Chemical Prioritization into an Integrated Testing Strategy
The EPA ToxCast research program uses a high-throughput screening (HTS) approach for predicting the toxicity of large numbers of chemicals. Phase-I tested 309 well-characterized chemicals in over 500 assays of different molecular targets, cellular responses and cell-states. Of th...
Kaushik, Abhinav; Ali, Shakir; Gupta, Dinesh
2017-01-01
Gene connection rewiring is an essential feature of gene network dynamics. Apart from its normal functional role, it may also lead to dysregulated functional states by disturbing pathway homeostasis. Very few computational tools measure rewiring within gene co-expression and its corresponding regulatory networks in order to identify and prioritize altered pathways which may or may not be differentially regulated. We have developed Altered Pathway Analyzer (APA), a microarray dataset analysis tool for identification and prioritization of altered pathways, including those which are differentially regulated by TFs, by quantifying rewired sub-network topology. Moreover, APA also helps in re-prioritization of APA shortlisted altered pathways enriched with context-specific genes. We performed APA analysis of simulated datasets and p53 status NCI-60 cell line microarray data to demonstrate potential of APA for identification of several case-specific altered pathways. APA analysis reveals several altered pathways not detected by other tools evaluated by us. APA analysis of unrelated prostate cancer datasets identifies sample-specific as well as conserved altered biological processes, mainly associated with lipid metabolism, cellular differentiation and proliferation. APA is designed as a cross platform tool which may be transparently customized to perform pathway analysis in different gene expression datasets. APA is freely available at http://bioinfo.icgeb.res.in/APA. PMID:28084397
Duong, Veasna; Tarantola, Arnaud; Ong, Sivuth; Mey, Channa; Choeung, Rithy; Ly, Sowath; Bourhy, Hervé; Dussart, Philippe; Buchy, Philippe
2016-05-01
The diagnosis of dog-mediated rabies in humans and animals has greatly benefited from technical advances in the laboratory setting. Approaches to diagnosis now include the detection of rabies virus (RABV), RABV RNA, or RABV antigens. These assays are important tools in the current efforts aimed at the global elimination of dog-mediated rabies. The assays available for use in laboratories are reviewed herein, as well as their strengths and weaknesses, which vary with the types of sample analyzed. Depending on the setting, however, the public health objectives and use of RABV diagnosis in the field will also vary. In non-endemic settings, the detection of all introduced or emergent animal or human cases justifies exhaustive testing. In dog RABV-endemic settings, such as rural areas of developing countries where most cases occur, the availability of or access to testing may be severely constrained. Thus, these issues are also discussed along with a proposed strategy to prioritize testing while access to rabies testing in the resource-poor, highly endemic setting is improved. As the epidemiological situation of rabies in a country evolves, the strategy should shift from that of an endemic setting to one more suitable for a decreased rabies incidence following the implementation of efficient control measures and when nearing the target of dog-mediated rabies elimination. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.
Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
Measuring Up: Standardized Testing and the Making of Postwar American Identities, 1940-2001
ERIC Educational Resources Information Center
Shepherd, Keegan J.
2017-01-01
Standardized testing is a defining feature of contemporary American society. It not only governs how people are channeled through their schooling; it amplifies existing social disparities. Nonetheless, standardized testing endures, namely because it has served as a vital tool for the post-1945 American state. The postwar state prioritized, on the…
Prioritizing and optimizing sustainable measures for food waste prevention and management.
Cristóbal, Jorge; Castellani, Valentina; Manfredi, Simone; Sala, Serenella
2018-02-01
Food waste has gained prominence in the European political debate thanks to the recent Circular Economy package. Currently the waste hierarchy, introduced by the Waste Framework Directive, has been the rule followed to prioritize food waste prevention and management measures according to environmental criteria. But when considering other criteria along with the environmental one, such as the economic, other tools are needed for prioritization and optimization. This paper addresses the situation in which a decision-maker has to design a food waste prevention programme considering the limited economic resources in order to achieve the highest environmental impact prevention along the whole food life cycle. A methodology using Life Cycle Assessment and mathematical programming is proposed and its capabilities are shown through a case study. Results show that the order established in the waste hierarchy is generally followed. The proposed methodology revealed to be especially helpful in identifying "quick wins" - measures that should always be prioritized since they avoid a high environmental impact at a low cost. Besides, in order to aggregate the environmental scores related to a variety of impact categories, different weighting sets were proposed. In general, results show that the relevance of the weighting set in the prioritization of the measures appears to be limited. Finally, the correlation between reducing food waste generation and reducing environmental impact along the Food Supply Chain has been studied. Results highlight that when planning food waste prevention strategies, it is important to set the targets at the level of environmental impact instead of setting the targets at the level of avoided food waste generation (in mass). Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
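The prioritization problem described above, selecting prevention measures that maximize avoided life-cycle impact within a fixed budget, has the structure of a 0/1 knapsack problem. The Python sketch below solves a toy instance by brute force over measure subsets; the measure names, costs, and weighted LCA scores are placeholders rather than the study's data.

```python
from itertools import combinations

# Hypothetical measures: (name, cost in EUR, avoided impact in weighted LCA points).
measures = [
    ("awareness_campaign", 50_000.0, 120.0),
    ("date_label_redesign", 20_000.0, 90.0),
    ("redistribution_scheme", 80_000.0, 200.0),
    ("smart_logistics", 60_000.0, 110.0),
]
budget = 100_000.0

best_subset, best_impact = (), 0.0
for r in range(1, len(measures) + 1):
    for subset in combinations(measures, r):
        cost = sum(m[1] for m in subset)
        impact = sum(m[2] for m in subset)
        if cost <= budget and impact > best_impact:
            best_subset, best_impact = subset, impact

print("selected:", [m[0] for m in best_subset], "avoided impact:", best_impact)
```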
Kareksela, Santtu; Moilanen, Atte; Ristaniemi, Olli; Välivaara, Reima; Kotiaho, Janne S
2018-02-01
The frequently discussed gap between conservation science and practice is manifest in the gap between spatial conservation prioritization plans and their implementation. We analyzed the research-implementation gap of one zoning case by comparing results of a spatial prioritization analysis aimed at avoiding ecological impact of peat mining in a regional zoning process with the final zoning plan. We examined the relatively complex planning process to determine the gaps among research, zoning, and decision making. We quantified the ecological costs of the differing trade-offs between ecological and socioeconomic factors included in the different zoning suggestions by comparing the landscape-level loss of ecological features (species occurrences, habitat area, etc.) between the different solutions for spatial allocation of peat mining. We also discussed with the scientists and planners the reasons for differing zoning suggestions. The implemented plan differed from the scientists' suggestion in that its focus was individual ecological features rather than all the ecological features for which there were data; planners and decision makers considered effects of peat mining on areas not included in the prioritization analysis; zoning was not truly treated as a resource-allocation process and, in general, did not emphasize minimizing ecological losses while satisfying economic needs (peat-mining potential); and decision makers based their prioritization of sites on site-level information showing high ecological value and on single legislative factors instead of finding a cost-effective landscape-level solution. We believe that if the zoning and decision-making processes are very complex, then the usefulness of science-based prioritization tools is likely to be reduced. Nevertheless, we found that high-end tools were useful in clearly exposing trade-offs between conservation and resource utilization. © 2017 Society for Conservation Biology.
Hollin, Ilene L; Peay, Holly; Fischer, Ryan; Janssen, Ellen M; Bridges, John F P
2018-05-26
Patient preference information (PPI) has an increasing role in regulatory decision-making, especially in benefit-risk assessment. PPI can also facilitate prioritization of symptoms to treat and inform meaningful selection of clinical trial endpoints. We engaged patients and caregivers to prioritize symptoms of Duchenne and Becker muscular dystrophy (DBMD) and explored preference heterogeneity. Best-worst scaling (object case) was used to assess priorities across 11 symptoms of DBMD that impact quality of life and for which there is unmet need. Respondents selected the most and least important symptoms to treat among a subset of five. Relative importance scores were estimated for each symptom, and preference heterogeneity was identified using mixed logit and latent class analysis. Respondents included patients (n = 59) and caregivers (n = 96) affected by DBMD. Results indicated that respondents prioritized "weaker heart pumping" [score = 5.13; 95% CI (4.67, 5.59)] and pulmonary symptoms: "lung infections" [3.15; (2.80, 3.50)] and "weaker ability to cough" [2.65; (2.33, 2.97)] as the most important symptoms to treat, and "poor attention span" as the least important symptom to treat [-5.23; (-5.93, -4.54)]. Statistically significant preference heterogeneity existed (p value < 0.001). At least two classes existed with different priorities. Priorities of the majority latent class (80%) reflected the aggregate results, whereas the minority latent class (20%) did not distinguish between pulmonary and other symptoms. Estimates of the relative importance for symptoms of Duchenne muscular dystrophy indicated that symptoms with direct links to morbidity and mortality were prioritized above other non-skeletal muscle symptoms. Findings suggested the existence of preference heterogeneity for symptoms, which may be related to symptom experience.
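As context for the best-worst scaling design described above, the snippet below sketches the simplest count-based scoring for the object case: a symptom's score is the number of times it was chosen "best" minus the number of times it was chosen "worst", divided by how often it was shown. This is a didactic stand-in, not the mixed logit or latent class estimation used in the study; the symptom labels and responses are invented.

```python
# Count-based best-worst scaling (object case). Each task shows a subset of
# symptoms; the respondent marks the most and least important to treat.
from collections import defaultdict

tasks = [  # (symptoms shown, chosen best, chosen worst) for one respondent
    (["weak heart", "weak cough", "poor attention", "fatigue", "lung infections"],
     "weak heart", "poor attention"),
    (["lung infections", "poor attention", "mobility", "weak heart", "pain"],
     "weak heart", "poor attention"),
    (["weak cough", "pain", "fatigue", "lung infections", "mobility"],
     "lung infections", "pain"),
]

best = defaultdict(int)
worst = defaultdict(int)
shown = defaultdict(int)
for symptoms, b, w in tasks:
    for s in symptoms:
        shown[s] += 1
    best[b] += 1
    worst[w] += 1

# Standardized best-minus-worst score per symptom (higher = higher priority).
scores = {s: (best[s] - worst[s]) / shown[s] for s in shown}
for symptom, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{symptom:15s} {score:+.2f}")
```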
Planning and Assessing To Improve Campus-Community Engagement.
ERIC Educational Resources Information Center
Bringle, Robert G.; Hatcher, Julie; Hamilton, Sharon; Young, Peter
2001-01-01
Presents two methods for assessing the scholarship of engagement at the institutional level: (1) Comprehensive Assessment of the Scholarship of Engagement (CASE), a systematic method that compiles information about service learning and community engagement, identifies campus strengths, and prioritizes planning areas; and (2) an institutional…
DOT National Transportation Integrated Search
2011-09-01
"As is the case for most of the Departments of Transportation in the U.S., the Texas Department of : Transportation has been experiencing fluctuations of budget for maintaining and preserving its highway : infrastructure over the recent years. If the...
3-D QSARS FOR RANKING AND PRIORITIZATION OF LARGE CHEMICAL DATASETS: AN EDC CASE STUDY
The COmmon REactivity Pattern (COREPA) approach is a three-dimensional structure-activity (3-D QSAR) technique that permits identification and quantification of specific global and local stereoelectronic characteristics associated with a chemical's biological activity. It goes bey...
Judson, Richard S; Houck, Keith A; Kavlock, Robert J; Knudsen, Thomas B; Martin, Matthew T; Mortensen, Holly M; Reif, David M; Rotroff, Daniel M; Shah, Imran; Richard, Ann M; Dix, David J
2010-04-01
Chemical toxicity testing is being transformed by advances in biology and computer modeling, concerns over animal use, and the thousands of environmental chemicals lacking toxicity data. The U.S. Environmental Protection Agency's ToxCast program aims to address these concerns by screening and prioritizing chemicals for potential human toxicity using in vitro assays and in silico approaches. This project aims to evaluate the use of in vitro assays for understanding the types of molecular and pathway perturbations caused by environmental chemicals and to build initial prioritization models of in vivo toxicity. We tested 309 mostly pesticide active chemicals in 467 assays across nine technologies, including high-throughput cell-free assays and cell-based assays, in multiple human primary cells and cell lines plus rat primary hepatocytes. Both individual and composite scores for effects on genes and pathways were analyzed. Chemicals displayed a broad spectrum of activity at the molecular and pathway levels. We saw many expected interactions, including endocrine and xenobiotic metabolism enzyme activity. Chemicals ranged in promiscuity across pathways, from no activity to affecting dozens of pathways. We found a statistically significant inverse association between the number of pathways perturbed by a chemical at low in vitro concentrations and the lowest in vivo dose at which a chemical causes toxicity. We also found associations between a small set of in vitro assays and rodent liver lesion formation. This approach promises to provide meaningful data on the thousands of untested environmental chemicals and to guide targeted testing of environmental contaminants.
Targeted exome sequencing of suspected mitochondrial disorders
Lieber, Daniel S.; Calvo, Sarah E.; Shanahan, Kristy; Slate, Nancy G.; Liu, Shangtao; Hershman, Steven G.; Gold, Nina B.; Chapman, Brad A.; Thorburn, David R.; Berry, Gerard T.; Schmahmann, Jeremy D.; Borowsky, Mark L.; Mueller, David M.; Sims, Katherine B.
2013-01-01
Objective: To evaluate the utility of targeted exome sequencing for the molecular diagnosis of mitochondrial disorders, which exhibit marked phenotypic and genetic heterogeneity. Methods: We considered a diverse set of 102 patients with suspected mitochondrial disorders based on clinical, biochemical, and/or molecular findings, and whose disease ranged from mild to severe, with varying age at onset. We sequenced the mitochondrial genome (mtDNA) and the exons of 1,598 nuclear-encoded genes implicated in mitochondrial biology, mitochondrial disease, or monogenic disorders with phenotypic overlap. We prioritized variants likely to underlie disease and established molecular diagnoses in accordance with current clinical genetic guidelines. Results: Targeted exome sequencing yielded molecular diagnoses in established disease loci in 22% of cases, including 17 of 18 (94%) with prior molecular diagnoses and 5 of 84 (6%) without. The 5 new diagnoses implicated 2 genes associated with canonical mitochondrial disorders (NDUFV1, POLG2), and 3 genes known to underlie other neurologic disorders (DPYD, KARS, WFS1), underscoring the phenotypic and biochemical overlap with other inborn errors. We prioritized variants in an additional 26 patients, including recessive, X-linked, and mtDNA variants that were enriched 2-fold over background and await further support of pathogenicity. In one case, we modeled patient mutations in yeast to provide evidence that recessive mutations in ATP5A1 can underlie combined respiratory chain deficiency. Conclusion: The results demonstrate that targeted exome sequencing is an effective alternative to the sequential testing of mtDNA and individual nuclear genes as part of the investigation of mitochondrial disease. Our study underscores the ongoing challenge of variant interpretation in the clinical setting. PMID:23596069
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horowitz, Scott; Maguire, Jeff; Tabares-Velasco, Paulo Cesar
2016-08-01
This multiphase study involved comprehensive comparative testing of EnergyPlus and SEEM to determine the differences in energy consumption predictions between these two programs and to reconcile prioritized discrepancies through bug fixes, modeling improvements, and/or consistent inputs and assumptions.
The U.S. Environmental Protection Agency is evaluating methods to screen and prioritize organophosphorus pesticides for neurotoxicity using behavioral tests in an in vivo, vertebrate, medium-throughput model (zebrafish; Danio rerio). Our behavioral testing paradigm assesses the e...
Find relationships between bioactivities and NM characteristics or testing conditions. Recommend a dose metric for NMs in vitro studies. Establish associations to in vivo toxicity or pathways identified from testing of conventional chemicals with ToxCast HTS methods. May be abl...
Limited Aspects of Reality: Frames of Reference in Language Assessment
ERIC Educational Resources Information Center
Fulcher, Glenn; Svalberg, Agneta
2013-01-01
Language testers operate within two frames of reference: norm-referenced (NRT) and criterion-referenced testing (CRT). The former underpins the world of large-scale standardized testing that prioritizes variability and comparison. The latter supports substantive score meaning in formative and domain specific assessment. Some claim that the…
Chemical toxicity testing is being transformed by advances in biology and computer modeling, concerns over animal use, and the thousands of environmental chemicals lacking toxicity data. The U.S. Environmental Protection Agency’s ToxCast program aims to address these concerns by ...
Thousands of environmental chemicals are subject to regulatory review for their potential to be endocrine disruptors (ED). In vitro high-throughput screening (HTS) assays have emerged as a potential tool for prioritizing chemicals for ED-related whole-animal tests. In this study,...
ERIC Educational Resources Information Center
Tyson, Deonte Rashawn
2017-01-01
This multiple case study examined the methods by which school leaders determined and planned teacher professional development, as well as what teachers perceived as their professional development needs and how they believe school leaders take those needs into account. The study took place at two suburban elementary schools (1 traditional public, 1…
Weinberg, Justine Lew; Bunin, Lisa J; Das, Rupali
2009-01-01
In 2005, the California Department of Public Health, Occupational Health Branch (OHB) investigated an incident of pesticide exposure and identified 27 vineyard workers who became ill due to drift of cyfluthrin, a pesticide being applied to a neighboring orange field to control katydids. Another pest, citrus thrips, was also present in the field. We investigated safer alternatives for katydid and thrips control to prevent illness due to pesticide exposure and used the industrial hygiene hierarchy of controls to prioritize the control methods. OHB evaluated factors that contributed to pesticide exposure and identified safer alternatives by conducting literature reviews on katydid and thrips control, drift prevention technology, and other relevant topics, and by interviewing integrated pest management advisors, conventional and organic growers, equipment manufacturers, county agricultural commissioners, pest control advisors, regulatory agencies, and others. We prioritized methods using the industrial hygiene hierarchy of controls. We identified safer pest control practices that incorporated hazard elimination, chemical substitution, engineering controls, and administrative controls, including employer policies and government regulations.
Fuchs, C
2010-05-01
The German health care system will face major challenges in the near future. Progress in medicine and demographic change will combine to drastically exacerbate the scarcity of resources in the health care system. Scarcity in this case refers not only to the availability of funds; other resources, e.g., staff, attention, time, and organs for transplantation, are also becoming scarce. It is conceivable that, in the future, it will no longer be possible to provide medical services for all patients to the same extent as in the past. If the necessary resources are not available in the health care system, if the potential for saving resources has been more or less exhausted, and if rationing is not to be an option, the only remaining option will be prioritization. Prioritization in the health care sector denotes a supply of services according to specific, predetermined criteria. A broad and open public debate, which would have to be accompanied as well as moderated by the Health Council ("Gesundheitsrat"), is essential for determining such criteria.
21st century tools to prioritize contaminants for monitoring and ...
The webinar focused on ways that ToxCast high throughput screening data and the adverse outcome pathway framework, under development in the CSS program, can be used to prioritize environmental contaminants for monitoring and management. The work presented focused on case studies conducted in Region 8, in collaboration with EPA Region 8 and NEIC, as well as other federal (USGS, US FWS) and regional partners (Northern Colorado Plateau Network). The Consortium for Research and Education on Emerging Contaminants (CREEC) is a grass-roots 501(c)(3) non-profit organization comprised of world-class scientists and stakeholders with a shared interest in the source, fate, and physiological effects of contaminants of emerging concern (www.creec.net). As such, they represent an important group of stakeholders with an interest in applying the data, approaches, and tools that are being developed by the CSS program.
van Ede, Freek; Niklaus, Marcel; Nobre, Anna C
2017-01-11
Although working memory is generally considered a highly dynamic mnemonic store, popular laboratory tasks used to understand its psychological and neural mechanisms (such as change detection and continuous reproduction) often remain relatively "static," involving the retention of a set number of items throughout a shared delay interval. In the current study, we investigated visual working memory in a more dynamic setting, and assessed the following: (1) whether internally guided temporal expectations can dynamically and reversibly prioritize individual mnemonic items at specific times at which they are deemed most relevant; and (2) the neural substrates that support such dynamic prioritization. Participants encoded two differently colored oriented bars into visual working memory to retrieve the orientation of one bar with a precision judgment when subsequently probed. To test for the flexible temporal control to access and retrieve remembered items, we manipulated the probability for each of the two bars to be probed over time, and recorded EEG in healthy human volunteers. Temporal expectations had a profound influence on working memory performance, leading to faster access times as well as more accurate orientation reproductions for items that were probed at expected times. Furthermore, this dynamic prioritization was associated with the temporally specific attenuation of contralateral α (8-14 Hz) oscillations that, moreover, predicted working memory access times on a trial-by-trial basis. We conclude that attentional prioritization in working memory can be dynamically steered by internally guided temporal expectations, and is supported by the attenuation of α oscillations in task-relevant sensory brain areas. In dynamic, everyday-like, environments, flexible goal-directed behavior requires that mental representations that are kept in an active (working memory) store are dynamic, too. We investigated working memory in a more dynamic setting than is conventional, and demonstrate that expectations about when mnemonic items are most relevant can dynamically and reversibly prioritize these items in time. Moreover, we uncover a neural substrate of such dynamic prioritization in contralateral visual brain areas and show that this substrate predicts working memory retrieval times on a trial-by-trial basis. This places the experimental study of working memory, and its neuronal underpinnings, in a more dynamic and ecologically valid context, and provides new insights into the neural implementation of attentional prioritization within working memory. Copyright © 2017 van Ede et al.
Borrás-Blasco, Joaquín; Casterá, M Dolores-Elvira; Cortes, Xavier; Rosique-Robles, J Dolores; Abad, F Javier
2014-11-01
Until 2010, the cost of biological treatments for rheumatoid arthritis (RA) was increasing by 15% annually in our hospital. On 1 January 2011, a Hospital Commission of Biological Therapies involving the rheumatology and pharmacy services was created to improve the management of biological drugs, and a biological therapy prioritization protocol for RA patients was established to improve the efficient use of biological drugs in RA. To evaluate the economic impact associated with a biological therapy prioritization protocol for RA patients in the Hospital of Sagunto. Observational, ambispective study comparing the cost of RA patients treated with biological drugs in the pre-protocol (2009-2010) versus post-protocol (2011-2012) periods. RA patients treated with abatacept (ABA), adalimumab (ADA), etanercept (ETN) or infliximab (IFX) for at least 6 months during the study period (2009-2012) were included. In 2012, tocilizumab (TCZ) was also included in the prioritization protocol. The prioritization protocol was established on both clinical and economic grounds and supervised case by case by our Commission. Cost savings and economic impact were calculated using Spanish official prices. In the pre-protocol period (2009-2010), total expenses were increasing by €110,000 per year, reaching €1,761,000 in 2010 (€11,362 per patient-year). After protocol implementation, total expenses decreased by €53,676 in the 2010-2011 period and by €149,200 in the 2011-2012 period. In the 2010-2011 period, the cost of biological therapy per patient-year decreased by €355 (to €11,007 per patient-year) and by an additional €653 by 2012 (to €10,354 per patient-year), a cumulative effect of the protocol of €1,008 per patient-year. In the pre-protocol period (2009), the annual cost per patient was €10,812 with ETN, €10,942 with IFX, €12,961 with ADA and €12,739 with ABA. By 1 January 2013, the annual cost per patient was €9,469 with ETN, €10,579 with IFX, €11,117 with ADA, €13,540 with ABA and €14,932 with TCZ. The creation of our Commission of Biological Therapies has been key to the rational management of RA patients and the optimization of resources, allowing us to save €200,000 over the two years following protocol implementation.
2012-11-01
Research, Development, Test, and Evaluation (RDT&E) Appropriations. The RDT&E appropriation consists of the mission program budgets for all... research, development, test and evaluation work performed by contractors and government installations and includes an installations and activities budget... than $4,000,000. f. Research, Development, Test, and Evaluation Appropriations. The Research, Development, Test, and Evaluation (RDT&E
Kerr, Cliff C.; Haghparast-Bidgoli, Hassan; Estill, Janne; Grobicki, Laura; Baranczuk, Zofia; Prieto, Lorena; Montañez, Vilma; Reporter, Iyanoosh; Gray, Richard T.; Skordis-Worrall, Jolene; Keiser, Olivia; Cheikh, Nejma; Boonto, Krittayawan; Osornprasop, Sutayut; Lavadenz, Fernando; Benedikt, Clemens J.; Martin-Hughes, Rowan; Hussain, S. Azfar; Kelly, Sherrie L.; Kedziora, David J.; Wilson, David P.
2017-01-01
Background Prioritizing investments across health interventions is complicated by the nonlinear relationship between intervention coverage and epidemiological outcomes. It can be difficult for countries to know which interventions to prioritize for greatest epidemiological impact, particularly when budgets are uncertain. Methods We examined four case studies of HIV epidemics in diverse settings, each with different characteristics. These case studies were based on public data available for Belarus, Peru, Togo, and Myanmar. The Optima HIV model and software package was used to estimate the optimal distribution of resources across interventions associated with a range of budget envelopes. We constructed “investment staircases”, a useful tool for understanding investment priorities. These were used to estimate the best attainable cost-effectiveness of the response at each investment level. Findings We find that when budgets are very limited, the optimal HIV response consists of a smaller number of ‘core’ interventions. As budgets increase, those core interventions should first be scaled up, and then new interventions introduced. We estimate that the cost-effectiveness of HIV programming decreases as investment levels increase, but that the overall cost-effectiveness remains below GDP per capita. Significance It is important for HIV programming to respond effectively to the overall level of funding availability. The analytic tools presented here can help program planners understand the most cost-effective HIV responses and plan for an uncertain future. PMID:28972975
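The "investment staircase" idea can be illustrated with a generic allocation toy: give each intervention a diminishing-returns impact curve, allocate increasing budget envelopes greedily to the best marginal use of funds, and record which interventions enter the funded set at each level. This is not the Optima HIV model or its data; intervention names, curves, and budgets are assumptions for illustration.

```python
# Generic investment-staircase sketch with saturating impact curves.
import math

interventions = {  # name: (max infections averted, spending scale in $M)
    "condom_program":       (4000, 2.0),
    "ART_scaleup":          (9000, 6.0),
    "PrEP_key_populations": (2500, 3.0),
    "testing_campaign":     (3000, 4.0),
}

def averted(name, spend):
    """Saturating (concave) impact curve for one intervention."""
    mx, scale = interventions[name]
    return mx * (1.0 - math.exp(-spend / scale))

def allocate(budget, step=0.1):
    """Greedy marginal allocation; close to optimal because the curves are concave."""
    spend = {n: 0.0 for n in interventions}
    remaining = budget
    while remaining > 1e-9:
        gains = {n: averted(n, spend[n] + step) - averted(n, spend[n])
                 for n in interventions}
        winner = max(gains, key=gains.get)
        spend[winner] += step
        remaining -= step
    return spend

for budget in (2, 5, 10, 20):  # budget envelopes in $M
    spend = allocate(budget)
    funded = sorted(n for n, s in spend.items() if s > 0.5)  # the "core" set
    total = sum(averted(n, s) for n, s in spend.items())
    print(f"${budget:>2}M: {total:6.0f} infections averted; funded: {funded}")
```

At small budgets only a couple of interventions receive meaningful funding; as the envelope grows, those are scaled up before new ones enter, which is the staircase pattern the abstract describes.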
Pathogenic Germline Variants in 10,389 Adult Cancers.
Huang, Kuan-Lin; Mashl, R Jay; Wu, Yige; Ritter, Deborah I; Wang, Jiayin; Oh, Clara; Paczkowska, Marta; Reynolds, Sheila; Wyczalkowski, Matthew A; Oak, Ninad; Scott, Adam D; Krassowski, Michal; Cherniack, Andrew D; Houlahan, Kathleen E; Jayasinghe, Reyka; Wang, Liang-Bo; Zhou, Daniel Cui; Liu, Di; Cao, Song; Kim, Young Won; Koire, Amanda; McMichael, Joshua F; Hucthagowder, Vishwanathan; Kim, Tae-Beom; Hahn, Abigail; Wang, Chen; McLellan, Michael D; Al-Mulla, Fahd; Johnson, Kimberly J; Lichtarge, Olivier; Boutros, Paul C; Raphael, Benjamin; Lazar, Alexander J; Zhang, Wei; Wendl, Michael C; Govindan, Ramaswamy; Jain, Sanjay; Wheeler, David; Kulkarni, Shashikant; Dipersio, John F; Reimand, Jüri; Meric-Bernstam, Funda; Chen, Ken; Shmulevich, Ilya; Plon, Sharon E; Chen, Feng; Ding, Li
2018-04-05
We conducted the largest investigation of predisposition variants in cancer to date, discovering 853 pathogenic or likely pathogenic variants in 8% of 10,389 cases from 33 cancer types. Twenty-one genes showed single or cross-cancer associations, including novel associations of SDHA in melanoma and PALB2 in stomach adenocarcinoma. The 659 predisposition variants and 18 additional large deletions in tumor suppressors, including ATM, BRCA1, and NF1, showed low gene expression and frequent (43%) loss of heterozygosity or biallelic two-hit events. We also discovered 33 such variants in oncogenes, including missenses in MET, RET, and PTPN11 associated with high gene expression. We nominated 47 additional predisposition variants from prioritized VUSs supported by multiple lines of evidence, including case-control frequency, loss of heterozygosity, expression effect, and co-localization with mutations and modified residues. Our integrative approach links rare predisposition variants to functional consequences, informing future guidelines for variant classification and germline genetic testing in cancer. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
New High Throughput Methods to Estimate Chemical ...
EPA has made many recent advances in high throughput bioactivity testing. However, concurrent advances in rapid, quantitative prediction of human and ecological exposures have been lacking, despite the clear importance of both measures for a risk-based approach to prioritizing and screening chemicals. A recent report by the National Research Council of the National Academies, Exposure Science in the 21st Century: A Vision and a Strategy (NRC 2012) laid out a number of applications in chemical evaluation of both toxicity and risk in critical need of quantitative exposure predictions, including screening and prioritization of chemicals for targeted toxicity testing, focused exposure assessments or monitoring studies, and quantification of population vulnerability. Despite these significant needs, for the majority of chemicals (e.g. non-pesticide environmental compounds) there are no or limited estimates of exposure. For example, exposure estimates exist for only 7% of the ToxCast Phase II chemical list. In addition, the data required for generating exposure estimates for large numbers of chemicals is severely lacking (Egeghy et al. 2012). This SAP reviewed the use of EPA's ExpoCast model to rapidly estimate potential chemical exposures for prioritization and screening purposes. The focus was on bounded chemical exposure values for people and the environment for the Endocrine Disruptor Screening Program (EDSP) Universe of Chemicals. In addition to exposure, the SAP
Fay, Kellie A; Villeneuve, Daniel L; Swintek, Joe; Edwards, Stephen W; Nelms, Mark D; Blackwell, Brett R; Ankley, Gerald T
2018-06-01
The U.S. Environmental Protection Agency's ToxCast program has screened thousands of chemicals for biological activity, primarily using high-throughput in vitro bioassays. Adverse outcome pathways (AOPs) offer a means to link pathway-specific biological activities with potential apical effects relevant to risk assessors. Thus, efforts are underway to develop AOPs relevant to pathway-specific perturbations detected in ToxCast assays. Previous work identified a "cytotoxic burst" (CTB) phenomenon wherein large numbers of the ToxCast assays begin to respond at or near test chemical concentrations that elicit cytotoxicity, and a statistical approach to defining the bounds of the CTB was developed. To focus AOP development on the molecular targets corresponding to ToxCast assays indicating pathway-specific effects, we conducted a meta-analysis to identify which assays most frequently respond at concentrations below the CTB. A preliminary list of potentially important, target-specific assays was determined by ranking assays by the fraction of chemical hits below the CTB compared with the number of chemicals tested. Additional priority assays were identified using a diagnostic-odds-ratio approach which gives greater ranking to assays with high specificity but low responsivity. Combined, the two prioritization methods identified several novel targets (e.g., peripheral benzodiazepine and progesterone receptors) to prioritize for AOP development, and affirmed the importance of a number of existing AOPs aligned with ToxCast targets (e.g., thyroperoxidase, estrogen receptor, aromatase). The prioritization approaches did not appear to be influenced by inter-assay differences in chemical bioavailability. Furthermore, the outcomes were robust based on a variety of different parameters used to define the CTB.
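To make the two ranking criteria described above concrete, the sketch below scores a few hypothetical assays by (a) the fraction of tested chemicals with hits below the cytotoxic burst (CTB) and (b) a Haldane-corrected diagnostic odds ratio that rewards assays responding rarely but specifically. The 2x2 cell definitions and all counts are illustrative assumptions, not the published meta-analysis of ToxCast data.

```python
# Two toy prioritization metrics for target-specific assays.
assays = {
    # assay: 2x2 counts over the tested chemical library (illustrative)
    #   tp = hits below the CTB (specific activity)
    #   fp = hits only at/above the CTB (likely cytotoxicity-driven)
    #   fn = chemicals expected to act on this target that the assay missed
    #   tn = chemicals correctly inactive
    "thyroperoxidase":       dict(tp=40, fp=15, fn=10, tn=920),
    "estrogen_receptor":     dict(tp=25, fp=5,  fn=8,  tn=952),
    "progesterone_receptor": dict(tp=6,  fp=1,  fn=2,  tn=978),
}

def hit_fraction(c):
    """Fraction of tested chemicals with a hit below the CTB."""
    tested = sum(c.values())
    return c["tp"] / tested

def diagnostic_odds_ratio(c, k=0.5):
    """Haldane-corrected DOR = (TP*TN)/(FP*FN); high values flag assays that
    respond rarely but specifically (low responsivity, high specificity)."""
    return ((c["tp"] + k) * (c["tn"] + k)) / ((c["fp"] + k) * (c["fn"] + k))

for name, c in sorted(assays.items(),
                      key=lambda kv: -diagnostic_odds_ratio(kv[1])):
    print(f"{name:22s} hit fraction {hit_fraction(c):.3f}  "
          f"DOR {diagnostic_odds_ratio(c):7.1f}")
```

Note how the rarely responding but highly specific assay can outrank the more responsive ones on the odds-ratio metric, which is the motivation given for combining the two approaches.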
Raizada, Neeraj; Khaparde, Sunil D; Salhotra, Virender Singh; Rao, Raghuram; Kalra, Aakshi; Swaminathan, Soumya; Khanna, Ashwani; Chopra, Kamal Kishore; Hanif, M; Singh, Varinder; Umadevi, K R; Nair, Sreenivas Achuthan; Huddart, Sophie; Prakash, C H Surya; Mall, Shalini; Singh, Pooja; Saha, B K; Denkinger, Claudia M; Boehme, Catharina; Sarin, Sanjay
2018-01-01
Diagnosis of TB in children is challenging and is largely based on a positive history of contact with a TB case and on clinical and radiological findings, often without microbiological confirmation. Diagnostic efforts are also undermined by challenges in specimen collection and the limited availability of highly sensitive, rapid diagnostic tests with a quick turnaround time. The current project was undertaken in four major cities of India to address TB diagnostic challenges in the pediatric population by offering free-of-cost Xpert testing to pediatric presumptive TB cases, thereby paving the way for better TB care. A high-throughput lab was established in each of the four project cities and linked to various health care providers across the city through rapid specimen transportation and electronic reporting linkages. Free Xpert testing was offered to all pediatric (0-14 years) presumptive TB cases (both pulmonary and extra-pulmonary) seeking care at public and private health facilities. The project enrolled 42,238 pediatric presumptive TB cases from April 2014 to June 2016. A total of 3,340 (7.91%, CI 7.65-8.17) bacteriologically confirmed TB cases were detected, of which 295 (8.83%, CI 7.9-9.86) were rifampicin-resistant. The level of rifampicin resistance in the project cohort was high. Overall, Xpert yielded a high proportion of valid results, and TB detection rates were more than three-fold higher than with smear microscopy. The project provided same-day testing, and the early availability of results led to rapid treatment initiation, high treatment success rates, and very low rates of treatment failure and loss to follow-up. The project demonstrated the feasibility of rolling out rapid and upfront Xpert testing for pediatric presumptive TB cases through a single Xpert lab per city in an efficient manner. Rapid turnaround testing time facilitated prompt and appropriate treatment initiation. These results suggest that the upfront Xpert assay is a promising solution to address TB diagnosis in children. The high levels of rifampicin resistance detected in presumptive pediatric TB patients tested under the project are a major cause of concern from a public health perspective, which underscores the need to further prioritize upfront Xpert access for this vulnerable population.
Raizada, Neeraj; Khaparde, Sunil D.; Salhotra, Virender Singh; Rao, Raghuram; Kalra, Aakshi; Swaminathan, Soumya; Khanna, Ashwani; Chopra, Kamal Kishore; Hanif, M.; Singh, Varinder; Umadevi, K. R.; Nair, Sreenivas Achuthan; Huddart, Sophie; Prakash, C. H. Surya; Mall, Shalini; Singh, Pooja; Saha, B. K.; Denkinger, Claudia M.; Boehme, Catharina
2018-01-01
Background Diagnosis of TB in children is challenging and is largely based on a positive history of contact with a TB case and on clinical and radiological findings, often without microbiological confirmation. Diagnostic efforts are also undermined by challenges in specimen collection and the limited availability of highly sensitive, rapid diagnostic tests with a quick turnaround time. The current project was undertaken in four major cities of India to address TB diagnostic challenges in the pediatric population by offering free-of-cost Xpert testing to pediatric presumptive TB cases, thereby paving the way for better TB care. Methods A high-throughput lab was established in each of the four project cities and linked to various health care providers across the city through rapid specimen transportation and electronic reporting linkages. Free Xpert testing was offered to all pediatric (0–14 years) presumptive TB cases (both pulmonary and extra-pulmonary) seeking care at public and private health facilities. Results The project enrolled 42,238 pediatric presumptive TB cases from April 2014 to June 2016. A total of 3,340 (7.91%, CI 7.65–8.17) bacteriologically confirmed TB cases were detected, of which 295 (8.83%, CI 7.9–9.86) were rifampicin-resistant. The level of rifampicin resistance in the project cohort was high. Overall, Xpert yielded a high proportion of valid results, and TB detection rates were more than three-fold higher than with smear microscopy. The project provided same-day testing, and the early availability of results led to rapid treatment initiation, high treatment success rates, and very low rates of treatment failure and loss to follow-up. Conclusion The project demonstrated the feasibility of rolling out rapid and upfront Xpert testing for pediatric presumptive TB cases through a single Xpert lab per city in an efficient manner. Rapid turnaround testing time facilitated prompt and appropriate treatment initiation. These results suggest that the upfront Xpert assay is a promising solution to address TB diagnosis in children. The high levels of rifampicin resistance detected in presumptive pediatric TB patients tested under the project are a major cause of concern from a public health perspective, which underscores the need to further prioritize upfront Xpert access for this vulnerable population. PMID:29489887
Hancock, W Thane; Soeters, Heidi M; Hills, Susan L; Link-Gelles, Ruth; Evans, Mary E; Daley, W Randolph; Piercefield, Emily; Anesi, Magele Scott; Mataia, Mary Aseta; Uso, Anaise M; Sili, Benjamin; Tufa, Aifili John; Solaita, Jacqueline; Irvin-Barnwell, Elizabeth; Meaney-Delman, Dana; Wilken, Jason; Weidle, Paul; Toews, Karrie-Ann E; Walker, William; Talboy, Phillip M; Gallo, William K; Krishna, Nevin; Laws, Rebecca L; Reynolds, Megan R; Koneru, Alaya; Gould, Carolyn V
2017-03-24
The first patients with laboratory-confirmed cases of Zika virus disease in American Samoa had symptom onset in January 2016 (1). In response, the American Samoa Department of Health (ASDoH) implemented mosquito control measures (1), strategies to protect pregnant women (1), syndromic surveillance based on electronic health record (EHR) reports (1), Zika virus testing of persons with one or more signs or symptoms of Zika virus disease (fever, rash, arthralgia, or conjunctivitis) (1-3), and routine testing of all asymptomatic pregnant women in accordance with CDC guidance (2,3). All collected blood and urine specimens were shipped to the Hawaii Department of Health Laboratory for Zika virus testing and to CDC for confirmatory testing. Early in the response, collection and testing of specimens from pregnant women were prioritized over collection from symptomatic nonpregnant patients because of limited testing and shipping capacity. The weekly numbers of suspected Zika virus disease cases declined from an average of six per week in January-February 2016 to one per week in May 2016. By August, the EHR-based syndromic surveillance (1) indicated a return to pre-outbreak levels. The last Zika virus disease case detected by real-time, reverse transcription-polymerase chain reaction (rRT-PCR) occurred in a patient who had symptom onset on June 19, 2016. In August 2016, ASDoH requested CDC support in assessing whether local transmission had been reduced or interrupted and in proposing a timeline for discontinuation of routine testing of asymptomatic pregnant women. An end date (October 15, 2016) was determined for active mosquito-borne transmission of Zika virus, and a timeline was developed for discontinuation of routine screening of asymptomatic pregnant women in American Samoa (conception after December 10, 2016, with permissive testing for asymptomatic women who conceive through April 15, 2017).
Identifying Mendelian disease genes with the Variant Effect Scoring Tool
2013-01-01
Background Whole exome sequencing studies identify hundreds to thousands of rare protein coding variants of ambiguous significance for human health. Computational tools are needed to accelerate the identification of specific variants and genes that contribute to human disease. Results We have developed the Variant Effect Scoring Tool (VEST), a supervised machine learning-based classifier, to prioritize rare missense variants with likely involvement in human disease. The VEST classifier training set comprised ~45,000 disease mutations from the latest Human Gene Mutation Database release and another ~45,000 high frequency (allele frequency >1%) putatively neutral missense variants from the Exome Sequencing Project. VEST outperforms some of the most popular methods for prioritizing missense variants in carefully designed holdout benchmarking experiments (VEST ROC AUC = 0.91, PolyPhen2 ROC AUC = 0.86, SIFT4.0 ROC AUC = 0.84). VEST estimates variant score p-values against a null distribution of VEST scores for neutral variants not included in the VEST training set. These p-values can be aggregated at the gene level across multiple disease exomes to rank genes for probable disease involvement. We tested the ability of an aggregate VEST gene score to identify candidate Mendelian disease genes, based on whole-exome sequencing of a small number of disease cases. We used whole-exome data for two Mendelian disorders for which the causal gene is known. Considering only genes that contained variants in all cases, the VEST gene score ranked dihydroorotate dehydrogenase (DHODH) number 2 of 2253 genes in four cases of Miller syndrome, and myosin-3 (MYH3) number 2 of 2313 genes in three cases of Freeman-Sheldon syndrome. Conclusions Our results demonstrate the potential power gain of aggregating bioinformatics variant scores into gene-level scores and the general utility of bioinformatics in assisting the search for disease genes in large-scale exome sequencing studies. VEST is available as a stand-alone software package at http://wiki.chasmsoftware.org and is hosted by the CRAVAT web server at http://www.cravat.us PMID:23819870
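One generic way to turn per-variant scores into a gene-level ranking across cases, in the spirit of the aggregation described above, is to combine variant p-values with Fisher's method and rank genes by the combined statistic. The choice of Fisher's method and the p-values below are assumptions for illustration; this is not the VEST implementation or its data.

```python
# Aggregating per-variant p-values into gene-level evidence (toy example).
import math

# gene -> list of variant-level p-values observed across the sequenced cases
variant_pvalues = {
    "DHODH": [0.001, 0.004, 0.002, 0.008],  # damaging variants in all cases
    "MYH3":  [0.003, 0.010, 0.005],
    "TTN":   [0.40, 0.55, 0.30, 0.62],      # large gene, mostly benign hits
}

def fisher_statistic(pvals):
    """-2 * sum(ln p); larger values indicate stronger combined evidence.
    In practice one would convert this to a p-value against a chi-square
    distribution with 2k degrees of freedom so genes with different numbers
    of variants are compared fairly."""
    return -2.0 * sum(math.log(p) for p in pvals)

ranking = sorted(variant_pvalues, key=lambda g: -fisher_statistic(variant_pvalues[g]))
for gene in ranking:
    print(f"{gene:6s} combined statistic = {fisher_statistic(variant_pvalues[gene]):.1f}")
```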
Bundschuh, Rebecca; Kuhn, Ulrike; Bundschuh, Mirco; Naegele, Caroline; Elsaesser, David; Schlechtriemen, Ulrich; Oehen, Bernadette; Hilbeck, Angelika; Otto, Mathias; Schulz, Ralf; Hofmann, Frieder
2016-03-15
Crop plant residues may enter aquatic ecosystems via wind deposition or surface runoff. In the case of genetically modified crops or crops treated with systemic pesticides, these materials may contain insecticidal Bt toxins or pesticides that potentially affect aquatic life. However, the particular exposure pattern of aquatic ecosystems (i.e., via plant material) is not properly reflected in current risk assessment schemes, which primarily focus on waterborne toxicity and not on plant material as the route of uptake. To assist in risk assessment, the present study proposes a prioritization procedure of stream types based on the freshwater network and crop-specific cultivation data using maize in Germany as a model system. To identify stream types with a high probability of receiving crop materials, we developed a formalized, criteria-based and thus transparent procedure that considers the exposure-related parameters, ecological status--an estimate of the diversity and potential vulnerability of local communities towards anthropogenic stress--and availability of uncontaminated reference sections. By applying the procedure to maize, ten stream types out of 38 are expected to be the most relevant if the ecological effects from plant-incorporated pesticides need to be evaluated. This information is an important first step to identifying habitats within these stream types with a high probability of receiving crop plant material at a more local scale, including accumulation areas. Moreover, the prioritization procedure developed in the present study may support the selection of aquatic species for ecotoxicological testing based on their probability of occurrence in stream types having a higher chance of exposure. Finally, this procedure can be adapted to any geographical region or crop of interest and is, therefore, a valuable tool for a site-specific risk assessment of crop plants carrying systemic pesticides or novel proteins, such as insecticidal Bt toxins, expressed in genetically modified crops. Copyright © 2015 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, P E
Tips and case histories on computer use for idea and outline processing: Productivity software to solve problems of idea hierarchy, transitions, and developments is matched to solutions for communicators. One case is text that ranges from methods and procedures to histories and legal definitions of classification for the US Department of Energy. Applications of value to writers, editors, and managers are for research; calendars; creativity; prioritization; idea discovery and manipulation; file and time management; and contents, indexes, and glossaries. 6 refs., 7 figs.
2012-01-01
Background Deciding which health technologies to fund involves confronting some of the most difficult choices in medicine. As for other countries, the Israeli health system is faced each year with having to make these difficult decisions. The Public National Advisory Committee, known as ‘the Basket Committee’, selects new technologies for the basic list of health care that all Israelis are entitled to access, known as the ‘health basket’. We introduce a framework for health technology prioritization based explicitly on value for money that enables the main variables considered by decision-makers to be explicitly included. Although the framework’s exposition is in terms of the Basket Committee selecting new technologies for Israel’s health basket, we believe that the framework would also work well for other countries. Methods Our proposed prioritization framework involves comparing four main variables for each technology: 1. Incremental benefits, including ‘equity benefits’, to Israel’s population; 2. Incremental total cost to Israel’s health system; 3. Quality of evidence; and 4. Any additional ‘X-factors’ not elsewhere included, such as strategic or legal factors, etc. Applying methodology from multi-criteria decision analysis, the multiple dimensions comprising the first variable are aggregated via a points system. Results The four variables are combined for each technology and compared across the technologies in the ‘Value for Money (VfM) Chart’. The VfM Chart can be used to identify technologies that are good value for money, and, given a budget constraint, to select technologies that should be funded. This is demonstrated using 18 illustrative technologies. Conclusions The VfM Chart is an intuitively appealing decision-support tool for helping decision-makers to focus on the inherent tradeoffs involved in health technology prioritization. Such deliberations can be performed in a systematic and transparent fashion that can also be easily communicated to stakeholders, including the general public. Possible future research includes pilot-testing the VfM Chart using real-world data. Ideally, this would involve working with the Basket Committee. Likewise, the framework could be tested and applied by health technology prioritization agencies in other countries. PMID:23181391
Golan, Ofra; Hansen, Paul
2012-11-26
Deciding which health technologies to fund involves confronting some of the most difficult choices in medicine. As for other countries, the Israeli health system is faced each year with having to make these difficult decisions. The Public National Advisory Committee, known as 'the Basket Committee', selects new technologies for the basic list of health care that all Israelis are entitled to access, known as the 'health basket'. We introduce a framework for health technology prioritization based explicitly on value for money that enables the main variables considered by decision-makers to be explicitly included. Although the framework's exposition is in terms of the Basket Committee selecting new technologies for Israel's health basket, we believe that the framework would also work well for other countries. Our proposed prioritization framework involves comparing four main variables for each technology: 1. Incremental benefits, including 'equity benefits', to Israel's population; 2. Incremental total cost to Israel's health system; 3. Quality of evidence; and 4. Any additional 'X-factors' not elsewhere included, such as strategic or legal factors, etc. Applying methodology from multi-criteria decision analysis, the multiple dimensions comprising the first variable are aggregated via a points system. The four variables are combined for each technology and compared across the technologies in the 'Value for Money (VfM) Chart'. The VfM Chart can be used to identify technologies that are good value for money, and, given a budget constraint, to select technologies that should be funded. This is demonstrated using 18 illustrative technologies. The VfM Chart is an intuitively appealing decision-support tool for helping decision-makers to focus on the inherent tradeoffs involved in health technology prioritization. Such deliberations can be performed in a systematic and transparent fashion that can also be easily communicated to stakeholders, including the general public. Possible future research includes pilot-testing the VfM Chart using real-world data. Ideally, this would involve working with the Basket Committee. Likewise, the framework could be tested and applied by health technology prioritization agencies in other countries.
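A minimal sketch of the points-based aggregation described above: criterion scores are weighted and summed into a benefit score, which can then be set against incremental cost to compare technologies. The criteria, weights, technologies, and costs are invented for illustration and do not reproduce the VfM Chart itself.

```python
# Toy points-based aggregation of benefit criteria, compared against cost.
weights = {"health_gain": 0.5, "equity_benefit": 0.2,
           "quality_of_evidence": 0.2, "x_factor": 0.1}

technologies = {
    # technology: (criterion scores on a 0-100 scale, incremental cost per year)
    "new_oncology_drug":   ({"health_gain": 80, "equity_benefit": 30,
                             "quality_of_evidence": 70, "x_factor": 10}, 120),
    "hepatitis_screening": ({"health_gain": 60, "equity_benefit": 70,
                             "quality_of_evidence": 80, "x_factor": 20}, 25),
    "telemonitoring":      ({"health_gain": 40, "equity_benefit": 50,
                             "quality_of_evidence": 40, "x_factor": 60}, 15),
}

def points(scores):
    """Weighted sum of criterion scores (the aggregated benefit points)."""
    return sum(weights[c] * s for c, s in scores.items())

# Rank by benefit points per unit cost -- one simple reading of "value for money".
for name, (scores, cost) in sorted(
        technologies.items(), key=lambda kv: -points(kv[1][0]) / kv[1][1]):
    print(f"{name:20s} points {points(scores):5.1f}  cost {cost:>4}  "
          f"points/cost {points(scores) / cost:.2f}")
```

Given a budget constraint, technologies would then be funded in descending order of value for money until the budget is exhausted, which is the selection logic the framework visualizes.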
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prindle, N.H.; Mendenhall, F.T.; Trauth, K.
1996-05-01
The Systems Prioritization Method (SPM) is a decision-aiding tool developed by Sandia National Laboratories (SNL). SPM provides an analytical basis for supporting programmatic decisions for the Waste Isolation Pilot Plant (WIPP) to meet selected portions of the applicable US EPA long-term performance regulations. The first iteration of SPM (SPM-1), the prototype for SPM, was completed in 1994. It served as a benchmark and a test bed for developing the tools needed for the second iteration of SPM (SPM-2). SPM-2, completed in 1995, is intended for programmatic decision making. This is Volume II of the three-volume final report of the second iteration of the SPM. It describes the technical input and model implementation for SPM-2, and presents the SPM-2 technical baseline and the activities, activity outcomes, outcome probabilities, and the input parameters for SPM-2 analysis.
Diroma, Maria Angela; Santorsola, Mariangela; Guttà, Cristiano; Gasparre, Giuseppe; Picardi, Ernesto; Pesole, Graziano; Attimonelli, Marcella
2014-01-01
Motivation: The increasing availability of mitochondria-targeted and off-target sequencing data in whole-exome and whole-genome sequencing studies (WXS and WGS) has raised the demand for effective pipelines to accurately measure heteroplasmy and to easily recognize the most functionally important mitochondrial variants among a huge number of candidates. To this end, we developed MToolBox, a highly automated pipeline to reconstruct and analyze human mitochondrial DNA from high-throughput sequencing data. Results: MToolBox implements an effective computational strategy for mitochondrial genome assembly and haplogroup assignment, also including a prioritization analysis of detected variants. MToolBox provides a Variant Call Format file featuring, for the first time, allele-specific heteroplasmy, and annotation files with prioritized variants. MToolBox was tested on simulated samples and applied to 1000 Genomes WXS datasets. Availability and implementation: The MToolBox package is available at https://sourceforge.net/projects/mtoolbox/. Contact: marcella.attimonelli@uniba.it Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25028726
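The allele-specific heteroplasmy that the pipeline reports is, at its core, the fraction of reads carrying the alternate allele at a mitochondrial position. The sketch below computes that fraction with a Wilson confidence interval; it is a simplified stand-in under that assumption, not the MToolBox code, and the example read counts are invented.

```python
# Heteroplasmy fraction from read counts at one mtDNA position, with a
# Wilson score confidence interval for the binomial proportion.
import math

def heteroplasmy(alt_reads, total_reads, z=1.96):
    """Return (fraction, lower bound, upper bound) of the heteroplasmic fraction."""
    p = alt_reads / total_reads
    denom = 1 + z**2 / total_reads
    centre = (p + z**2 / (2 * total_reads)) / denom
    half = z * math.sqrt(p * (1 - p) / total_reads
                         + z**2 / (4 * total_reads**2)) / denom
    return p, max(0.0, centre - half), min(1.0, centre + half)

# e.g. a variant observed on 180 of 950 reads covering the position
frac, lo, hi = heteroplasmy(alt_reads=180, total_reads=950)
print(f"heteroplasmy = {frac:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```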
Armitage, Emily G; Godzien, Joanna; Peña, Imanol; López-Gonzálvez, Ángeles; Angulo, Santiago; Gradillas, Ana; Alonso-Herranz, Vanesa; Martín, Julio; Fiandor, Jose M; Barrett, Michael P; Gabarro, Raquel; Barbas, Coral
2018-05-18
A lack of viable hits, increasing resistance, and limited knowledge on mode of action is hindering drug discovery for many diseases. To optimize prioritization and accelerate the discovery process, a strategy to cluster compounds based on more than chemical structure is required. We show the power of metabolomics in comparing effects on metabolism of 28 different candidate treatments for Leishmaniasis (25 from the GSK Leishmania box, two analogues of Leishmania box series, and amphotericin B as a gold standard treatment), tested in the axenic amastigote form of Leishmania donovani. Capillary electrophoresis-mass spectrometry was applied to identify the metabolic profile of Leishmania donovani, and principal components analysis was used to cluster compounds on potential mode of action, offering a medium throughput screening approach in drug selection/prioritization. The comprehensive and sensitive nature of the data has also made detailed effects of each compound obtainable, providing a resource to assist in further mechanistic studies and prioritization of these compounds for the development of new antileishmanial drugs.
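The clustering step described above, projecting compound-induced metabolic profiles into principal-component space so that compounds with similar effects land near one another, can be sketched as follows, assuming NumPy and scikit-learn are available. The data are randomly generated stand-ins for CE-MS fold changes, not the study's measurements.

```python
# PCA on per-metabolite fold changes to group compounds by metabolic effect.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_compounds, n_metabolites = 28, 150
# log2 fold changes of metabolite levels (treated vs untreated parasites)
profiles = rng.normal(size=(n_compounds, n_metabolites))
# two hypothetical mode-of-action groups shift distinct sets of metabolites
profiles[:10, :20] += 2.0
profiles[10:18, 20:45] -= 1.5

pca = PCA(n_components=2)
coords = pca.fit_transform(profiles)  # nearby compounds = similar metabolic effect
print("explained variance ratio:", pca.explained_variance_ratio_.round(2))
print("first three compounds in PC space:\n", coords[:3].round(2))
```

Compounds that cluster with a reference treatment of known mechanism would then be prioritized or deprioritized on that basis, which is the screening use the abstract proposes.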
Communities are concerned over pollution levels and seek methods to systematically identify and prioritize the environmental stressors in their communities. Geographic information system (GIS) maps of environmental information can be useful tools for communities in their assessm...
Scope Management: A Core Information System Implementation Project Pedagogy
ERIC Educational Resources Information Center
Léger, Pierre-Majorique; Lyle, Derick; Babin, Gilbert; Charland, Patrick; Pellerin, Robert
2013-01-01
This article describes an initiative to provide IS management a capstone course that builds on the zone of proximal development concept, oriented towards developing prioritization and critical reasoning skills, and to promote self-learning. Request for proposal business cases appear to offer effective mechanisms for retaining context, while…
Prioritizing Social and Moral Learning Amid Conservative Curriculum Trends: Spaces of Possibility
ERIC Educational Resources Information Center
Keddie, Amanda
2015-01-01
Conservative trends across western schooling contexts are signalling an explicit devaluing of social and moral learning within their official curriculum mandates. These mandates are increasingly privileging the "academic rigour" of traditional subject disciplines. This paper draws on interview and observation data from a case study of a…
The Administrative Use of Microcomputers. Technical Report.
ERIC Educational Resources Information Center
Alabama Univ., University. Coll. of Education.
Citing the growing interest in using microcomputers as an aid in educational administration, this report discusses factors that must be considered when purchasing a computer and describes an actual case of computer implementation for administrative purposes. The first steps in the purchasing process are to assess and to prioritize the…
The HESI-led RISK21 effort has developed a framework supporting the use of twenty first century technology in obtaining and using information for chemical risk assessment. This framework represents a problem formulation-based, exposure-driven, tiered data acquisition approach tha...
The Balancing Act: Arts Integration and High-Stakes Testing
ERIC Educational Resources Information Center
Van Eman, Linnea; Thorman, Jerilyn; Montgomery, Diane; Otto, Stacy
2007-01-01
This study describes three teachers and their experiences of an arts-integration reform model amidst the high-stakes accountability movement. Their struggle to practice arts integration within their school district, a culture in which high-stakes testing is prioritized, is described by way of a circus metaphor. Through the theoretical lens of Self…
Research efforts by the US Environmental Protection Agency have set out to develop alternative testing programs to prioritize limited testing resources toward chemicals that likely represent the greatest hazard to human health and the environment. Efforts such as EPA’s ToxCast r...
We are evaluating methods to screen/prioritize large numbers of chemicals using 6 day old zebrafish (Danio rerio) as an alternative model for detecting neurotoxic effects. Our behavioral testing paradigm simultaneously tests individual larval zebrafish under sequential light and...
In accordance with recommendations contained in a National Research Council Report on Toxicity Testing in the 21st Century: A Vision and Strategy, as well as European goals pertaining to reducing, refining, and replacing the use of animals in ecotoxicology safety testing there is...
The U.S. Environmental Protection Agency is evaluating methods to screen and prioritize large numbers of chemicals using 6 day old zebrafish (Danio rerio) as an alternative test model for detecting neurotoxic chemicals. We use a behavioral testing paradigm that simultaneously tes...
Effective Rating Scale Development for Speaking Tests: Performance Decision Trees
ERIC Educational Resources Information Center
Fulcher, Glenn; Davidson, Fred; Kemp, Jenny
2011-01-01
Rating scale design and development for testing speaking is generally conducted using one of two approaches: the measurement-driven approach or the performance data-driven approach. The measurement-driven approach prioritizes the ordering of descriptors onto a single scale. Meaning is derived from the scaling methodology and the agreement of…
Tox21: Putting a Lens on the Vision of Toxicity Testing in the 21st Century
In response to the release of the NRC report on "Toxicity Testing in the 21st Century, a Vision and Strategy" (NRC, 2007), two NIH institutes and EPA formed a collaboration (Tox21) to 1) identify mechanisms of chemically induced biological activity, 2) prioritize chemicals for mo...
The National Academies report on Toxicity Testing in the 21st Century envisioned the use of in vitro toxicity tests using cells of human origin to predict the ability of chemicals to cause toxicity in vivo. Successful implementation of this strategy will ultimately result in fast...
Interactogeneous: Disease Gene Prioritization Using Heterogeneous Networks and Full Topology Scores
Gonçalves, Joana P.; Francisco, Alexandre P.; Moreau, Yves; Madeira, Sara C.
2012-01-01
Disease gene prioritization aims to suggest potential implications of genes in disease susceptibility. Often accomplished in a guilt-by-association scheme, promising candidates are sorted according to their relatedness to known disease genes. Network-based methods have successfully exploited this concept by capturing the interaction of genes or proteins into a score. Nonetheless, most current approaches suffer from at least some of the following limitations: (1) networks comprise only curated physical interactions, leading to poor genome coverage and density, and bias toward a particular source; (2) scores focus on adjacencies (direct links) or the most direct paths (shortest paths) within a constrained neighborhood around the disease genes, ignoring potentially informative indirect paths; (3) global clustering is widely applied to partition the network in an unsupervised manner, attributing little importance to prior knowledge; (4) confidence weights and their contribution to edge differentiation and ranking reliability are often disregarded. We hypothesize that network-based prioritization related to local clustering on graphs and considering the full topology of weighted gene association networks integrating heterogeneous sources should overcome the above challenges. We term such a strategy Interactogeneous. We conducted cross-validation tests to assess the impact of network sources, alternative path inclusion and confidence weights on the prioritization of putative genes for 29 diseases. Heat diffusion ranking proved the best prioritization method overall, widening the gap over neighborhood and shortest-path scores mostly on single-source networks. Heterogeneous associations consistently delivered superior performance over single-source data across the majority of methods. Results on the contribution of confidence weights were inconclusive. Finally, the best Interactogeneous strategy, heat diffusion ranking with associations from the STRING database, was used to prioritize genes for Parkinson’s disease. This method effectively recovered known genes and uncovered interesting candidates which could be linked to pathogenic mechanisms of the disease. PMID:23185389
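A minimal sketch of heat diffusion ranking on a small weighted gene network: heat placed on seed (known disease) genes is propagated along confidence-weighted edges, and candidates are ranked by the heat they accumulate, so informative indirect paths also contribute to the score. The toy network, edge weights, seed genes, and diffusion parameter are assumptions for illustration, not the networks or settings used in the study.

```python
# Heat diffusion (random-walk-with-restart style) gene ranking on a toy network.
import numpy as np

genes = ["SNCA", "LRRK2", "PARK7", "GBA", "ACTB", "TP53"]
idx = {g: i for i, g in enumerate(genes)}

# symmetric confidence-weighted adjacency (e.g., STRING-like scores in [0, 1])
W = np.zeros((6, 6))
edges = [("SNCA", "LRRK2", 0.9), ("SNCA", "PARK7", 0.7), ("LRRK2", "GBA", 0.8),
         ("PARK7", "GBA", 0.6), ("ACTB", "TP53", 0.5), ("GBA", "TP53", 0.2)]
for a, b, w in edges:
    W[idx[a], idx[b]] = W[idx[b], idx[a]] = w

# column-normalize so each diffusion step conserves heat
Wn = W / np.maximum(W.sum(axis=0), 1e-12)

seeds = np.zeros(6)
seeds[[idx["SNCA"], idx["LRRK2"]]] = 0.5  # known disease genes share the initial heat

alpha = 0.7          # fraction of heat that keeps diffusing at each step
heat = seeds.copy()
for _ in range(100):  # iterate to (approximate) convergence
    heat = alpha * Wn @ heat + (1 - alpha) * seeds

for g in sorted(genes, key=lambda g: -heat[idx[g]]):
    print(f"{g:6s} {heat[idx[g]]:.3f}")
```

Genes connected to the seeds through strong or multiple paths (directly or indirectly) collect more heat and rise in the ranking, while weakly connected genes stay near the bottom.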
Case management information systems: how to put the pieces together now and beyond year 2000.
Matthews, P
1999-01-01
Healthcare organizations must establish the goals and objectives of their case management processes before functional and system requirements can be defined. A gap analysis will identify existing systems that can be used to support case management as well as areas in need of systems support. The gap analysis will also identify short-term tactical projects and long-term strategic initiatives supporting the automation of case management. The projects resulting from the gap analysis must be incorporated into the organization's business and information systems plan and budget to ensure appropriate funding and prioritization.
Failure mode and effects analysis: an empirical comparison of failure mode scoring procedures.
Ashley, Laura; Armitage, Gerry
2010-12-01
To empirically compare 2 different commonly used failure mode and effects analysis (FMEA) scoring procedures with respect to their resultant failure mode scores and prioritization: a mathematical procedure, where scores are assigned independently by FMEA team members and averaged, and a consensus procedure, where scores are agreed on by the FMEA team via discussion. A multidisciplinary team undertook a Healthcare FMEA of chemotherapy administration. This included mapping the chemotherapy process, identifying and scoring failure modes (potential errors) for each process step, and generating remedial strategies to counteract them. Failure modes were scored using both an independent mathematical procedure and a team consensus procedure. Almost three-fifths of the 30 failure modes generated were scored differently by the 2 procedures, and for just more than one-third of cases, the score discrepancy was substantial. Using the Healthcare FMEA prioritization cutoff score, almost twice as many failure modes were prioritized by the consensus procedure than by the mathematical procedure. This is the first study to empirically demonstrate that different FMEA scoring procedures can score and prioritize failure modes differently. It found considerable variability in individual team members' opinions on scores, which highlights the subjective and qualitative nature of failure mode scoring. A consensus scoring procedure may be most appropriate for FMEA as it allows variability in individuals' scores and rationales to become apparent and to be discussed and resolved by the team. It may also yield team learning and communication benefits unlikely to result from a mathematical procedure.
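The contrast between the two scoring procedures can be sketched as follows, assuming hypothetical severity and probability ratings from three team members and an illustrative cutoff that is not the Healthcare FMEA threshold.

```python
# Toy comparison of "mathematical" (average individual scores) versus
# "consensus" (single agreed score) FMEA scoring. Ratings are invented.
from statistics import mean

failure_modes = {
    "wrong dose calculated": {
        "individual": [(4, 3), (2, 4), (3, 2)],  # (severity, probability) per member
        "consensus": (4, 3),                      # score agreed through discussion
    },
    "label misread": {
        "individual": [(3, 2), (3, 1), (2, 2)],
        "consensus": (2, 2),
    },
}

CUTOFF = 8  # illustrative prioritization threshold, not the HFMEA value

for mode, scores in failure_modes.items():
    sev = mean(s for s, _ in scores["individual"])
    prob = mean(p for _, p in scores["individual"])
    math_score = sev * prob
    cons_score = scores["consensus"][0] * scores["consensus"][1]
    print(f"{mode}: mathematical={math_score:.1f} "
          f"(prioritized={math_score >= CUTOFF}), "
          f"consensus={cons_score} (prioritized={cons_score >= CUTOFF})")
```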
Prioritizing investments in innovations to protect women from the leading causes of maternal death
2014-01-01
PATH, an international nonprofit organization, assessed nearly 40 technologies for their potential to reduce maternal mortality from postpartum hemorrhage and preeclampsia and eclampsia in low-resource settings. The evaluation used a new Excel-based prioritization tool covering 22 criteria developed by PATH, the Maternal and Neonatal Directed Assessment of Technology (MANDATE) model, and consultations with experts. It identified five innovations with especially high potential: technologies to improve use of oxytocin, a uterine balloon tamponade, simplified dosing of magnesium sulfate, an improved proteinuria test, and better blood pressure measurement devices. Investments are needed to realize the potential of these technologies to reduce mortality. PMID:24405972
Prioritizing investments in innovations to protect women from the leading causes of maternal death.
Herrick, Tara M; Harner-Jay, Claudia M; Levisay, Alice M; Coffey, Patricia S; Free, Michael J; LaBarre, Paul D
2014-01-09
PATH, an international nonprofit organization, assessed nearly 40 technologies for their potential to reduce maternal mortality from postpartum hemorrhage and preeclampsia and eclampsia in low-resource settings. The evaluation used a new Excel-based prioritization tool covering 22 criteria developed by PATH, the Maternal and Neonatal Directed Assessment of Technology (MANDATE) model, and consultations with experts. It identified five innovations with especially high potential: technologies to improve use of oxytocin, a uterine balloon tamponade, simplified dosing of magnesium sulfate, an improved proteinuria test, and better blood pressure measurement devices. Investments are needed to realize the potential of these technologies to reduce mortality.
Gaines, Tommi L; Caldwell, Julia T; Ford, Chandra L; Mulatu, Mesfin S; Godette, Dionne C
2016-01-01
The Centers for Disease Control and Prevention's (CDC) expanded testing initiative (ETI) aims to bolster HIV testing among populations disproportionately affected by the HIV epidemic by providing additional funding to health departments serving these communities. ETI prioritizes testing in clinical settings; therefore, we examined the relationship between state-level ETI participation and past-year HIV testing among a racially/ethnically diverse sample of adult respondents to the 2012 Behavioral Risk Factor Surveillance System who accessed health services within the 12 months prior to being interviewed. Controlling for individual- and state-level characteristics in a multilevel logistic regression model, ETI participation was independently and positively associated with past-year testing, but this association varied by race/ethnicity. Hispanics had higher odds (adjusted odds ratio [AOR]: 1.49; 95% CI: 1.11-2.02) and American Indian/Alaska Natives had lower odds (AOR: 0.66; 95% CI: 0.43-0.99) of testing if they resided in states with (vs. without) ETI participation. State-level ETI participation did not significantly alter past-year testing among other racial/ethnic groups. Prioritizing public health resources in states most affected by HIV can improve testing patterns, but other mechanisms likely influence which racial/ethnic groups undergo testing.
A review of selection-based tests of abiotic surrogates for species representation.
Beier, Paul; Sutcliffe, Patricia; Hjort, Jan; Faith, Daniel P; Pressey, Robert L; Albuquerque, Fabio
2015-06-01
Because conservation planners typically lack data on where species occur, environmental surrogates--including geophysical settings and climate types--have been used to prioritize sites within a planning area. We reviewed 622 evaluations of the effectiveness of abiotic surrogates in representing species in 19 study areas. Sites selected using abiotic surrogates represented more species than an equal number of randomly selected sites in 43% of tests (55% for plants) and on average improved on random selection of sites by about 8% (21% for plants). Environmental diversity (ED) (42% median improvement on random selection) and biotically informed clusters showed promising results and merit additional testing. We suggest 4 ways to improve performance of abiotic surrogates. First, analysts should consider a broad spectrum of candidate variables to define surrogates, including rarely used variables related to geographic separation, distance from coast, hydrology, and within-site abiotic diversity. Second, abiotic surrogates should be defined at fine thematic resolution. Third, sites (the landscape units prioritized within a planning area) should be small enough to ensure that surrogates reflect species' environments and to produce prioritizations that match the spatial resolution of conservation decisions. Fourth, if species inventories are available for some planning units, planners should define surrogates based on the abiotic variables that most influence species turnover in the planning area. Although species inventories increase the cost of using abiotic surrogates, a modest number of inventories could provide the data needed to select variables and evaluate surrogates. Additional tests of nonclimate abiotic surrogates are needed to evaluate the utility of conserving nature's stage as a strategy for conservation planning in the face of climate change. © 2015 Society for Conservation Biology.
Fradgley, Elizabeth A; Paul, Christine L; Bryant, Jamie; Roos, Ian A; Henskens, Frans A; Paul, David J
2014-12-19
With increasing attention given to the quality of chronic disease care, a measurement approach that empowers consumers to participate in improving quality of care and enables health services to systematically introduce patient-centered initiatives is needed. A Web-based survey with complex adaptive questioning and interactive survey items would allow consumers to easily identify and prioritize detailed service initiatives. The aim was to develop and test a Web-based survey capable of identifying and prioritizing patient-centered initiatives in chronic disease outpatient services. Testing included (1) test-retest reliability, (2) patient-perceived acceptability of the survey content and delivery mode, and (3) average completion time, completion rates, and Flesch-Kincaid reading score. In Phase I, the Web-based Consumer Preferences Survey was developed based on a structured literature review and iterative feedback from expert groups of service providers and consumers. The touchscreen survey contained 23 general initiatives, 110 specific initiatives available through adaptive questioning, and a relative prioritization exercise. In Phase II, a pilot study was conducted within 4 outpatient clinics to evaluate the reliability properties, patient-perceived acceptability, and feasibility of the survey. Eligible participants were approached to complete the survey while waiting for an appointment or receiving intravenous therapy. The age and gender of nonconsenters was estimated to ascertain consent bias. Participants with a subsequent appointment within 14 days were asked to complete the survey for a second time. A total of 741 of 1042 individuals consented to participate (71.11% consent), 529 of 741 completed all survey content (78.9% completion), and 39 of 68 completed the test-retest component. Substantial or moderate reliability (Cohen's kappa>0.4) was reported for 16 of 20 general initiatives with observed percentage agreement ranging from 82.1%-100.0%. The majority of participants indicated the Web-based survey was easy to complete (97.9%, 531/543) and comprehensive (93.1%, 505/543). Participants also reported the interactive relative prioritization exercise was easy to complete (97.0%, 189/195) and helped them to decide which initiatives were of most importance (84.6%, 165/195). Average completion time was 8.54 minutes (SD 3.91) and the Flesch-Kincaid reading level was 6.8. Overall, 84.6% (447/529) of participants indicated a willingness to complete a similar survey again. The Web-based Consumer Preferences Survey is sufficiently reliable and highly acceptable to patients. Based on completion times and reading level, this tool could be integrated in routine clinical practice and allows consumers to easily participate in quality evaluation. Results provide a comprehensive list of patient-prioritized initiatives for patients with major chronic conditions and delivers practice-ready evidence to guide improvements in patient-centered care.
SARP: a value-based approach to hospice admissions triage.
MacDonald, D
1995-01-01
As hospices become established and case referrals increase, many programs are faced with the necessity of instituting waiting lists. Prioritizing cases for order of admission requires a triage method that is rational, fair, and consistent. This article describes the SARP method of hospice admissions triage, which evaluates prospective cases according to seniority, acuity, risk, and political significance. SARP's essential features, operative assumptions, advantages, and limitations are discussed, as well as the core hospice values which underlie its use. The article concludes with a call for trial and evaluation of SARP in other hospice settings.
TinyOS-based quality of service management in wireless sensor networks
Peterson, N.; Anusuya-Rangappa, L.; Shirazi, B.A.; Huang, R.; Song, W.-Z.; Miceli, M.; McBride, D.; Hurson, A.; LaHusen, R.
2009-01-01
Previously the cost and extremely limited capabilities of sensors prohibited Quality of Service (QoS) implementations in wireless sensor networks. With advances in technology, sensors are becoming significantly less expensive and the increases in computational and storage capabilities are opening the door for new, sophisticated algorithms to be implemented. Newer sensor network applications require higher data rates with more stringent priority requirements. We introduce a dynamic scheduling algorithm to improve bandwidth for high priority data in sensor networks, called Tiny-DWFQ. Our Tiny-Dynamic Weighted Fair Queuing scheduling algorithm allows for dynamic QoS for prioritized communications by continually adjusting the treatment of communication packages according to their priorities and the current level of network congestion. For performance evaluation, we tested Tiny-DWFQ, Tiny-WFQ (traditional WFQ algorithm implemented in TinyOS), and FIFO queues on an Imote2-based wireless sensor network and report their throughput and packet loss. Our results show that Tiny-DWFQ performs better in all test cases. © 2009 IEEE.
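The general idea of dynamically weighted fair queuing can be sketched as below; this is not the published Tiny-DWFQ algorithm, only a simplified virtual-finish-time scheduler whose queue weights are rescaled by priority and current congestion. Queue contents, weights, and the nominal capacity are invented.

```python
# Simplified dynamically weighted fair queuing: the queue with the earliest
# virtual finish time is served, and high-priority weight grows with congestion.
from collections import deque

queues = {                       # priority class -> FIFO of packets
    "high": deque(["h1", "h2", "h3"]),
    "low": deque(["l1", "l2", "l3", "l4", "l5"]),
}
base_weight = {"high": 3.0, "low": 1.0}
PACKET_SIZE = 1.0

def dynamic_weight(name, congestion):
    # Give extra weight to high-priority traffic as the network gets congested.
    return base_weight[name] * (1.0 + congestion if name == "high" else 1.0)

finish = {n: 0.0 for n in queues}    # virtual finish time per queue
served = []
while any(queues.values()):
    congestion = sum(len(q) for q in queues.values()) / 8.0   # 8 = nominal capacity
    # Candidate finish time of the head packet of each non-empty queue.
    candidates = {
        n: finish[n] + PACKET_SIZE / dynamic_weight(n, congestion)
        for n, q in queues.items() if q
    }
    pick = min(candidates, key=candidates.get)   # WFQ rule: earliest virtual finish
    finish[pick] = candidates[pick]
    served.append(queues[pick].popleft())

print(served)   # high-priority packets are dequeued earlier than low-priority ones
```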
Artificial intelligence for the CTA Observatory scheduler
NASA Astrophysics Data System (ADS)
Colomé, Josep; Colomer, Pau; Campreciós, Jordi; Coiffard, Thierry; de Oña, Emma; Pedaletti, Giovanna; Torres, Diego F.; Garcia-Piquer, Alvaro
2014-08-01
The Cherenkov Telescope Array (CTA) project will be the next generation ground-based very high energy gamma-ray instrument. The success of the precursor projects (i.e., HESS, MAGIC, VERITAS) motivated the construction of this large infrastructure that has been included in the roadmap of the ESFRI projects since 2008. CTA is planned to start the construction phase in 2015 and will consist of two arrays of Cherenkov telescopes operated as a proposal-driven open observatory. Two sites are foreseen at the southern and northern hemispheres. The CTA observatory will handle several observation modes and will have to operate tens of telescopes with a highly efficient and reliable control. Thus, the CTA planning tool is a key element in the control layer for the optimization of the observatory time. The main purpose of the scheduler for CTA is the allocation of multiple tasks to one single array or to multiple sub-arrays of telescopes, while maximizing the scientific return of the facility and minimizing the operational costs. The scheduler considers long- and short-term varying conditions to optimize the prioritization of tasks. A short-term scheduler provides the system with the capability to adapt, in almost real-time, the selected task to the varying execution constraints (i.e., Targets of Opportunity, health or status of the system components, environment conditions). The scheduling procedure ensures that long-term planning decisions are correctly transferred to the short-term prioritization process for a suitable selection of the next task to execute on the array. In this contribution we present the constraints to CTA task scheduling that helped classify it as a Flexible Job-Shop Problem case and find its optimal solution based on Artificial Intelligence techniques. We describe the scheduler prototype that uses a Guarded Discrete Stochastic Neural Network (GDSN), for an easy representation of the possible long- and short-term planning solutions, and Constraint Propagation techniques. A simulation platform, an analysis tool and different test case scenarios for CTA were developed to test the performance of the scheduler and are also described.
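The hand-off from long-term priorities to short-term task selection can be illustrated with a minimal sketch: filter tasks whose execution constraints are currently satisfied, then pick the highest long-term priority among them. This is only an illustration of the idea, not the GDSN-based scheduler; the task names, constraints, and conditions are invented.

```python
# Toy short-term task selection: keep tasks whose constraints are satisfied
# right now, then take the one with the highest long-term priority.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    priority: float          # assigned by the long-term planner
    min_elevation: float     # degrees above horizon required
    needs_dark_sky: bool

def runnable(task, elevation_now, moon_up):
    return elevation_now >= task.min_elevation and not (task.needs_dark_sky and moon_up)

tasks = [
    Task("AGN monitoring", 0.7, 30.0, False),
    Task("GRB follow-up (ToO)", 0.95, 20.0, True),
    Task("Galactic survey", 0.5, 40.0, False),
]

# Current conditions for one (hypothetical) sub-array.
elevation = {"AGN monitoring": 45.0, "GRB follow-up (ToO)": 25.0, "Galactic survey": 35.0}
moon_up = True

candidates = [t for t in tasks if runnable(t, elevation[t.name], moon_up)]
next_task = max(candidates, key=lambda t: t.priority) if candidates else None
print("next task:", next_task.name if next_task else "idle")
```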
Text Mining in Cancer Gene and Pathway Prioritization
Luo, Yuan; Riedlinger, Gregory; Szolovits, Peter
2014-01-01
Prioritization of cancer implicated genes has received growing attention as an effective way to reduce wet lab cost by computational analysis that ranks candidate genes according to the likelihood that experimental verifications will succeed. A multitude of gene prioritization tools have been developed, each integrating different data sources covering gene sequences, differential expressions, function annotations, gene regulations, protein domains, protein interactions, and pathways. This review places existing gene prioritization tools against the backdrop of an integrative Omic hierarchy view toward cancer and focuses on the analysis of their text mining components. We explain the relatively slow progress of text mining in gene prioritization, identify several challenges to current text mining methods, and highlight a few directions where more effective text mining algorithms may improve the overall prioritization task and where prioritizing the pathways may be more desirable than prioritizing only genes. PMID:25392685
Text mining in cancer gene and pathway prioritization.
Luo, Yuan; Riedlinger, Gregory; Szolovits, Peter
2014-01-01
Prioritization of cancer implicated genes has received growing attention as an effective way to reduce wet lab cost by computational analysis that ranks candidate genes according to the likelihood that experimental verifications will succeed. A multitude of gene prioritization tools have been developed, each integrating different data sources covering gene sequences, differential expressions, function annotations, gene regulations, protein domains, protein interactions, and pathways. This review places existing gene prioritization tools against the backdrop of an integrative Omic hierarchy view toward cancer and focuses on the analysis of their text mining components. We explain the relatively slow progress of text mining in gene prioritization, identify several challenges to current text mining methods, and highlight a few directions where more effective text mining algorithms may improve the overall prioritization task and where prioritizing the pathways may be more desirable than prioritizing only genes.
Marketing Principles in Higher Education.
ERIC Educational Resources Information Center
Moisan, Leonard J.
1987-01-01
Ten marketing concepts to consider are discussed including: assigning a protagonist, understanding of institutional mission, testing purpose against constituencies, planning effectiveness, positioning, politics, prioritizing, promotion and publicity, choosing representatives with a sense of passion, and performance or delivery of promised…
Combinatorial QSAR Modeling of Rat Acute Toxicity by Oral Exposure
Quantitative Structure-Activity Relationship (QSAR) toxicity models have become popular tools for identifying potential toxic compounds and prioritizing candidates for animal toxicity tests. However, few QSAR studies have successfully modeled large, diverse mammalian toxicity end...
Fernández, Alberto; Rallo, Robert; Giralt, Francesc
2015-10-01
Ready biodegradability is a key property for evaluating the long-term effects of chemicals on the environment and human health. As such, it is used as a screening test for the assessment of persistent, bioaccumulative and toxic substances. Regulators encourage the use of non-testing methods, such as in silico models, to save money and time. A dataset of 757 chemicals was collected to assess the performance of four freely available in silico models that predict ready biodegradability. They were applied to develop a new consensus method that prioritizes the use of each individual model according to its performance on chemical subsets driven by the presence or absence of different molecular descriptors. This consensus method was capable of almost eliminating unpredictable chemicals, while the performance of combined models was substantially improved with respect to that of the individual models. Copyright © 2015 Elsevier Inc. All rights reserved.
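A sketch of the consensus idea described above follows: for each chemical, the prediction of the individual model with the best validated performance on the descriptor-defined subset the chemical belongs to is used. The model names, subset accuracies, and descriptor are invented for illustration only.

```python
# Illustrative consensus by prioritizing individual models per chemical subset:
# trust the model with the best accuracy on the subset the chemical falls into.
subset_accuracy = {
    # (model, has_aromatic_ring): accuracy measured on that subset (invented)
    ("model_A", True): 0.81, ("model_A", False): 0.70,
    ("model_B", True): 0.74, ("model_B", False): 0.86,
}

def consensus_predict(chemical, predictions):
    """predictions: {model: 'RB'/'NRB'/None}; None marks an unpredictable chemical."""
    usable = {m: p for m, p in predictions.items() if p is not None}
    if not usable:
        return None
    best = max(usable, key=lambda m: subset_accuracy[(m, chemical["aromatic"])])
    return usable[best]

chem = {"name": "chem-1", "aromatic": False}
print(consensus_predict(chem, {"model_A": "RB", "model_B": "NRB"}))  # -> 'NRB'
```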
A methodology to enhance electromagnetic compatibility in joint military operations
NASA Astrophysics Data System (ADS)
Buckellew, William R.
The development and validation of an improved methodology to identify, characterize, and prioritize potential joint EMI (electromagnetic interference) interactions and identify and develop solutions to reduce the effects of the interference are discussed. The methodology identifies potential EMI problems using results from field operations, historical data bases, and analytical modeling. Operational expertise, engineering analysis, and testing are used to characterize and prioritize the potential EMI problems. Results can be used to resolve potential EMI during the development and acquisition of new systems and to develop engineering fixes and operational workarounds for systems already employed. The analytic modeling portion of the methodology is a predictive process that uses progressive refinement of the analysis and the operational electronic environment to eliminate noninterfering equipment pairs, defer further analysis on pairs lacking operational significance, and resolve the remaining EMI problems. Tests are conducted on equipment pairs to ensure that the analytical models provide a realistic description of the predicted interference.
Cattarino, Lorenzo; Hermoso, Virgilio; Carwardine, Josie; Kennard, Mark J.; Linke, Simon
2015-01-01
Planning for the remediation of multiple threats is crucial to ensure the long term persistence of biodiversity. Limited conservation budgets require prioritizing which management actions to implement and where. Systematic conservation planning traditionally assumes that all the threats in priority sites are abated (fixed prioritization approach). However, abating only the threats affecting the species of conservation concerns may be more cost-effective. This requires prioritizing individual actions independently within the same site (independent prioritization approach), which has received limited attention so far. We developed an action prioritization algorithm that prioritizes multiple alternative actions within the same site. We used simulated annealing to find the combination of actions that remediate threats to species at the minimum cost. Our algorithm also accounts for the importance of selecting actions in sites connected through the river network (i.e., connectivity). We applied our algorithm to prioritize actions to address threats to freshwater fish species in the Mitchell River catchment, northern Australia. We compared how the efficiency of the independent and fixed prioritization approach varied as the importance of connectivity increased. Our independent prioritization approach delivered more efficient solutions than the fixed prioritization approach, particularly when the importance of achieving connectivity was high. By spatially prioritizing the specific actions necessary to remediate the threats affecting the target species, our approach can aid cost-effective habitat restoration and land-use planning. It is also particularly suited to solving resource allocation problems, where consideration of spatial design is important, such as prioritizing conservation efforts for highly mobile species, species facing climate change-driven range shifts, or minimizing the risk of threats spreading across different realms. PMID:26020794
Cattarino, Lorenzo; Hermoso, Virgilio; Carwardine, Josie; Kennard, Mark J; Linke, Simon
2015-01-01
Planning for the remediation of multiple threats is crucial to ensure the long term persistence of biodiversity. Limited conservation budgets require prioritizing which management actions to implement and where. Systematic conservation planning traditionally assumes that all the threats in priority sites are abated (fixed prioritization approach). However, abating only the threats affecting the species of conservation concerns may be more cost-effective. This requires prioritizing individual actions independently within the same site (independent prioritization approach), which has received limited attention so far. We developed an action prioritization algorithm that prioritizes multiple alternative actions within the same site. We used simulated annealing to find the combination of actions that remediate threats to species at the minimum cost. Our algorithm also accounts for the importance of selecting actions in sites connected through the river network (i.e., connectivity). We applied our algorithm to prioritize actions to address threats to freshwater fish species in the Mitchell River catchment, northern Australia. We compared how the efficiency of the independent and fixed prioritization approach varied as the importance of connectivity increased. Our independent prioritization approach delivered more efficient solutions than the fixed prioritization approach, particularly when the importance of achieving connectivity was high. By spatially prioritizing the specific actions necessary to remediate the threats affecting the target species, our approach can aid cost-effective habitat restoration and land-use planning. It is also particularly suited to solving resource allocation problems, where consideration of spatial design is important, such as prioritizing conservation efforts for highly mobile species, species facing climate change-driven range shifts, or minimizing the risk of threats spreading across different realms.
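The core optimization step, simulated annealing over binary action choices, can be sketched as below. The sites, costs, species-threat links, and penalty are invented, and the connectivity term used in the study is omitted for brevity.

```python
# Minimal simulated annealing over threat-abatement actions: find a cheap set
# of actions that still covers every target species.
import math
import random

random.seed(1)
actions = ["A1", "A2", "A3", "A4"]           # candidate actions (site x threat)
cost = {"A1": 3.0, "A2": 1.0, "A3": 2.5, "A4": 1.5}
covers = {                                    # species protected if action taken
    "A1": {"sp1", "sp2"}, "A2": {"sp2"}, "A3": {"sp3"}, "A4": {"sp1", "sp3"},
}
species = {"sp1", "sp2", "sp3"}
PENALTY = 10.0                                # cost of leaving a species uncovered

def objective(selected):
    covered = set().union(*(covers[a] for a in selected)) if selected else set()
    return sum(cost[a] for a in selected) + PENALTY * len(species - covered)

state = set(actions)                          # start with every action selected
temp = 5.0
for _ in range(2000):
    candidate = set(state)
    flip = random.choice(actions)             # add or drop one action
    candidate.symmetric_difference_update({flip})
    delta = objective(candidate) - objective(state)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        state = candidate
    temp *= 0.995                             # geometric cooling

print(sorted(state), objective(state))
```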
Mejía, Vilma; Gonzalez, Carlos; Delfino, Alejandro E; Altermatt, Fernando R; Corvetto, Marcia A
The primary purpose of this study was to compare the effect of high-fidelity simulation versus computer-based case-solving self-study on the acquisition of malignant hyperthermia management skills by first-year anesthesiology residents. After institutional ethics committee approval, 31 first-year anesthesiology residents were enrolled in this prospective randomized single-blinded study. Participants were randomized to either a high-fidelity simulation scenario or a computer-based case study about malignant hyperthermia. After the intervention, all subjects' performance was assessed through a high-fidelity simulation scenario using a previously validated assessment rubric. Additionally, knowledge tests and a satisfaction survey were applied. Finally, a semi-structured interview was conducted to assess self-perception of reasoning and decision-making. Twenty-eight first-year residents successfully finished the study. Residents' management skill scores were globally higher with high-fidelity simulation than with the case study, although the differences were significant in only 4 of the 8 performance rubric elements: recognizing signs and symptoms (p = 0.025), prioritization of initial management actions (p = 0.003), recognizing complications (p = 0.025), and communication (p = 0.025). Average scores from pre- and post-test knowledge questionnaires improved from 74% to 85% in the high-fidelity simulation group and decreased from 78% to 75% in the case study group (p = 0.032). Regarding the qualitative analysis, there was no difference between the two teaching strategies in factors influencing the students' reasoning and decision-making. Simulation-based training with a malignant hyperthermia high-fidelity scenario was superior to computer-based case study, improving knowledge and skills in malignant hyperthermia crisis management, with a very good satisfaction level among anesthesia residents. Copyright © 2018 Sociedade Brasileira de Anestesiologia. Published by Elsevier Editora Ltda. All rights reserved.
Transcriptomic Profiling and Functional Characterization of Fusion Genes in Recurrent Ovarian Cancer
2017-09-01
...the enhanced malignancy observed in recurrent disease. In the first year of this proposal we have assembled a cohort of 18 patient-matched pairs of... significance and biologic function of prioritized RNA fusion events. ...cellularity. 19 cases were identified (Table 1) but one was removed for quality control issues, thus leaving a total of 18 cases. Table 1 shows the clinical...
Recent Transmission of Tuberculosis - United States, 2011-2014.
Yuen, Courtney M; Kammerer, J Steve; Marks, Kala; Navin, Thomas R; France, Anne Marie
2016-01-01
Tuberculosis is an infectious disease that may result from recent transmission or from an infection acquired many years in the past; there is no diagnostic test to distinguish the two causes. Cases resulting from recent transmission are particularly concerning from a public health standpoint. To describe recent tuberculosis transmission in the United States, we used a field-validated plausible source-case method to estimate cases likely resulting from recent transmission during January 2011-September 2014. We classified cases as resulting from either limited or extensive recent transmission based on transmission cluster size. We used logistic regression to analyze patient characteristics associated with recent transmission. Of 26,586 genotyped cases, 14% were attributable to recent transmission, 39% of which were attributable to extensive recent transmission. The burden of cases attributed to recent transmission was geographically heterogeneous and poorly predicted by tuberculosis incidence. Extensive recent transmission was positively associated with American Indian/Alaska Native (adjusted prevalence ratio [aPR] = 3.6, 95% confidence interval [CI] 2.9-4.4), Native Hawaiian/Pacific Islander (aPR = 3.2, 95% CI 2.3-4.5), and black (aPR = 3.0, 95% CI 2.6-3.5) race, and homelessness (aPR = 2.3, 95% CI 2.0-2.5). Extensive recent transmission was negatively associated with foreign birth (aPR = 0.2, 95% CI 0.2-0.2). Tuberculosis control efforts should prioritize reducing transmission among higher-risk populations.
Prioritization for Plastic Surgery Procedures Aimed to Improve Quality of Life: Moral Considerations
Kolby, Lars; Elander, Anna
2017-01-01
Background: Different health conditions are treated in a plastic surgery unit, including cases whose main goal is to help patients feel better and integrate into society, thereby improving quality of life rather than restoring physical function. Methods: We discuss moral principles that can guide health professionals in revising and creating policies for plastic surgery patients presenting with non–life-threatening conditions. Results: A specific anatomical feature is not always an indicator of a patient's well-being and quality of life, and therefore it cannot be used as the sole parameter to identify the worst-off and prioritize the provision of health care. A policy should identify who are the worst-off preoperatively and arrive at some plausible measure of how much they can be expected to benefit from an operation. Policies that do not track these principles in any reliable way can cause discrimination. Conclusions: A patient-centered operating system and patients' informed preferences might be incorporated into the process of prioritizing health care. In circumstances where the effectiveness of a specific treatment is unproven, professionals should not make assumptions based on their own values. PMID:28894658
Prioritizing equipment for replacement.
Capuano, Mike
2010-01-01
It is suggested that clinical engineers take the lead in formulating evaluation processes to recommend equipment replacement. Their skill, knowledge, and experience, combined with access to equipment databases, make them a logical choice. Based on ideas from Fennigkoh's scheme, elements such as age, vendor support, accumulated maintenance cost, and function/risk were used. Other more subjective criteria such as cost benefits and efficacy of newer technology were not used. The element of downtime was also omitted because the data were not available. The resulting Periop Master Equipment List and its rationale were presented to the Perioperative Services Program Council. They deemed the criteria to be robust and provided overwhelming acceptance of the list. It was quickly put to use to estimate required capital funding, justify items already thought to need replacement, and identify high-priority ranked items for replacement. Incorporating prioritization criteria into an existing equipment database would be ideal. Some commercially available systems do have the basic elements of this. Maintaining replacement data can be labor-intensive regardless of the method used. There is usually little time to perform the tasks necessary for prioritizing equipment. However, where appropriate, a clinical engineering department might be able to conduct such an exercise as shown in the following case study.
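A weighted score built from elements like those named above (age, vendor support, accumulated maintenance cost, function/risk) could be computed as in the sketch below; the weights, normalization scales, and example devices are invented and do not reproduce the published scheme.

```python
# Illustrative replacement-priority score from age, vendor support,
# accumulated maintenance cost and function/risk. All numbers are invented.
WEIGHTS = {"age": 0.3, "support": 0.2, "maintenance": 0.3, "function_risk": 0.2}

def priority_score(device):
    # Each element normalized to 0-1; higher means a stronger case for replacement.
    parts = {
        "age": min(device["age_years"] / device["expected_life"], 1.0),
        "support": 1.0 if not device["vendor_supported"] else 0.0,
        "maintenance": min(device["maint_cost"] / device["purchase_cost"], 1.0),
        "function_risk": device["function_risk"] / 10.0,   # assumed 1-10 scale
    }
    return sum(WEIGHTS[k] * parts[k] for k in WEIGHTS)

fleet = [
    {"name": "anesthesia machine", "age_years": 12, "expected_life": 10,
     "vendor_supported": False, "maint_cost": 18000, "purchase_cost": 40000,
     "function_risk": 9},
    {"name": "infusion pump", "age_years": 4, "expected_life": 8,
     "vendor_supported": True, "maint_cost": 500, "purchase_cost": 3000,
     "function_risk": 6},
]
for d in sorted(fleet, key=priority_score, reverse=True):
    print(f"{d['name']}: {priority_score(d):.2f}")
```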
Ho, Cheng-I; Lin, Min-Der; Lo, Shang-Lien
2010-07-01
A methodology based on the integration of a seismic-based artificial neural network (ANN) model and a geographic information system (GIS) to assess water leakage and to prioritize pipeline replacement is developed in this work. Qualified pipeline break-event data derived from the Taiwan Water Corporation Pipeline Leakage Repair Management System were analyzed. "Pipe diameter," "pipe material," and "the number of magnitude-3(+) earthquakes" were employed as the input factors of ANN, while "the number of monthly breaks" was used for the prediction output. This study is the first attempt to manipulate earthquake data in the break-event ANN prediction model. Spatial distribution of the pipeline break-event data was analyzed and visualized by GIS. Through this, the users can swiftly figure out the hotspots of the leakage areas. A northeastern township in Taiwan, frequently affected by earthquakes, is chosen as the case study. Compared to the traditional processes for determining the priorities of pipeline replacement, the methodology developed is more effective and efficient. Likewise, the methodology can overcome the difficulty of prioritizing pipeline replacement even in situations where the break-event records are unavailable.
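The ANN set-up described above could look roughly like the following scikit-learn sketch, trained on synthetic data with the three stated inputs and monthly breaks as the target; the data, network size, and coefficients are invented and do not represent the trained model from the paper.

```python
# Sketch of a small neural network predicting monthly pipe breaks from pipe
# diameter, pipe material and magnitude-3+ earthquake counts (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 200
diameter = rng.choice([100, 150, 200, 300], size=n)          # mm
material = rng.choice([0, 1], size=n)                        # 0 = PVC, 1 = cast iron
quakes = rng.poisson(1.0, size=n)                            # magnitude-3+ events/month
# Synthetic ground truth: cast iron, quakes and small diameters raise break counts.
breaks = (0.5 + 2.0 * material + 0.8 * quakes
          + 0.002 * (300 - diameter) + rng.normal(0, 0.2, size=n))

X = np.column_stack([diameter, material, quakes])
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
model.fit(X, breaks)

# Predict monthly breaks for a 150 mm cast-iron pipe after two magnitude-3+ quakes.
print(model.predict([[150, 1, 2]]))
```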
Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.; ...
2016-06-09
Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
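A unit test in such a library might resemble the pytest-style sketch below, which checks mass conservation in a toy separations model; the facility model, function name, and numbers are invented purely to illustrate the structure of such a test.

```python
# Example of the kind of unit test such a library could include: a mass-balance
# check on a toy separation facility (all values and the model are invented).
def separate(feed_kg, product_fraction):
    """Toy separations model: split a feed stream into product and waste."""
    product = feed_kg * product_fraction
    waste = feed_kg - product
    return product, waste

def test_mass_balance_is_conserved():
    feed = 1000.0
    product, waste = separate(feed, product_fraction=0.85)
    assert abs((product + waste) - feed) < 1e-9

def test_no_negative_streams():
    product, waste = separate(500.0, product_fraction=0.85)
    assert product >= 0.0 and waste >= 0.0

if __name__ == "__main__":
    test_mass_balance_is_conserved()
    test_no_negative_streams()
    print("all checks passed")
```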
Community Building as an Instructional Goal in Japanese Adult Basic Education
ERIC Educational Resources Information Center
Jacobson, Erik
2009-01-01
This paper presents the results of a multisite case study of adult basic education in Japan. A key finding of the study is that as part of community building within classrooms, students, teachers, and administrators prioritize human relations and expressions of empathy rather than academic skill development. In contrast to Japanese educational…
Girls First! Promoting Early Education in Tibetan Areas of China, a Case Study
ERIC Educational Resources Information Center
Seeberg, Vilma
2008-01-01
This study explores conditions promoting girls' education in ethnic Tibetan pastoral highlands of the Qinghai-Tibet plateau of western China. Global discourse and cross national evidence on the transgenerational benefits of girls' education has shown prioritizing girls' education to be the most effective strategy of breaking the vicious cycle of…
Explaining Differing Perceptions of Employees' Skill Needs: The Case of Garment Workers in Ethiopia
ERIC Educational Resources Information Center
Yamada, Shoko; Otchia, Christian S.; Taniguchi, Kyoko
2018-01-01
The Ethiopian economy has grown significantly and the government has prioritized industrial skills development and expanded technical and vocational education and training (TVET). However, mismatches between the skills available and the skills required are widespread and the unemployment rate for TVET graduates is high. Little scholarly effort has…
Validating the Octave Allegro Information Systems Risk Assessment Methodology: A Case Study
ERIC Educational Resources Information Center
Keating, Corland G.
2014-01-01
An information system (IS) risk assessment is an important part of any successful security management strategy. Risk assessments help organizations to identify mission-critical IS assets and prioritize risk mitigation efforts. Many risk assessment methodologies, however, are complex and can only be completed successfully by highly qualified and…
Setting the Agenda for Children. California Report Card, 2010
ERIC Educational Resources Information Center
Children Now, 2010
2010-01-01
Throughout history, societal investments in children have resulted in increased prosperity for individuals, communities, states and nations. This proved to be the case for California in the 1950s and 1960s, when the state strongly supported children's futures. Despite once following this path to prosperity, California has de-prioritized children…
Implementing Wireless Mobile Instructional Labs: Planning Issues and Case Study
ERIC Educational Resources Information Center
McKimmy, Paul B.
2005-01-01
In April 2002, the Technology Advisory Committee of the University of Hawaii-Manoa College of Education (COE) prioritized the upgrade of existing instructional computer labs. Following several weeks of research and discussion, a decision was made to support wireless and mobile technologies during the upgrade. In June 2002, the first of three…
The Vocational Skills Gap for Management Accountants: The Stakeholders' Perspectives.
ERIC Educational Resources Information Center
Hassall, Trevor; Joyce, John; Montano, Jose Luis Arquero; Anes, Jose Antonio Donoso
2003-01-01
Develops a case study that examines the process of the professional education and training of management accountants. Identifies the relative importance of a specified range of vocational skills needed for a chartered management accountant, and prioritizes areas for development and training based on opinions of employers and students. (Author/LRW)
MELODI: Mining Enriched Literature Objects to Derive Intermediates
Elsworth, Benjamin; Dawe, Karen; Vincent, Emma E; Langdon, Ryan; Lynch, Brigid M; Martin, Richard M; Relton, Caroline; Higgins, Julian P T; Gaunt, Tom R
2018-01-01
Background: The scientific literature contains a wealth of information from different fields on potential disease mechanisms. However, identifying and prioritizing mechanisms for further analytical evaluation presents enormous challenges in terms of the quantity and diversity of published research. The application of data mining approaches to the literature offers the potential to identify and prioritize mechanisms for more focused and detailed analysis. Methods: Here we present MELODI, a literature mining platform that can identify mechanistic pathways between any two biomedical concepts. Results: Two case studies demonstrate the potential uses of MELODI and how it can generate hypotheses for further investigation. First, an analysis of the ETS-related gene ERG and prostate cancer derives the intermediate transcription factor SP1, recently confirmed to be physically interacting with ERG. Second, examining the relationship between a new potential risk factor and pancreatic cancer identifies possible mechanistic insights which can be studied in vitro. Conclusions: We have demonstrated the possible applications of MELODI, including two case studies. MELODI has been implemented as a Python/Django web application, and is freely available to use at [www.melodi.biocompute.org.uk]. PMID:29342271
Reasons for operation cancellations at a teaching hospital: prioritizing areas of improvement.
Abeeleh, Mahmoud Abu; Tareef, Tareq M; Hani, Amjad Bani; Albsoul, Nader; Samarah, Omar Q; ElMohtaseb, M S; Alshehabat, Musa; Ismail, Zuhair Bani; Alnoubani, Omar; Obeidat, Salameh S; Halawa, Sami Abu
2017-08-01
To report rates of and reasons for operation cancellation, and to prioritize areas of improvement. Retrospective data were extracted from the monthly reports of cancelled listed operations. Data on 14 theatres were collected by the office of quality assurance at Jordan University Hospital from August 2012 to April 2016. Rates and reasons for operation cancellation were investigated. A Pareto chart was constructed to identify the reasons of highest priority. During the period of study, 6,431 cases (9.31%) were cancelled out of 69,066 listed cases. Patient no-shows accounted for 62.52% of cancellations. A Pareto analysis showed that around 80% of the known reasons for cancellation after admission were due to a lack of surgical theatre time (30%), incomplete preoperative assessment (21%), upper respiratory tract infection (19%), and high blood pressure (13%). This study identified the most common reasons for operation cancellation at a teaching hospital. Potential avoidable root causes and recommended interventions were suggested accordingly. Future research, available resources, hospital policies, and strategic measures directed to tackle these reasons should take priority.
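The Pareto step used above can be sketched as follows: sort cancellation reasons by frequency and accumulate until roughly 80% of cancellations are covered. The counts below are invented for illustration; the published percentages are those quoted in the abstract.

```python
# Minimal Pareto analysis of cancellation reasons (counts are invented).
reasons = {
    "lack of theatre time": 300,
    "incomplete preoperative assessment": 210,
    "upper respiratory tract infection": 190,
    "high blood pressure": 130,
    "other": 170,
}
total = sum(reasons.values())
cumulative = 0.0
priority = []
for reason, count in sorted(reasons.items(), key=lambda kv: -kv[1]):
    cumulative += count / total
    priority.append((reason, round(cumulative * 100, 1)))
    if cumulative >= 0.8:          # stop once ~80% of cancellations are explained
        break

for reason, cum_pct in priority:
    print(f"{reason}: cumulative {cum_pct}%")
```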
Reif, David M; Sypa, Myroslav; Lock, Eric F; Wright, Fred A; Wilson, Ander; Cathey, Tommy; Judson, Richard R; Rusyn, Ivan
2013-02-01
Scientists and regulators are often faced with complex decisions, where use of scarce resources must be prioritized using collections of diverse information. The Toxicological Prioritization Index (ToxPi™) was developed to enable integration of multiple sources of evidence on exposure and/or safety, transformed into transparent visual rankings to facilitate decision making. The rankings and associated graphical profiles can be used to prioritize resources in various decision contexts, such as testing chemical toxicity or assessing similarity of predicted compound bioactivity profiles. The amount and types of information available to decision makers are increasing exponentially, while the complex decisions must rely on specialized domain knowledge across multiple criteria of varying importance. Thus, the ToxPi bridges a gap, combining rigorous aggregation of evidence with ease of communication to stakeholders. An interactive ToxPi graphical user interface (GUI) application has been implemented to allow straightforward decision support across a variety of decision-making contexts in environmental health. The GUI allows users to easily import and recombine data, then analyze, visualize, highlight, export and communicate ToxPi results. It also provides a statistical metric of stability for both individual ToxPi scores and relative prioritized ranks. The ToxPi GUI application, complete user manual and example data files are freely available from http://comptox.unc.edu/toxpi.php.
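The aggregation behind a ToxPi-style ranking can be sketched as a weight-normalized sum of 0-1 scaled evidence "slices"; the slice names, weights, and chemical values below are invented and this is not the ToxPi application itself.

```python
# Sketch of a ToxPi-style score: each slice is a 0-1 normalized summary of one
# evidence source, and the overall score is the weight-normalized sum of slices.
import numpy as np

slices = ["exposure potential", "in vitro bioactivity", "persistence"]
weights = np.array([2.0, 3.0, 1.0])

chemicals = {
    "chem A": np.array([0.9, 0.4, 0.2]),
    "chem B": np.array([0.3, 0.8, 0.7]),
    "chem C": np.array([0.1, 0.2, 0.9]),
}

def toxpi_score(slice_values):
    return float(np.dot(weights, slice_values) / weights.sum())

ranked = sorted(chemicals, key=lambda c: -toxpi_score(chemicals[c]))
for chem in ranked:
    print(chem, round(toxpi_score(chemicals[chem]), 3))
```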
Endeavour update: a web resource for gene prioritization in multiple species
Tranchevent, Léon-Charles; Barriot, Roland; Yu, Shi; Van Vooren, Steven; Van Loo, Peter; Coessens, Bert; De Moor, Bart; Aerts, Stein; Moreau, Yves
2008-01-01
Endeavour (http://www.esat.kuleuven.be/endeavourweb; this web site is free and open to all users and there is no login requirement) is a web resource for the prioritization of candidate genes. Using a training set of genes known to be involved in a biological process of interest, our approach consists of (i) inferring several models (based on various genomic data sources), (ii) applying each model to the candidate genes to rank those candidates against the profile of the known genes and (iii) merging the several rankings into a global ranking of the candidate genes. In the present article, we describe the latest developments of Endeavour. First, we provide a web-based user interface, besides our Java client, to make Endeavour more universally accessible. Second, we support multiple species: in addition to Homo sapiens, we now provide gene prioritization for three major model organisms: Mus musculus, Rattus norvegicus and Caenorhabditis elegans. Third, Endeavour makes use of additional data sources and is now including numerous databases: ontologies and annotations, protein–protein interactions, cis-regulatory information, gene expression data sets, sequence information and text-mining data. We tested the novel version of Endeavour on 32 recent disease gene associations from the literature. Additionally, we describe a number of recent independent studies that made use of Endeavour to prioritize candidate genes for obesity and Type II diabetes, cleft lip and cleft palate, and pulmonary fibrosis. PMID:18508807
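The merging step (iii) can be illustrated with a simpler rank-fusion sketch: candidate genes ranked by several data sources are combined by their mean rank ratio. Endeavour itself merges rankings with order statistics; the gene names and per-source ranks here are invented.

```python
# Simple rank-fusion illustration (mean rank ratio across data sources).
candidates = ["geneA", "geneB", "geneC", "geneD"]
rankings = {                                   # per data source: best rank = 1
    "expression": {"geneA": 1, "geneB": 3, "geneC": 2, "geneD": 4},
    "interactions": {"geneA": 2, "geneB": 1, "geneC": 4, "geneD": 3},
    "text_mining": {"geneA": 1, "geneB": 4, "geneC": 3, "geneD": 2},
}

n = len(candidates)
def mean_rank_ratio(gene):
    # Average of the gene's rank divided by the number of candidates, per source.
    return sum(r[gene] / n for r in rankings.values()) / len(rankings)

for gene in sorted(candidates, key=mean_rank_ratio):
    print(gene, round(mean_rank_ratio(gene), 3))
```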
Prioritizing GWAS Results: A Review of Statistical Methods and Recommendations for Their Application
Cantor, Rita M.; Lange, Kenneth; Sinsheimer, Janet S.
2010-01-01
Genome-wide association studies (GWAS) have rapidly become a standard method for disease gene discovery. A substantial number of recent GWAS indicate that for most disorders, only a few common variants are implicated and the associated SNPs explain only a small fraction of the genetic risk. This review is written from the viewpoint that findings from the GWAS provide preliminary genetic information that is available for additional analysis by statistical procedures that accumulate evidence, and that these secondary analyses are very likely to provide valuable information that will help prioritize the strongest constellations of results. We review and discuss three analytic methods to combine preliminary GWAS statistics to identify genes, alleles, and pathways for deeper investigations. Meta-analysis seeks to pool information from multiple GWAS to increase the chances of finding true positives among the false positives and provides a way to combine associations across GWAS, even when the original data are unavailable. Testing for epistasis within a single GWAS study can identify the stronger results that are revealed when genes interact. Pathway analysis of GWAS results is used to prioritize genes and pathways within a biological context. Following a GWAS, association results can be assigned to pathways and tested in aggregate with computational tools and pathway databases. Reviews of published methods with recommendations for their application are provided within the framework for each approach. PMID:20074509
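The first of the reviewed approaches, meta-analysis, can be illustrated with a fixed-effect inverse-variance pooling of per-study effect estimates for a single SNP; the effect sizes and standard errors below are invented.

```python
# Fixed-effect inverse-variance meta-analysis for one SNP across several GWAS.
import math

studies = [
    {"beta": 0.12, "se": 0.05},   # per-study log odds ratio and standard error (invented)
    {"beta": 0.08, "se": 0.04},
    {"beta": 0.15, "se": 0.07},
]

weights = [1.0 / s["se"] ** 2 for s in studies]
beta_pooled = sum(w * s["beta"] for w, s in zip(weights, studies)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
z = beta_pooled / se_pooled
# Two-sided p-value from the normal approximation.
p_value = math.erfc(abs(z) / math.sqrt(2))

print(f"pooled beta={beta_pooled:.3f}, se={se_pooled:.3f}, z={z:.2f}, p={p_value:.2e}")
```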
Chemical Safety Research Advances in Support of Lautenberg Act
EPA researchers are developing new ways to identify which chemicals to prioritize for further testing, to provide better access to information about chemicals, and to understand what potential risks chemicals may pose to humans and the environment.
DOT National Transportation Integrated Search
2014-04-01
Through the analysis of national crash databases from the National Highway Traffic Safety Administration, pre-crash scenarios are identified, prioritized, and described for the development of objective tests for pedestrian crash avoidance/mitigation ...
TOXICITY SCREENING WITH ZEBRAFISH ASSAY
The proposed toxicity screening will help EPA to prioritize chemicals for further testing, and it may also alert chemical manufacturers that some of their commercial products may be toxic. The proposed toxicity pathway studies will improve the research community’s abi...
Feature Analysis of ToxCast Compounds
ToxCast was initiated by the US Environmental Protection Agency (EPA) to prioritize environmental chemicals for toxicity testing. Phase I generated data for 309 unique chemicals, mostly pesticide actives, that span diverse chemical feature/property space, as determined by quantu...
Ni, Jingchao; Koyuturk, Mehmet; Tong, Hanghang; Haines, Jonathan; Xu, Rong; Zhang, Xiang
2016-11-10
Accurately prioritizing candidate disease genes is an important and challenging problem. Various network-based methods have been developed to predict potential disease genes by utilizing the disease similarity network and molecular networks such as protein interaction or gene co-expression networks. Although successful, a common limitation of the existing methods is that they assume all diseases share the same molecular network and a single generic molecular network is used to predict candidate genes for all diseases. However, different diseases tend to manifest in different tissues, and the molecular networks in different tissues are usually different. An ideal method should be able to incorporate tissue-specific molecular networks for different diseases. In this paper, we develop a robust and flexible method to integrate tissue-specific molecular networks for disease gene prioritization. Our method allows each disease to have its own tissue-specific network(s). We formulate the problem of candidate gene prioritization as an optimization problem based on network propagation. When there are multiple tissue-specific networks available for a disease, our method can automatically infer the relative importance of each tissue-specific network. Thus it is robust to the noisy and incomplete network data. To solve the optimization problem, we develop fast algorithms which have linear time complexities in the number of nodes in the molecular networks. We also provide rigorous theoretical foundations for our algorithms in terms of their optimality and convergence properties. Extensive experimental results show that our method can significantly improve the accuracy of candidate gene prioritization compared with the state-of-the-art methods. In our experiments, we compare our methods with 7 popular network-based disease gene prioritization algorithms on diseases from Online Mendelian Inheritance in Man (OMIM) database. The experimental results demonstrate that our methods recover true associations more accurately than other methods in terms of AUC values, and the performance differences are significant (with paired t-test p-values less than 0.05). This validates the importance to integrate tissue-specific molecular networks for studying disease gene prioritization and show the superiority of our network models and ranking algorithms toward this purpose. The source code and datasets are available at http://nijingchao.github.io/CRstar/ .
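A compact sketch of propagation over a weighted combination of tissue-specific networks follows: a random walk with restart from a known disease gene over two toy networks mixed with fixed relative weights. The described method learns those weights as part of its optimization; here they are simply assumed, and all data are invented.

```python
# Compact network-propagation sketch: random walk with restart on a weighted
# combination of two tissue-specific networks (all data invented).
import numpy as np

def column_normalize(A):
    col = A.sum(axis=0, keepdims=True)
    col[col == 0] = 1.0
    return A / col

# Two tissue-specific adjacency matrices over the same five genes.
net_brain = np.array([[0,1,0,0,1],[1,0,1,0,0],[0,1,0,1,0],[0,0,1,0,1],[1,0,0,1,0]], float)
net_liver = np.array([[0,0,1,1,0],[0,0,1,0,0],[1,1,0,0,1],[1,0,0,0,1],[0,0,1,1,0]], float)

alpha = {"brain": 0.7, "liver": 0.3}            # relative importance (assumed, not learned)
W = column_normalize(alpha["brain"] * net_brain + alpha["liver"] * net_liver)

restart = 0.3
seed = np.array([1.0, 0, 0, 0, 0])              # known disease gene
p = seed.copy()
for _ in range(100):                            # iterate to (approximate) convergence
    p = (1 - restart) * W @ p + restart * seed

print(np.round(p, 3))                           # higher score = stronger candidate
```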
Heidebrecht, Christine L; Podewils, Laura J; Pym, Alexander; Mthiyane, Thuli; Cohen, Ted
2016-01-01
KwaZulu-Natal (KZN) has the highest burden of notified multidrug-resistant tuberculosis (MDR TB) and extensively drug-resistant (XDR) TB cases in South Africa. A better understanding of spatial heterogeneity in the risk of drug-resistance may help to prioritize local responses. Between July 2012 and June 2013, we conducted a two-way Lot Quality Assurance Sampling (LQAS) study to classify the burden of rifampicin (RIF)-resistant TB among incident TB cases notified within the catchment areas of seven laboratories in two northern and one southern district of KZN. Decision rules for classification of areas as having either a high- or low-risk of RIF resistant TB (based on proportion of RIF resistance among all TB cases) were based on consultation with local policy makers. We classified five areas as high-risk and two as low-risk. High-risk areas were identified in both Southern and Northern districts, with the greatest proportion of RIF resistance observed in the northernmost area, the Manguzi community situated on the Mozambique border. Our study revealed heterogeneity in the risk of RIF resistant disease among incident TB cases in KZN. This study demonstrates the potential for LQAS to detect geographic heterogeneity in areas where access to drug susceptibility testing is limited.
Heidebrecht, Christine L.; Podewils, Laura J.; Pym, Alexander; Mthiyane, Thuli; Cohen, Ted
2016-01-01
Background: KwaZulu-Natal (KZN) has the highest burden of notified multidrug-resistant tuberculosis (MDR TB) and extensively drug-resistant (XDR) TB cases in South Africa. A better understanding of spatial heterogeneity in the risk of drug-resistance may help to prioritize local responses. Methods: Between July 2012 and June 2013, we conducted a two-way Lot Quality Assurance Sampling (LQAS) study to classify the burden of rifampicin (RIF)-resistant TB among incident TB cases notified within the catchment areas of seven laboratories in two northern and one southern district of KZN. Decision rules for classification of areas as having either a high- or low-risk of RIF resistant TB (based on proportion of RIF resistance among all TB cases) were based on consultation with local policy makers. Results: We classified five areas as high-risk and two as low-risk. High-risk areas were identified in both Southern and Northern districts, with the greatest proportion of RIF resistance observed in the northernmost area, the Manguzi community situated on the Mozambique border. Conclusion: Our study revealed heterogeneity in the risk of RIF resistant disease among incident TB cases in KZN. This study demonstrates the potential for LQAS to detect geographic heterogeneity in areas where access to drug susceptibility testing is limited. PMID:27050561
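The LQAS classification logic can be sketched as a binomial decision rule: sample a fixed number of incident cases and classify the area as high-risk if the number of resistant cases exceeds a threshold. The sample size, threshold, and resistance levels below are illustrative choices, not the values used in the study.

```python
# Toy LQAS decision rule: classify an area as high-risk if more than d of the
# n sampled incident TB cases are rifampicin-resistant (all values illustrative).
from math import comb

def binom_cdf(k, n, p):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

n, d = 50, 3                   # sample size and decision threshold
p_low, p_high = 0.02, 0.15     # "acceptable" vs "high" underlying resistance levels

# Classification errors implied by this (n, d) design.
alpha = 1 - binom_cdf(d, n, p_low)    # P(call high-risk | truly low-risk area)
beta = binom_cdf(d, n, p_high)        # P(call low-risk | truly high-risk area)
print(f"alpha={alpha:.3f}, beta={beta:.3f}")

def classify(resistant_cases_observed):
    return "high-risk" if resistant_cases_observed > d else "low-risk"

print(classify(5))   # -> high-risk
```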
2012-01-01
Background: Genome-wide association studies (GWAS) do not provide a full account of the heritability of genetic diseases since gene-gene interactions, also known as epistasis, are not considered in single locus GWAS. To address this problem, a considerable number of methods have been developed for identifying disease-associated gene-gene interactions. However, these methods typically fail to identify interacting markers explaining more of the disease heritability over single locus GWAS, since many of the interactions significant for disease are obscured by uninformative marker interactions e.g., linkage disequilibrium (LD). Results: In this study, we present a novel SNP interaction prioritization algorithm, named iLOCi (Interacting Loci). This algorithm accounts for marker dependencies separately in case and control groups. Disease-associated interactions are then prioritized according to a novel ranking score calculated from the difference in marker dependencies for every possible pair between case and control groups. The analysis of a typical GWAS dataset can be completed in less than a day on a standard workstation with parallel processing capability. The proposed framework was validated using simulated data and applied to real GWAS datasets using the Wellcome Trust Case Control Consortium (WTCCC) data. The results from simulated data showed the ability of iLOCi to identify various types of gene-gene interactions, especially for high-order interaction. From the WTCCC data, we found that among the top ranked interacting SNP pairs, several mapped to genes previously known to be associated with disease, and interestingly, other previously unreported genes with biologically related roles. Conclusion: iLOCi is a powerful tool for uncovering true disease interacting markers and thus can provide a more complete understanding of the genetic basis underlying complex disease. The program is available for download at http://www4a.biotec.or.th/GI/tools/iloci. PMID:23281813
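The idea of scoring pairs by the case-control difference in marker dependency can be illustrated, in simplified form, by ranking SNP pairs on the difference in squared correlation of allele counts between cases and controls; this is a simplification of the iLOCi score, and the genotype data below are simulated.

```python
# Simplified illustration: rank SNP pairs by |r2(cases) - r2(controls)|.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(42)
n_cases, n_controls, n_snps = 300, 300, 5

controls = rng.binomial(2, 0.3, size=(n_controls, n_snps)).astype(float)
cases = rng.binomial(2, 0.3, size=(n_cases, n_snps)).astype(float)
# Inject an interaction-like dependency between SNP 0 and SNP 1 in cases only.
cases[:, 1] = np.clip(cases[:, 0] + rng.binomial(1, 0.2, n_cases), 0, 2)

def r2(x, y):
    return float(np.corrcoef(x, y)[0, 1] ** 2)

scores = {
    (i, j): abs(r2(cases[:, i], cases[:, j]) - r2(controls[:, i], controls[:, j]))
    for i, j in combinations(range(n_snps), 2)
}
for pair, score in sorted(scores.items(), key=lambda kv: -kv[1])[:3]:
    print(pair, round(score, 3))
```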
Hwang, Sohyun; Rhee, Seung Y; Marcotte, Edward M; Lee, Insuk
2012-01-01
AraNet is a functional gene network for the reference plant Arabidopsis and has been constructed in order to identify new genes associated with plant traits. It is highly predictive for diverse biological pathways and can be used to prioritize genes for functional screens. Moreover, AraNet provides a web-based tool with which plant biologists can efficiently discover novel functions of Arabidopsis genes (http://www.functionalnet.org/aranet/). This protocol explains how to conduct network-based prediction of gene functions using AraNet and how to interpret the prediction results. Functional discovery in plant biology is facilitated by combining candidate prioritization by AraNet with focused experimental tests. PMID:21886106
Bell, Shannon M; Edwards, Stephen W
2015-11-01
There are > 80,000 chemicals in commerce with few data available describing their impacts on human health. Biomonitoring surveys, such as the NHANES (National Health and Nutrition Examination Survey), offer one route to identifying possible relationships between environmental chemicals and health impacts, but sparse data and the complexity of traditional models make it difficult to leverage effectively. We describe a workflow to efficiently and comprehensively evaluate and prioritize chemical-health impact relationships from the NHANES biomonitoring survey studies. Using a frequent itemset mining (FIM) approach, we identified relationships between chemicals and health biomarkers and diseases. The FIM method identified 7,848 relationships between 219 chemicals and 93 health outcomes/biomarkers. Two case studies used to evaluate the FIM rankings demonstrate that the FIM approach is able to identify published relationships. Because the relationships are derived from the vast majority of the chemicals monitored by NHANES, the resulting list of associations is appropriate for evaluating results from targeted data mining or identifying novel candidate relationships for more detailed investigation. Because of the computational efficiency of the FIM method, all chemicals and health effects can be considered in a single analysis. The resulting list provides a comprehensive summary of the chemical/health co-occurrences from NHANES that are higher than expected by chance. This information enables ranking and prioritization on chemicals or health effects of interest for evaluation of published results and design of future studies. Bell SM, Edwards SW. 2015. Identification and prioritization of relationships between environmental stressors and adverse human health impacts. Environ Health Perspect 123:1193-1199; http://dx.doi.org/10.1289/ehp.1409138.
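The core of a frequent itemset mining (FIM) pass can be sketched with simple support counting over invented per-respondent records of detected chemicals and reported health outcomes; the real NHANES workflow is considerably more elaborate.

```python
# Minimal frequent-pair mining over invented biomonitoring records: report the
# chemical/outcome pairs that co-occur above a support threshold.
from collections import Counter
from itertools import combinations

records = [
    {"cadmium", "lead", "hypertension"},
    {"cadmium", "hypertension"},
    {"lead", "phthalate_MEP"},
    {"cadmium", "lead", "hypertension", "diabetes"},
    {"phthalate_MEP", "diabetes"},
]
MIN_SUPPORT = 0.4   # a pair must appear in at least 40% of records

pair_counts = Counter()
for record in records:
    for pair in combinations(sorted(record), 2):
        pair_counts[pair] += 1

n = len(records)
frequent = {pair: c / n for pair, c in pair_counts.items() if c / n >= MIN_SUPPORT}
for pair, support in sorted(frequent.items(), key=lambda kv: -kv[1]):
    print(pair, round(support, 2))
```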
Sabourin, Jeremy; Nobel, Andrew B.; Valdar, William
2014-01-01
Genomewide association studies sometimes identify loci at which both the number and identities of the underlying causal variants are ambiguous. In such cases, statistical methods that model effects of multiple SNPs simultaneously can help disentangle the observed patterns of association and provide information about how those SNPs could be prioritized for follow-up studies. Current multi-SNP methods, however, tend to assume that SNP effects are well captured by additive genetics; yet when genetic dominance is present, this assumption translates to reduced power and faulty prioritizations. We describe a statistical procedure for prioritizing SNPs at GWAS loci that efficiently models both additive and dominance effects. Our method, LLARRMA-dawg, combines a group LASSO procedure for sparse modeling of multiple SNP effects with a resampling procedure based on fractional observation weights; it estimates for each SNP the robustness of association with the phenotype both to sampling variation and to competing explanations from other SNPs. In producing a SNP prioritization that best identifies underlying true signals, we show that: our method easily outperforms a single marker analysis; when additive-only signals are present, our joint model for additive and dominance is equivalent to or only slightly less powerful than modeling additive-only effects; and, when dominance signals are present, even in combination with substantial additive effects, our joint model is unequivocally more powerful than a model assuming additivity. We also describe how performance can be improved through calibrated randomized penalization, and discuss how dominance in ungenotyped SNPs can be incorporated through either heterozygote dosage or multiple imputation. PMID:25417853
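The resampling component, refitting a sparse model under fractional observation weights and recording how often each SNP survives, can be sketched as follows. This simplified version uses a plain LASSO on an additive coding only; the grouped additive-plus-dominance coding and the calibrated randomized penalization of LLARRMA-dawg are not reproduced, and the penalty and weight distribution here are arbitrary.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n, p = 300, 50
X = rng.integers(0, 3, size=(n, p)).astype(float)   # additive genotype coding
y = 0.8 * X[:, 3] - 0.6 * X[:, 7] + rng.normal(size=n)

selected = np.zeros(p)
n_resamples = 100
for _ in range(n_resamples):
    w = rng.exponential(size=n)                      # fractional observation weights
    fit = Lasso(alpha=0.1).fit(X, y, sample_weight=w)
    selected += (fit.coef_ != 0)

# Resample inclusion frequency = robustness of each SNP's association.
rank = np.argsort(-selected)
print(rank[:5], selected[rank[:5]] / n_resamples)
```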
Hongoh, Valerie; Gosselin, Pierre; Michel, Pascal; Ravel, André; Waaub, Jean-Philippe; Campagna, Céline; Samoura, Karim
2017-01-01
Prioritizing resources for optimal responses to an ever growing list of existing and emerging infectious diseases represents an important challenge to public health. In the context of climate change, there is increasing anticipated variability in the occurrence of infectious diseases, notably climate-sensitive vector-borne diseases. An essential step in prioritizing efforts is to identify what considerations and concerns to take into account to guide decisions and thus set disease priorities. This study was designed to perform a comprehensive review of criteria for vector-borne disease prioritization, assess their applicability in a context of climate change with a diverse cross-section of stakeholders in order to produce a baseline list of considerations to use in this decision-making context. Differences in stakeholder choices were examined with regards to prioritization of these criteria for research, surveillance and disease prevention and control objectives. A preliminary list of criteria was identified following a review of the literature. Discussions with stakeholders were held to consolidate and validate this list of criteria and examine their effects on disease prioritization. After this validation phase, a total of 21 criteria were retained. A pilot vector-borne disease prioritization exercise was conducted using PROMETHEE to examine the effects of the retained criteria on prioritization in different intervention domains. Overall, concerns expressed by stakeholders for prioritization were well aligned with categories of criteria identified in previous prioritization studies. Weighting by category was consistent between stakeholders overall, though some significant differences were found between public health and non-public health stakeholders. From this exercise, a general model for climate-sensitive vector-borne disease prioritization has been developed that can be used as a starting point for further public health prioritization exercises relating to research, surveillance, and prevention and control interventions in a context of climate change. Multi-stakeholder engagement in prioritization can help broaden the range of criteria taken into account, offer opportunities for early identification of potential challenges and may facilitate acceptability of any resulting decisions.
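For readers unfamiliar with PROMETHEE, the prioritization step reduces to computing net outranking flows from weighted pairwise comparisons. The sketch below implements PROMETHEE II with the simplest (usual) preference function on made-up criterion scores and weights; the study's actual criteria, preference functions, and stakeholder-derived weights are not reproduced.

```python
import numpy as np

# Rows = candidate diseases, columns = criteria (higher = more concerning).
scores = np.array([
    [3.0, 2.0, 4.0],   # disease A
    [1.0, 4.0, 2.0],   # disease B
    [4.0, 1.0, 3.0],   # disease C
])
weights = np.array([0.5, 0.3, 0.2])      # hypothetical criterion weights

n = scores.shape[0]
phi_plus = np.zeros(n)
phi_minus = np.zeros(n)
for a in range(n):
    for b in range(n):
        if a == b:
            continue
        # Usual preference function: 1 if a beats b on a criterion, else 0.
        pref_ab = (weights * (scores[a] > scores[b])).sum()
        phi_plus[a] += pref_ab / (n - 1)
        phi_minus[b] += pref_ab / (n - 1)

net_flow = phi_plus - phi_minus          # PROMETHEE II net outranking flow
print(np.argsort(-net_flow))             # diseases in priority order
```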
MODELING CHEMICAL FATE AND METABOLISM FOR COMPUTATIONAL TOXICOLOGY
The goal of ORD's Computational Toxicology initiative is to develop the science for EPA to prioritize toxicity-testing requirements for chemicals subject to regulation. Many toxic effects, however, result from metabolism of parent chemicals to form metabolites that are much more...
Cheminformatic Analysis of the US EPA ToxCast Chemical Library
The ToxCast project is employing high throughput screening (HTS) technologies, along with chemical descriptors and computational models, to develop approaches for screening and prioritizing environmental chemicals for further toxicity testing. ToxCast Phase I generated HTS data f...
77 FR 69916 - Aviation Rulemaking Advisory Committee; Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-21
... Expectations 3. Recommendation Reports a. Rulemaking Prioritization Working Group (RPWG) Recommendation Report (ARAC) b. Avionics Systems Harmonization Working Group--Low Speed Alerting, Phase 2 Recommendation Report (TAE) 4. Status Reports From Active Working Groups a. Airman Testing Standards and Training...
Immunotoxicant Screening and Prioritization in the 21st Century
Current immunotoxicity testing guidance for drugs, high production volume chemicals and pesticides specifies the use of animal models to assess potential biomarkers of immune system effects (e.g., lymphoid organ and bone marrow indices, histopathology) or actual measures of immun...
Adverse Outcome Pathway (AOP) Network Development for Fatty Liver
Adverse outcome pathways (AOPs) are descriptive biological sequences that start from a molecular initiating event (MIE) and end with an adverse health outcome. AOPs provide biological context for high throughput chemical testing and further prioritize environmental health risk re...
In Vitro Pulmonary Toxicity of Metal Oxide Nanoparticles
Nanomaterials (NMs) encompass a diversity of materials with unique physicochemical characteristics which raise concerns about their potential risk to human health. Rapid predictive testing methods are needed to characterize NMs health effects as well as to screen and prioritize N...
AN INTEGRATED COASTAL-WATERSHED MONITORING FRAMEWORK FOR ASSESSMENT
An approach for watershed classification in support of assessments, diagnosis of biological impairment, and prioritization of watershed restorations has been tested in coastal watersheds surrounding the western arm of Lake Superior and is currently being assessed for a series of ...
QSAR PRIORITIZATION OF CHEMICAL INVENTORIES FOR ENDOCRINE DISRUPTOR TESTING
Binding affinity between chemicals and the estrogen receptor (ER) serves as an indicator of the potential to cause endocrine disruption through this receptor-mediated endocrine pathway. Estimating ER binding affinity is, therefore, one strategic approach to reducing the costs of ...
New High Throughput Methods to Estimate Chemical Exposure
EPA has made many recent advances in high throughput bioactivity testing. However, concurrent advances in rapid, quantitative prediction of human and ecological exposures have been lacking, despite the clear importance of both measures for a risk-based approach to prioritizing an...
EPA Perspectives on Nanoinformatics for Prioritization and Toxicity Testing
The U.S. Environmental Protection Agency’s (EPA) Office of Research and Development is investigating the environmental health and safety implications of engineered nanomaterials. Research activities as outlined in ORD’s Nanomaterial Strategy (http://www.epa.gov/nanoscience/files/...
Brevé, Niels W P; Buijse, Anthonie D; Kroes, Martin J; Wanningen, Herman; Vriese, Frederik T
2014-10-15
Preservation and restoration of Europe's endangered migratory fish species and habitats are high on the international river basin policy agenda. Improvement through restoration of longitudinal connectivity is seen as an important measure, but although prioritization of in-stream barriers has been addressed at local and regional levels, the process still lacks adequate priority at the international level. This paper introduces a well-tested method designed to help decision makers achieve the rehabilitation of targeted ichthyofauna more successfully. This method assesses artificial barriers within waters designated under the Water Framework Directive (WFD), Europe's main legislative driver for ecological improvement of river basins. The method aggregates migratory fish communities (both diadromous and potamodromous) into functional biological units (ecological fish guilds) and defines their most pressing habitat requirements. Using GIS mapping and spatial analysis of the potential ranges (fish zonation), we pinpoint the most important barriers per guild. This method was developed and deployed over a 12-year period as a practical case study, fitting data derived from the 36 regional water management organisations in the Netherlands. We delivered national advice on the prioritization of a total of 2924 barriers located within WFD water bodies, facilitating migration for all 18 indigenous migratory fish species. Scaling up to larger geographical areas can be achieved using datasets from other countries to link water body typologies to distribution ranges of migratory fish species. Copyright © 2014 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Jerel G.; Kruzic, Michael; Castillo, Carlos
2013-07-01
Chalk River Laboratory (CRL), located in Ontario Canada, has a large number of remediation projects currently in the Nuclear Legacy Liabilities Program (NLLP), including hundreds of facility decommissioning projects and over one hundred environmental remediation projects, all to be executed over the next 70 years. Atomic Energy of Canada Limited (AECL) utilized WorleyParsons to prioritize the NLLP projects at the CRL through a risk-based prioritization and ranking process, using the WorleyParsons Sequencing Unit Prioritization and Estimating Risk Model (SUPERmodel). The prioritization project made use of the SUPERmodel which has been previously used for other large-scale site prioritization and sequencing of facilities at nuclear laboratories in the United States. The process included development and vetting of risk parameter matrices as well as confirmation/validation of project risks. Detailed sensitivity studies were also conducted to understand the impacts that risk parameter weighting and scoring had on prioritization. The repeatable prioritization process yielded an objective, risk-based and technically defendable process for prioritization that gained concurrence from all stakeholders, including Natural Resources Canada (NRCan) who is responsible for the oversight of the NLLP. (authors)
Tap Testing Hammer using Unmanned Aerial Systems (UASs)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mason, JaMein DeShon; Ayorinde, Emmanuel Temiloluwa; Mascarenas, David Dennis
This is the final poster for a Student Symposium at Los Alamos National Laboratory. This research describes the development, validation, and testing of a remote concrete tapping mechanism enabled by a UAS. The conclusion is the following: the results quantify, for the first time, concrete tapping data collected remotely with a UAS, enabling cost-effective, safer, and more sustainable upgrade prioritization of railroad bridge inventories.
Self-prioritization processes in action and perception.
Frings, Christian; Wentura, Dirk
2014-10-01
Recently, Sui, He, and Humphreys (2012) introduced a new paradigm to investigate prioritized processing of self-related information. In a balanced design, they arbitrarily assigned simple geometric shapes to the participant and 2 others. Subsequently, the task was to judge whether label-shape pairings matched. The authors found a remarkable self-prioritization effect, that is, for matching self-related trials verification was very fast and accurate in comparison to the other conditions. We tested the hypothesis that the self-prioritization effect extends from perception-self links to action-self links. In particular, we assigned simple movements (i.e., up, down, left, right) to the participant, 2 others (i.e., the mother; a stranger), and a neutral label, respectively. In each trial participants executed a movement (triggered by a cue), which was followed by a briefly presented label. Participants had to judge whether label-movement pairings matched. In accordance with Sui et al. (2012), we found a remarkable self-prioritization effect, that is, for matching self-related trials verification was very fast and accurate in comparison to the other conditions.
Del Giudice, G; Padulano, R; Siciliano, D
2016-01-01
The lack of geometrical and hydraulic information about sewer networks often excludes the adoption of in-depth modeling tools to obtain prioritization strategies for funds management. The present paper describes a novel statistical procedure for defining the prioritization scheme for preventive maintenance strategies based on a small sample of failure data collected by the Sewer Office of the Municipality of Naples (IT). Novelty issues involve, among others, considering sewer parameters as continuous statistical variables and accounting for their interdependences. After a statistical analysis of maintenance interventions, the most important available factors affecting the process are selected and their mutual correlations identified. Then, after a Box-Cox transformation of the original variables, a methodology is provided for the evaluation of a vulnerability map of the sewer network by adopting a joint multivariate normal distribution with different parameter sets. The goodness-of-fit is eventually tested for each distribution by means of a multivariate plotting position. The developed methodology is expected to assist municipal engineers in identifying critical sewers, prioritizing sewer inspections in order to fulfill rehabilitation requirements.
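One way to read the procedure is: transform each sewer attribute toward normality with Box-Cox, fit a joint multivariate normal to the attributes of past failures, and score other sewers by their density under that distribution. The sketch below does exactly that with hypothetical attributes; the paper's variable selection, correlation analysis, and goodness-of-fit testing are not reproduced.

```python
import numpy as np
from scipy import stats
from scipy.special import boxcox as bc_transform

rng = np.random.default_rng(2)

# Hypothetical attributes of sewers that failed: age (yr), diameter (mm), depth (m).
failures = np.column_stack([
    rng.gamma(9.0, 5.0, 200),
    rng.gamma(4.0, 80.0, 200),
    rng.gamma(3.0, 0.8, 200),
])

# Box-Cox transform each attribute toward normality, keeping the fitted lambdas.
cols, lambdas = [], []
for col in failures.T:
    t, lam = stats.boxcox(col)
    cols.append(t)
    lambdas.append(lam)
transformed = np.column_stack(cols)

# Joint multivariate normal fitted to the transformed failure sample.
mvn = stats.multivariate_normal(transformed.mean(axis=0),
                                np.cov(transformed, rowvar=False))

# Vulnerability score for an unseen sewer: density under the failure model.
candidate = np.array([60.0, 300.0, 2.5])
cand_t = [bc_transform(x, lam) for x, lam in zip(candidate, lambdas)]
print(mvn.pdf(cand_t))
```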
Davis, Robert C; Jensen, Carl J; Burgette, Lane; Burnett, Kathryn
2014-03-01
Cold case squads have garnered much attention; however, they have yet to undergo significant empirical scrutiny. In the present study, the authors interviewed investigators and reviewed 189 solved and unsolved cold cases in Washington, D.C., to determine whether there are factors that can predict cold case solvability. In the interviews, new information from witnesses or information from new witnesses was cited as the most prevalent reason for case clearance. The case reviews determined that there were factors in each of the following domains that predicted whether cases would be solved during cold case investigations: Crime Context, Initial Investigation Results, Basis for Opening Cold Case, and Cold Case Investigator Actions. The results suggest that it is possible to prioritize cold case work based on the likelihood of investigations leading to clearances. © 2014 American Academy of Forensic Sciences.
Saito, Masaya M.; Kinoshita, Ryo
2018-01-01
Elevating the herd immunity level against rubella is essential to prevent congenital rubella syndrome (CRS). Insufficient vaccination coverage left susceptible pockets among adults in Japan, and the outbreak of rubella from 2012 to 2013 resulted in 45 observed CRS cases. Given the limited stock of rubella-containing vaccine (RCV) available, the Japanese government recommended that healthcare providers prioritize vaccination for those confirmed to have a low level of immunity, or for those likely to transmit to pregnant women. Although a test-and-vaccinate policy could potentially help reduce the use of the limited stockpile of vaccines, by selectively elevating herd immunity, the cost of serological testing is generally high and comparable to the vaccine itself. Here, we aimed to examine whether random vaccination would be more cost-beneficial than the test-and-vaccinate strategy. A mathematical model was employed to evaluate the vaccination policy implemented in 2012–2013, quantifying the benefit-to-cost ratio to achieve herd immunity. The modelling exercise demonstrated that, while the test-and-vaccinate strategy can efficiently achieve herd immunity when stockpiles of RCV are limited, random vaccination would be a more cost-beneficial strategy. As long as herd immunity remains the goal of vaccination, our findings apply to future supplementary immunization strategies. PMID:29565821
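The economic intuition can be shown with a deliberately crude, static calculation; the paper uses a dynamic transmission model, so this is only a back-of-envelope sketch with hypothetical prices. When a serological test costs nearly as much as a dose, screening saves little money, and its main advantage is conserving scarce vaccine.

```python
def campaign_costs(n_target_immune, susceptible_frac, vaccine_cost, test_cost):
    # Random vaccination: only a fraction of doses land on susceptibles.
    random_cost = (n_target_immune / susceptible_frac) * vaccine_cost
    # Test-and-vaccinate: screen the same pool, vaccinate only susceptibles.
    screened = n_target_immune / susceptible_frac
    targeted_cost = screened * test_cost + n_target_immune * vaccine_cost
    return random_cost, targeted_cost

# Hypothetical figures: serological test priced close to the vaccine itself.
rand, targ = campaign_costs(n_target_immune=1_000_000, susceptible_frac=0.2,
                            vaccine_cost=50, test_cost=40)
print(f"random: {rand:,.0f}   test-and-vaccinate: {targ:,.0f}")
```

With the hypothetical figures above the two strategies cost about the same, which is the kind of comparison the benefit-to-cost analysis formalizes.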
As part of the northern spotted owl recovery planning effort, we evaluated a series of alternative critical habitat scenarios using a species-distribution model (MaxEnt), a conservation-planning model (Zonation), and an individual-based population model (HexSim). With this suite ...
Daniel R. Williams; Pamela J. Jakes; Sam Burns; Antony S. Cheng; Kristen C. Nelson; Victoria Sturtevant; Rachel F. Brummel; Emily Staychock; Stephanie G. Souter
2012-01-01
Community wildfire protection planning has become an important tool for engaging wildland-urban interface residents and other stakeholders in efforts to address their mutual concerns about wildland fire management, prioritize hazardous fuel reduction projects, and improve forest health. Drawing from 13 case studies from across the United States, this article describes...
ERIC Educational Resources Information Center
Morton, Keith; Bergbauer, Samantha
2015-01-01
This paper describes an eight-year service-learning experiment that created four distinct spaces in which campus and community members meet, reflect, and act together. This work explores the tensions between traditional and critical service-learning, and points to the importance of building relationships with members of local communities and…
ERIC Educational Resources Information Center
Blanc, Suzanne; Brown, Joanna; Nevarez-La Torre, Aida; Brown, Chris
This report describes Chicago's Logan Square Neighborhood Association (LSNA), which has long worked to mobilize neighborhood residents to maintain and improve the quality of community life and bring additional resources and services into the neighborhood. LSNA prioritizes the needs of low- and moderate-income residents. LSNA works to make schools…
ERIC Educational Resources Information Center
Cherrstrom, Catherine A.; Raisor, Cindy; Fowler, Debra
2015-01-01
Engineering educators and employers value and prioritize communication skills, but developing and assessing such skills in engineering programs is challenging. Reflective ePortfolios provide opportunities to enhance communication skills. The purpose of this three-year qualitative case study was to investigate the use of reflective ePortfolios in…
ERIC Educational Resources Information Center
O'Malley, Michael P.; Long, Tanya A.; King, Jeffry
2015-01-01
Multiple and complex issues simultaneously present themselves for the principal's attention. Learning how to identify, prioritize, synthesize, and act in relation to these issues poses a particular challenge to early career principals. This case study engages aspiring and current school leaders in critical reflection upon leadership opportunities…
ERIC Educational Resources Information Center
Tjomsland, Hege Eikland
2010-01-01
This study examines an elementary school which during enrollment in the European Network of Health Promoting Schools, 1993-2003, and the Norwegian Physical Activity and Healthy Meals Project, 2004-2006, selected physical activity (PA) as a prioritized area. Survey data, school documents, and focus group data were collected and analyzed through a…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-29
... DEPARTMENT OF COMMERCE Patent and Trademark Office 37 CFR Part 1 [Docket No.: PTO-P-2010-0092] RIN... Timing Control Procedures AGENCY: United States Patent and Trademark Office, Commerce. ACTION: Final rule... Trademark Office (Office) published a final rule that revises the rules of practice in patent cases to...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-02
... Environmental Assessment (NCEA) within EPA's Office of Research and Development (ORD). It does not draw... workshop process that the draft document will be used in for identifying and prioritizing research gaps...'' will be held at the EPA facility in Research Triangle Park, North Carolina. The RTI Workshop will be...
2013-02-01
or funds authorized under Section 219(a) for projects costing no more than $4M. Research, Development, Test, and Evaluation (RDT&E) Appropriations... The RDT&E appropriation consists of the mission program budgets for all research, development, test and evaluation work performed by contractors... carry out an unspecified minor military construction project costing not more than $4,000,000. 9 f. Research, Development, Test, and Evaluation
Stockpiling Ventilators for Influenza Pandemics.
Huang, Hsin-Chan; Araz, Ozgur M; Morton, David P; Johnson, Gregory P; Damien, Paul; Clements, Bruce; Meyers, Lauren Ancel
2017-06-01
In preparing for influenza pandemics, public health agencies stockpile critical medical resources. Determining appropriate quantities and locations for such resources can be challenging, given the considerable uncertainty in the timing and severity of future pandemics. We introduce a method for optimizing stockpiles of mechanical ventilators, which are critical for treating hospitalized influenza patients in respiratory failure. As a case study, we consider the US state of Texas during mild, moderate, and severe pandemics. Optimal allocations prioritize local over central storage, even though the latter can be deployed adaptively, on the basis of real-time needs. This prioritization stems from high geographic correlations and the slightly lower treatment success assumed for centrally stockpiled ventilators. We developed our model and analysis in collaboration with academic researchers and a state public health agency and incorporated it into a Web-based decision-support tool for pandemic preparedness and response.
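The trade-off at the heart of the allocation, that central stock can be routed wherever demand materializes but is assumed slightly less effective while correlated regional demands blunt that flexibility, can be explored with a small Monte Carlo sketch. All parameters below are hypothetical and the study's optimization model is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)
n_regions, n_sims = 5, 10_000
stock_total = 500

# Correlated regional peak demands (pandemics tend to hit regions together).
mean = np.full(n_regions, 100.0)
cov = 400 * (0.8 * np.ones((n_regions, n_regions)) + 0.2 * np.eye(n_regions))
demand = np.clip(rng.multivariate_normal(mean, cov, size=n_sims), 0, None)

local_share = 0.7        # fraction pre-positioned locally
central_success = 0.95   # centrally shipped units assumed slightly less effective

local = stock_total * local_share / n_regions
central = stock_total * (1 - local_share)

treated_locally = np.minimum(demand, local).sum(axis=1)
unmet = np.maximum(demand - local, 0)
# Central stock is shipped to wherever demand remains unmet.
shipped = np.minimum(unmet.sum(axis=1), central)
treated = treated_locally + central_success * shipped
print("mean patients treated:", treated.mean())
```

Sweeping local_share (and the demand correlation) shows why highly correlated demands push the optimum toward local storage.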
2017-02-01
Reports an error in "An integrative formal model of motivation and decision making: The MGPM*" by Timothy Ballard, Gillian Yeo, Shayne Loft, Jeffrey B. Vancouver and Andrew Neal (Journal of Applied Psychology, 2016[Sep], Vol 101[9], 1240-1265). Equation A3 contained an error. The correct equation is provided in the erratum. (The following abstract of the original article appeared in record 2016-28692-001.) We develop and test an integrative formal model of motivation and decision making. The model, referred to as the extended multiple-goal pursuit model (MGPM*), is an integration of the multiple-goal pursuit model (Vancouver, Weinhardt, & Schmidt, 2010) and decision field theory (Busemeyer & Townsend, 1993). Simulations of the model generated predictions regarding the effects of goal type (approach vs. avoidance), risk, and time sensitivity on prioritization. We tested these predictions in an experiment in which participants pursued different combinations of approach and avoidance goals under different levels of risk. The empirical results were consistent with the predictions of the MGPM*. Specifically, participants pursuing 1 approach and 1 avoidance goal shifted priority from the approach to the avoidance goal over time. Among participants pursuing 2 approach goals, those with low time sensitivity prioritized the goal with the larger discrepancy, whereas those with high time sensitivity prioritized the goal with the smaller discrepancy. Participants pursuing 2 avoidance goals generally prioritized the goal with the smaller discrepancy. Finally, all of these effects became weaker as the level of risk increased. We used quantitative model comparison to show that the MGPM* explained the data better than the original multiple-goal pursuit model, and that the major extensions from the original model were justified. The MGPM* represents a step forward in the development of a general theory of decision making during multiple-goal pursuit. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Paul, Christine L; Bryant, Jamie; Roos, Ian A; Henskens, Frans A; Paul, David J
2014-01-01
Background With increasing attention given to the quality of chronic disease care, a measurement approach that empowers consumers to participate in improving quality of care and enables health services to systematically introduce patient-centered initiatives is needed. A Web-based survey with complex adaptive questioning and interactive survey items would allow consumers to easily identify and prioritize detailed service initiatives. Objective The aim was to develop and test a Web-based survey capable of identifying and prioritizing patient-centered initiatives in chronic disease outpatient services. Testing included (1) test-retest reliability, (2) patient-perceived acceptability of the survey content and delivery mode, and (3) average completion time, completion rates, and Flesch-Kincaid reading score. Methods In Phase I, the Web-based Consumer Preferences Survey was developed based on a structured literature review and iterative feedback from expert groups of service providers and consumers. The touchscreen survey contained 23 general initiatives, 110 specific initiatives available through adaptive questioning, and a relative prioritization exercise. In Phase II, a pilot study was conducted within 4 outpatient clinics to evaluate the reliability properties, patient-perceived acceptability, and feasibility of the survey. Eligible participants were approached to complete the survey while waiting for an appointment or receiving intravenous therapy. The age and gender of nonconsenters was estimated to ascertain consent bias. Participants with a subsequent appointment within 14 days were asked to complete the survey for a second time. Results A total of 741 of 1042 individuals consented to participate (71.11% consent), 529 of 741 completed all survey content (78.9% completion), and 39 of 68 completed the test-retest component. Substantial or moderate reliability (Cohen’s kappa>0.4) was reported for 16 of 20 general initiatives with observed percentage agreement ranging from 82.1%-100.0%. The majority of participants indicated the Web-based survey was easy to complete (97.9%, 531/543) and comprehensive (93.1%, 505/543). Participants also reported the interactive relative prioritization exercise was easy to complete (97.0%, 189/195) and helped them to decide which initiatives were of most importance (84.6%, 165/195). Average completion time was 8.54 minutes (SD 3.91) and the Flesch-Kincaid reading level was 6.8. Overall, 84.6% (447/529) of participants indicated a willingness to complete a similar survey again. Conclusions The Web-based Consumer Preferences Survey is sufficiently reliable and highly acceptable to patients. Based on completion times and reading level, this tool could be integrated in routine clinical practice and allows consumers to easily participate in quality evaluation. Results provide a comprehensive list of patient-prioritized initiatives for patients with major chronic conditions and delivers practice-ready evidence to guide improvements in patient-centered care. PMID:25532217
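Test-retest agreement of this kind is typically summarized with Cohen's kappa alongside raw percentage agreement; a minimal check, with hypothetical responses, looks like this:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical responses to one survey initiative at test and retest
# (1 = endorsed, 0 = not endorsed), one pair per participant.
test =   [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
retest = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

kappa = cohen_kappa_score(test, retest)
agreement = sum(a == b for a, b in zip(test, retest)) / len(test)
print(f"kappa = {kappa:.2f}, observed agreement = {agreement:.0%}")
```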
Kalid, Naser; Zaidan, A A; Zaidan, B B; Salman, Omar H; Hashim, M; Albahri, O S; Albahri, A S
2018-03-02
This paper presents a new approach to prioritize "Large-scale Data" of patients with chronic heart diseases by using body sensors and communication technology during disasters and peak seasons. An evaluation matrix is used for emergency evaluation and large-scale data scoring of patients with chronic heart diseases in telemedicine environment. However, one major problem in the emergency evaluation of these patients is establishing a reasonable threshold for patients with the most and least critical conditions. This threshold can be used to detect the highest and lowest priority levels when all the scores of patients are identical during disasters and peak seasons. A practical study was performed on 500 patients with chronic heart diseases and different symptoms, and their emergency levels were evaluated based on four main measurements: electrocardiogram, oxygen saturation sensor, blood pressure monitoring, and non-sensory measurement tool, namely, text frame. Data alignment was conducted for the raw data and decision-making matrix by converting each extracted feature into an integer. This integer represents their state in the triage level based on medical guidelines to determine the features from different sources in a platform. The patients were then scored based on a decision matrix by using multi-criteria decision-making techniques, namely, integrated multi-layer for analytic hierarchy process (MLAHP) and technique for order performance by similarity to ideal solution (TOPSIS). For subjective validation, cardiologists were consulted to confirm the ranking results. For objective validation, mean ± standard deviation was computed to check the accuracy of the systematic ranking. This study provides scenarios and checklist benchmarking to evaluate the proposed and existing prioritization methods. Experimental results revealed the following. (1) The integration of TOPSIS and MLAHP effectively and systematically solved the patient settings on triage and prioritization problems. (2) In subjective validation, the first five patients assigned to the doctors were the most urgent cases that required the highest priority, whereas the last five patients were the least urgent cases and were given the lowest priority. In objective validation, scores significantly differed between the groups, indicating that the ranking results were identical. (3) For the first, second, and third scenarios, the proposed method exhibited an advantage over the benchmark method with percentages of 40%, 60%, and 100%, respectively. In conclusion, patients with the most and least urgent cases received the highest and lowest priority levels, respectively.
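The TOPSIS step, ranking patients by relative closeness to the most urgent (ideal) profile and distance from the least urgent one after the criteria have been weighted, can be sketched in a few lines. The feature coding and weights below are hypothetical, and the MLAHP weighting stage is not reproduced.

```python
import numpy as np

# Rows = patients, columns = triage-coded features
# (e.g., ECG, SpO2, blood pressure, text frame); higher = more urgent.
X = np.array([
    [3, 2, 3, 1],
    [1, 1, 2, 2],
    [3, 3, 3, 3],
    [2, 1, 1, 1],
], dtype=float)
w = np.array([0.4, 0.3, 0.2, 0.1])        # hypothetical criterion weights

V = w * X / np.linalg.norm(X, axis=0)     # weighted, vector-normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)
d_best = np.linalg.norm(V - ideal, axis=1)
d_worst = np.linalg.norm(V - anti, axis=1)
closeness = d_worst / (d_best + d_worst)  # 1 = most urgent, 0 = least urgent
print(np.argsort(-closeness))             # patients in priority order
```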
Are Earth System model software engineering practices fit for purpose? A case study.
NASA Astrophysics Data System (ADS)
Easterbrook, S. M.; Johns, T. C.
2009-04-01
We present some analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements in making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques driven strongly by scientific research goals have evolved for verification and validation of such models. In a formal software engineering context, these represent costly, but invaluable, software integration tests with considerable benefits. The software engineering practices seen also exhibit recognisable features of both agile and open source software development projects - self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.
DEVELOPING COMPUTATIONAL TOOLS FOR PREDICTING CHEMICAL FATE, METABOLISM, AND TOXICITY PATHWAYS
ORD's research program in Computational Toxicology (CompTox) will enable EPA Program Offices and other regulators to prioritize and reduce toxicity-testing requirements for potentially hazardous chemicals. The CompTox program defines the "toxicity process" as follows: 1) a stre...
DOT National Transportation Integrated Search
2014-01-24
The Carrier Safety Measurement System (CSMS) is the Federal Motor Carrier Safety Administration's (FMCSA's) workload prioritization tool. This tool is used to identify carriers with potential safety issues so that they are subject to interventions ...
The U.S. EPA is developing alternative screening methods to identify putative developmental neurotoxicants and prioritize chemicals for additional testing. One method developmentally exposes zebrafish embryos and assesses nervous system structure at 2 days post-fertilization (dpf...
In 2007, EPA launched ToxCast™ in order to develop a cost-effective approach for prioritizing the toxicity testing of large numbers of chemicals in a short period of time. Using data from state-of-the-art high throughput screening (HTS) bioassays developed in the pharmaceutical i...
Advance the characterization of exposure and dose metrics required to translate advances and findings in computational toxicology to information that can be directly used to support exposure and risk assessment for decision making and improved public health.
High-Throughput Toxicity Testing: New Strategies for ...
In recent years, the food industry has made progress in improving safety testing methods focused on microbial contaminants in order to promote food safety. However, food industry toxicologists must also assess the safety of food-relevant chemicals including pesticides, direct additives, and food contact substances. With the rapidly growing use of new food additives, as well as innovation in food contact substance development, an interest in exploring the use of high-throughput chemical safety testing approaches has emerged. Currently, the field of toxicology is undergoing a paradigm shift in how chemical hazards can be evaluated. Since there are tens of thousands of chemicals in use, many of which have little to no hazard information and there are limited resources (namely time and money) for testing these chemicals, it is necessary to prioritize which chemicals require further safety testing to better protect human health. Advances in biochemistry and computational toxicology have paved the way for animal-free (in vitro) high-throughput screening which can characterize chemical interactions with highly specific biological processes. Screening approaches are not novel; in fact, quantitative high-throughput screening (qHTS) methods that incorporate dose-response evaluation have been widely used in the pharmaceutical industry. For toxicological evaluation and prioritization, it is the throughput as well as the cost- and time-efficient nature of qHTS that makes it
Ho, Daniel W. H.; Yap, Maurice K. H.; Ng, Po Wah; Fung, Wai Yan; Yip, Shea Ping
2012-01-01
Background Myopia is the most common ocular disorder worldwide and imposes a tremendous burden on society. It is a complex disease. The MYP6 locus at 22q12 is of particular interest because many studies have detected linkage signals at this interval. The MYP6 locus is likely to contain susceptibility gene(s) for myopia, but none has yet been identified. Methodology/Principal Findings Two independent subject groups of southern Chinese in Hong Kong participated in the study: an initial study using a discovery sample set of 342 cases and 342 controls, and a follow-up study using a replication sample set of 316 cases and 313 controls. Cases with high myopia were defined by spherical equivalent ≤ -8 dioptres and emmetropic controls by spherical equivalent within ±1.00 dioptre for both eyes. Manual candidate gene selection from the MYP6 locus was supported by objective in silico prioritization. DNA samples of the discovery sample set were genotyped for 178 tagging single nucleotide polymorphisms (SNPs) from 26 genes. For replication, 25 SNPs (tagging or located at predicted transcription factor or microRNA binding sites) from 4 genes were subsequently examined using the replication sample set. Fisher P values were calculated for all SNPs and overall association results were summarized by meta-analysis. Based on the initial and replication studies, rs2009066, located in the crystallin beta A4 (CRYBA4) gene, was identified as the most significantly associated with high myopia (initial study: P = 0.02; replication study: P = 1.88e-4; meta-analysis: P = 1.54e-5) among all the SNPs tested. The association result survived correction for multiple comparisons. Under the allelic genetic model for the combined sample set, the odds ratio of the minor allele G was 1.41 (95% confidence interval, 1.21-1.64). Conclusions/Significance A novel susceptibility gene (CRYBA4) was discovered for high myopia. Our study also signified the potential importance of appropriate gene prioritization in candidate selection. PMID:22792142
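The allelic association reported here boils down to a 2×2 allele-count table per SNP, from which an odds ratio, confidence interval, and P value are computed. A minimal recomputation with hypothetical counts (not the study's actual genotype data):

```python
import numpy as np
from scipy import stats

# Hypothetical allele counts for one SNP: rows = cases/controls,
# columns = minor allele (G) / major allele.
table = np.array([[260, 424],    # cases
                  [204, 422]])   # controls

odds_ratio, p = stats.fisher_exact(table)
log_or = np.log(odds_ratio)
se = np.sqrt((1.0 / table).sum())                   # Woolf standard error
ci = np.exp(log_or + np.array([-1.96, 1.96]) * se)  # 95% confidence interval
print(f"OR = {odds_ratio:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), P = {p:.3g}")
```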
Andronis, Lazaros; Billingham, Lucinda J; Bryan, Stirling; James, Nicholas D; Barton, Pelham M
2016-04-01
Efforts to ensure that funded research represents "value for money" have led to increasing calls for the use of analytic methods in research prioritization. A number of analytic approaches have been proposed to assist research funding decisions, the most prominent of which are value of information (VOI) and prospective payback of research (PPoR). Despite the increasing interest in the topic, there are insufficient VOI and PPoR applications on the same case study to contrast their methods and compare their outcomes. We undertook VOI and PPoR analyses to determine the value of conducting 2 proposed research programs. The application served as a vehicle for identifying differences and similarities between the methods, provided insight into the assumptions and practical requirements of undertaking prospective analyses for research prioritization, and highlighted areas for future research. VOI and PPoR were applied to case studies representing proposals for clinical trials in advanced non-small-cell lung cancer and prostate cancer. Decision models were built to synthesize the evidence available prior to the funding decision. VOI (expected value of perfect and sample information) and PPoR (PATHS model) analyses were undertaken using the developed models. VOI and PPoR results agreed in direction, suggesting that the proposed trials would be cost-effective investments. However, results differed in magnitude, largely due to the way each method conceptualizes the possible outcomes of further research and the implementation of research results in practice. Compared with VOI, PPoR is less complex but requires more assumptions. Although the approaches are not free from limitations, they can provide useful input for research funding decisions. © The Author(s) 2015.
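The central VOI quantity, the expected value of perfect information (EVPI), is the gap between deciding after uncertainty is resolved and deciding now on expected net benefit. A minimal Monte Carlo sketch with hypothetical net-benefit parameters (not the lung or prostate cancer trial models used in the paper):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Uncertain incremental effectiveness and cost of the new treatment.
d_effect = rng.normal(0.05, 0.04, n)        # QALYs gained per patient
d_cost = rng.normal(2_000, 800, n)          # extra cost per patient (currency units)
wtp = 30_000                                # willingness to pay per QALY

# Net benefit of each strategy for every parameter draw.
nb = np.column_stack([np.zeros(n),          # current care (reference)
                      wtp * d_effect - d_cost])

# EVPI = value of choosing per draw minus value of choosing on the expectation.
evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()
print(f"per-patient EVPI = {evpi:,.0f}")
```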
Khandan, Mohammad; Nili, Majid; Koohpaei, Alireza; Mosaferchi, Saeedeh
2016-01-01
Nowadays, health and occupational decision makers need to analyze large amounts of data and weigh many conflicting evaluation criteria and sub-criteria. Ergonomic evaluation of the work environment for the purpose of controlling occupational disorders can therefore be treated as a multi-criteria decision-making (MCDM) problem. In this study, ergonomic risk factors that may influence health were evaluated in a manufacturing company in 2014, and the entropy method was then applied to prioritize the different risk factors. The study was done with a descriptive-analytical approach; 13 tasks were included, drawn from the 240 employees working in the seven halls of an ark opal manufacturing plant. Required information was gathered with a demographic questionnaire and the Assessment of Repetitive Tasks (ART) method for repetitive task assessment. Entropy weighting was then used to prioritize the risk factors according to ergonomic control needs. The total exposure score calculated with the ART method was 30.07 ± 12.43. Data analysis showed that 179 cases (74.6% of tasks) fell in the high-risk level and 13.8% in the medium-risk level. The ART-entropy results revealed that, among the weighted factors, the highest weight belonged to the grip factor and the lowest to neck posture, hand posture, and duration. Given limited financial resources, it seems that MCDM can be used successfully in many challenging situations, such as setting control procedures and priorities. Other MCDM methods for evaluating and prioritizing ergonomic problems are also recommended.
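Entropy weighting itself is a short calculation: risk factors whose scores vary more across tasks carry more information and receive larger weights. A minimal sketch with hypothetical ART factor scores (all values positive):

```python
import numpy as np

# Rows = tasks, columns = ART risk factors (e.g., grip, posture, duration).
scores = np.array([
    [4.0, 2.0, 1.0],
    [6.0, 2.5, 1.0],
    [3.0, 1.5, 2.0],
    [8.0, 2.0, 1.0],
])

P = scores / scores.sum(axis=0)                    # column-wise proportions
k = 1.0 / np.log(scores.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)         # Shannon entropy per factor
weights = (1 - entropy) / (1 - entropy).sum()      # diversity-based weights
print(weights)                                     # higher = more informative factor
```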
A first-line nurse manager's goal-profile.
Johansson, Gunilla; Pörn, Ingmar; Theorell, Töres; Gustafsson, Barbro
2007-01-01
The aim of this case study was to acquire understanding concerning the first-line nurse manager's goal-profile, i.e. prioritization of goals in her work as a first-line nurse manager, through use of an action-theoretic and confirmatory theory. The first-line nurse manager's pivotal role regarding quality of care and development in relation to on-going changes in the health care sector is stressed by many researchers and the transition from nurse to manager is described as a demanding challenge for the first-line nurse manager. The case study described in this paper concerns a first-line nurse manager in an actual working environment in care of older people. Data collection comprised interviews, observations, a job description and policy documents. A hermeneutic interpretation was used for data analysis. The results showed that the first-line nurse manager had three goals in her goal-profile, in the following order of priority: (i) a nurse goal that she had strongly accepted and in which she had excellent control, (ii) an administrator goal that she had accepted and in which she had control, (iii) a leadership goal that she had not accepted and in which she did not have control. Both the administrator and leadership goals were based on her job description, but the nurse goal was a personally chosen goal based on her own self-relation/goal-fulfillment. The first-line nurse manager's prioritized self-identity, based on successful realization of goals in her goal-profile, was decisive in the manifestation of her work. This study contributes to a new understanding of the first-line nurse manager's self-identity related to work in terms of goal acceptance and goal control of prioritized goals. This action-theoretic approach could be a valuable 'key' for understanding leadership (or lack of leadership) in clinical practice.
A Multilayer Network Approach for Guiding Drug Repositioning in Neglected Diseases.
Berenstein, Ariel José; Magariños, María Paula; Chernomoretz, Ariel; Agüero, Fernán
2016-01-01
Drug development for neglected diseases has been historically hampered due to lack of market incentives. The advent of public domain resources containing chemical information from high throughput screenings is changing the landscape of drug discovery for these diseases. In this work we took advantage of data from extensively studied organisms like human, mouse, E. coli and yeast, among others, to develop a novel integrative network model to prioritize and identify candidate drug targets in neglected pathogen proteomes, and bioactive drug-like molecules. We modeled genomic (proteins) and chemical (bioactive compounds) data as a multilayer weighted network graph that takes advantage of bioactivity data across 221 species, chemical similarities between 1.7 × 10^5 compounds and several functional relations among 1.67 × 10^5 proteins. These relations comprised orthology, sharing of protein domains, and shared participation in defined biochemical pathways. We showcase the application of this network graph to the problem of prioritization of new candidate targets, based on the information available in the graph for known compound-target associations. We validated this strategy by performing a cross validation procedure for known mouse and Trypanosoma cruzi targets and showed that our approach outperforms classic alignment-based approaches. Moreover, our model provides additional flexibility as two different network definitions could be considered, finding in both cases qualitatively different but sensible candidate targets. We also showcase the application of the network to suggest targets for orphan compounds that are active against Plasmodium falciparum in high-throughput screens. In this case our approach provided a reduced prioritization list of target proteins for the query molecules and showed the ability to propose new testable hypotheses for each compound. Moreover, we found that some predictions highlighted by our network model were supported by independent experimental validations as found post-facto in the literature.
Sullivan, Trudy; Hansen, Paul
2017-04-01
The use of multicriteria decision analysis for health technology prioritization depends on decision-making criteria and weights according to their relative importance. We report on a methodology for determining criteria and weights that was developed and piloted in New Zealand and enables extensive participation by members of the general population. Stimulated by a preliminary ranking exercise that involved prioritizing 14 diverse technologies, six focus groups discussed what matters to people when thinking about technologies that should be funded. These discussions informed the specification of criteria related to technologies' benefits for use in a discrete choice survey designed to generate weights for each individual participant as well as mean weights. A random sample of 3218 adults was invited to participate. To check test-retest reliability, a subsample completed the survey twice. Cluster analysis was performed to identify participants with similar patterns of weights. Six benefits-related criteria were distilled from the focus group discussions and included in the discrete choice survey, which was completed by 322 adults (10% response rate). Most participants (85%) found the survey easy to understand, and the survey exhibited test-retest reliability. The cluster analysis revealed that participant weights are related more to idiosyncratic personal preferences than to demographic and background characteristics. The methodology enables extensive participation by members of the general population, for whom it is both acceptable and reliable. Generating weights for each participant allows the heterogeneity of individual preferences, and the extent to which they are related to demographic and background characteristics, to be tested. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Clostridium difficile infection among children across diverse US geographic locations.
Wendt, Joyanna M; Cohen, Jessica A; Mu, Yi; Dumyati, Ghinwa K; Dunn, John R; Holzbauer, Stacy M; Winston, Lisa G; Johnston, Helen L; Meek, James I; Farley, Monica M; Wilson, Lucy E; Phipps, Erin C; Beldavs, Zintars G; Gerding, Dale N; McDonald, L Clifford; Gould, Carolyn V; Lessa, Fernanda C
2014-04-01
Little is known about the epidemiology of Clostridium difficile infection (CDI) among children, particularly children ≤3 years of age in whom colonization is common but pathogenicity uncertain. We sought to describe pediatric CDI incidence, clinical presentation, and outcomes across age groups. Data from an active population- and laboratory-based CDI surveillance in 10 US geographic areas during 2010-2011 were used to identify cases (ie, residents with C difficile-positive stool without a positive test in the previous 8 weeks). Community-associated (CA) cases had stool collected as outpatients or ≤3 days after hospital admission and no overnight health care facility stay in the previous 12 weeks. A convenience sample of CA cases were interviewed. Demographic, exposure, and clinical data for cases aged 1 to 17 years were compared across 4 age groups: 1 year, 2 to 3 years, 4 to 9 years, and 10 to 17 years. Of 944 pediatric CDI cases identified, 71% were CA. CDI incidence per 100,000 children was highest among 1-year-old (66.3) and white (23.9) cases. The proportion of cases with documented diarrhea (72%) or severe disease (8%) was similar across age groups; no cases died. Among the 84 cases interviewed who reported diarrhea on the day of stool collection, 73% received antibiotics during the previous 12 weeks. Similar disease severity across age groups suggests an etiologic role for C difficile in the high rates of CDI observed in younger children. Prevention efforts to reduce unnecessary antimicrobial use among young children in outpatient settings should be prioritized.
Ng, Victoria; Sargeant, Jan M.
2012-01-01
Background Zoonoses account for over half of all communicable diseases causing illness in humans. As there are limited resources available for the control and prevention of zoonotic diseases, a framework for their prioritization is necessary to ensure resources are directed into those of highest importance. Although zoonotic outbreaks are a significant burden of disease in North America, the systematic prioritization of zoonoses in this region has not been previously evaluated. Methodology/Principal Findings This study describes the novel use of a well-established quantitative method, conjoint analysis (CA), to identify the relative importance of 21 key characteristics of zoonotic diseases that can be used for their prioritization in Canada and the US. Relative importance weights from the CA were used to develop a point-scoring system to derive a recommended list of zoonoses for prioritization in Canada and the US. Over 1,500 participants from the general public were recruited to complete the online survey (761 from Canada and 778 from the US). Hierarchical Bayes models were fitted to the survey data to derive CA-weighted scores. Scores were applied to 62 zoonotic diseases of public health importance in Canada and the US to rank diseases in order of priority. Conclusions/Significance This was the first study to describe a systematic and quantitative approach to the prioritization of zoonoses in North America involving public participants. We found individuals with no prior knowledge or experience in prioritizing zoonoses were capable of producing meaningful results using CA as a novel quantitative approach to prioritization. More similarities than differences were observed between countries suggesting general agreement in disease prioritization between Canadians and Americans. We demonstrate CA as a potential tool for the prioritization of zoonoses; other prioritization exercises may also consider this approach. PMID:23133639
Davis, John M.; Ekman, Drew R.; Teng, Quincy; Ankley, Gerald T.; Berninger, Jason P.; Cavallin, Jenna E.; Jensen, Kathleen M.; Kahl, Michael D.; Schroeder, Anthony L.; Villeneuve, Daniel L.; Jorgenson, Zachary G.; Lee, Kathy E.; Collette, Timothy W.
2016-01-01
The ability to focus on the most biologically relevant contaminants affecting aquatic ecosystems can be challenging because toxicity-assessment programs have not kept pace with the growing number of contaminants requiring testing. Because it has proven effective at assessing the biological impacts of potentially toxic contaminants, profiling of endogenous metabolites (metabolomics) may help screen out contaminants with a lower likelihood of eliciting biological impacts, thereby prioritizing the most biologically important contaminants. The authors present results from a study that utilized cage-deployed fathead minnows (Pimephales promelas) at 18 sites across the Great Lakes basin. They measured water temperature and contaminant concentrations in water samples (132 contaminants targeted, 86 detected) and used 1H-nuclear magnetic resonance spectroscopy to measure endogenous metabolites in polar extracts of livers. They used partial least-squares regression to compare relative abundances of endogenous metabolites with contaminant concentrations and temperature. The results indicated that profiles of endogenous polar metabolites covaried with at most 49 contaminants. The authors identified up to 52% of detected contaminants as not significantly covarying with changes in endogenous metabolites, suggesting they likely were not eliciting measurable impacts at these sites. This represents a first step in screening for the biological relevance of detected contaminants by shortening lists of contaminants potentially affecting these sites. Such information may allow risk assessors to prioritize contaminants and focus toxicity testing on the most biologically relevant contaminants. Environ Toxicol Chem 2016;35:2493–2502.
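A stripped-down version of the screening logic, fitting a PLS model that links site-level contaminant concentrations to metabolite profiles and flagging contaminants that carry little weight on the latent components, is sketched below with synthetic data; the study's spectral processing and significance testing are not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
n_sites, n_contaminants, n_metabolites = 18, 20, 40

# X = contaminant concentrations (plus temperature) at each site,
# Y = relative abundances of endogenous liver metabolites (e.g., from 1H-NMR).
X = rng.lognormal(size=(n_sites, n_contaminants))
Y = rng.normal(size=(n_sites, n_metabolites))
Y[:, :5] += 0.8 * X[:, [2]]     # contaminant 2 made to covary with some metabolites

pls = PLSRegression(n_components=2).fit(X, Y)

# Contaminants with large weights on the latent components covary with the
# metabolome; those with small weights are candidates to screen out.
importance = np.abs(pls.x_weights_).sum(axis=1)
print(np.argsort(-importance)[:5])
```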
Effects of preference heterogeneity among landowners on spatial conservation prioritization.
Nielsen, Anne Sofie Elberg; Strange, Niels; Bruun, Hans Henrik; Jacobsen, Jette Bredahl
2017-06-01
The participation of private landowners in conservation is crucial to efficient biodiversity conservation. This is especially the case in settings where the share of private ownership is large and the economic costs associated with land acquisition are high. We used probit regression analysis and historical participation data to examine the likelihood of participation of Danish forest owners in a voluntary conservation program. We used the results to spatially predict the likelihood of participation of all forest owners in Denmark. We merged spatial data on the presence of forest, cadastral information on participation contracts, and individual-level socioeconomic information about the forest owners and their households. We included predicted participation in a probability model for species survival. Uninformed and informed (i.e., including landowner characteristics) models were then incorporated into a spatial prioritization for conservation of unmanaged forests. The choice models are based on sociodemographic data on the entire population of Danish forest owners and historical data on their participation in conservation schemes. Including information on private landowners' willingness to supply land for conservation yielded up to 30% more expected species coverage at intermediate budget levels than the uninformed prioritization scheme. Our landowner-choice model provides an example of moving toward more implementable conservation planning. © 2016 Society for Conservation Biology.
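A minimal sketch of the participation-modelling step, using synthetic data and hypothetical covariates rather than the Danish registry data: fit a probit model of enrolment and predict participation probabilities that a spatial prioritization could then use to weight expected conservation benefits.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500
    X = sm.add_constant(np.column_stack([
        rng.normal(size=n),                    # e.g., standardized owner age (assumed covariate)
        rng.normal(size=n)]))                  # e.g., standardized forest area (assumed covariate)
    y = (X @ [0.2, 0.8, -0.5] + rng.normal(size=n) > 0).astype(int)   # simulated enrolment

    probit = sm.Probit(y, X).fit(disp=False)
    p_participate = probit.predict(X)          # predicted participation probabilities
    print(p_participate[:5])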
Li, Min; Li, Qi; Ganegoda, Gamage Upeksha; Wang, JianXin; Wu, FangXiang; Pan, Yi
2014-11-01
Identification of disease-causing genes among a large number of candidates is a fundamental challenge in human disease studies. However, it is still time-consuming and laborious to determine the real disease-causing genes by biological experiments. With advances in high-throughput techniques, large amounts of protein-protein interaction data have been produced. Therefore, to address this issue, several methods based on protein interaction networks have been proposed. In this paper, we propose a shortest path-based algorithm, named SPranker, to prioritize disease-causing genes in protein interaction networks. Considering the fact that diseases with similar phenotypes are generally caused by functionally related genes, we further propose an improved algorithm SPGOranker by integrating the semantic similarity of GO annotations. SPGOranker not only considers the topological similarity between protein pairs in a protein interaction network but also takes their functional similarity into account. The proposed algorithms SPranker and SPGOranker were applied to 1598 known orphan disease-causing genes from 172 orphan diseases and compared with three state-of-the-art approaches, ICN, VS and RWR. The experimental results show that SPranker and SPGOranker outperform ICN, VS, and RWR for the prioritization of orphan disease-causing genes. Importantly, for the case study of severe combined immunodeficiency, SPranker and SPGOranker predict several novel causal genes.
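To illustrate the general idea of shortest-path-based ranking (a simplification for exposition, not the published SPranker algorithm), the toy sketch below scores candidate genes by their average shortest-path distance to known disease genes in a small made-up interaction network.

    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("B", "E"), ("E", "F")])

    known_disease_genes = {"A"}                # seed genes for the disease
    candidates = ["C", "D", "F"]

    def avg_distance_to_seeds(gene):
        # smaller average shortest-path distance to the seeds -> higher priority
        dists = [nx.shortest_path_length(G, gene, s) for s in known_disease_genes]
        return sum(dists) / len(dists)

    print(sorted(candidates, key=avg_distance_to_seeds))   # e.g. ['C', 'D', 'F']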
Tipping the Balance: Hepatotoxicity and the Four Apical Key Events of Hepatic Steatosis
Adverse outcome pathways (AOPs) are descriptive biological sequences that start from a molecular initiating event (MIE) and end with an adverse health outcome. AOPs provide biological context for high throughput chemical testing and further prioritize environmental health risk r...
The Toxicity Data Landscape for Environmental Chemicals (journal)
Thousands of chemicals are in common use but only a portion of them have undergone significant toxicological evaluation, leading to the need to prioritize the remainder for targeted testing. To address this issue, the U.S. Environmental Protection Agency (U.S. EPA) and other orga...
QSAR EVALUATION OF ER BINDING AFFINITY OF CHEMICALS AND METABOLITES
Chemicals in commerce are assessed for a variety of potential adverse effects. As governments around the globe strive to meet the challenge of assessing chemicals as endocrine disruptors, the need for hypothesis-driven strategies to prioritize chemicals for testing has risen to t...
The U.S. Environmental Protection Agency is evaluating methods to screen and prioritize large numbers of chemicals for developmental toxicity. As such, we are exploring a behavioral testing paradigm, which can assess the effect of sublethal and subteratogenic concentrations of de...
The U.S. Environmental Protection Agency is evaluating methods to screen and prioritize large numbers of chemicals for developmental toxicity. We are exploring methods to detect developmentally neurotoxic chemicals using zebrafish behavior at 6 days of age. The behavioral paradig...
Understanding the Biology and Technology of ToxCast and Tox21 Assays
The ToxCast high-throughput toxicity (HTT) testing methods have been developed to evaluate the hazard potential of diverse environmental, industrial and consumer product chemicals. The main goal is prioritizing the compounds of greatest concern for more detailed toxicological stu...
Gabb, Henry A; Blake, Catherine
2016-08-01
Simultaneous or sequential exposure to multiple environmental stressors can affect chemical toxicity. Cumulative risk assessments consider multiple stressors but it is impractical to test every chemical combination to which people are exposed. New methods are needed to prioritize chemical combinations based on their prevalence and possible health impacts. We introduce an informatics approach that uses publicly available data to identify chemicals that co-occur in consumer products, which account for a significant proportion of overall chemical load. Fifty-five asthma-associated and endocrine disrupting chemicals (target chemicals) were selected. A database of 38,975 distinct consumer products and 32,231 distinct ingredient names was created from online sources, and PubChem and the Unified Medical Language System were used to resolve synonymous ingredient names. Synonymous ingredient names are different names for the same chemical (e.g., vitamin E and tocopherol). Nearly one-third of the products (11,688 products, 30%) contained ≥ 1 target chemical and 5,229 products (13%) contained > 1. Of the 55 target chemicals, 31 (56%) appear in ≥ 1 product and 19 (35%) appear under more than one name. The most frequent three-way chemical combination (2-phenoxyethanol, methyl paraben, and ethyl paraben) appears in 1,059 products. Further work is needed to assess combined chemical exposures related to the use of multiple products. The informatics approach increased the number of products considered in a traditional analysis by two orders of magnitude, but missing/incomplete product labels can limit the effectiveness of this approach. Such an approach must resolve synonymy to ensure that chemicals of interest are not missed. Commonly occurring chemical combinations can be used to prioritize cumulative toxicology risk assessments. Gabb HA, Blake C. 2016. An informatics approach to evaluating combined chemical exposures from consumer products: a case study of asthma-associated chemicals and potential endocrine disruptors. Environ Health Perspect 124:1155-1165; http://dx.doi.org/10.1289/ehp.1510529.
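The core bookkeeping of such an analysis can be shown with a small sketch on made-up product data: synonymous ingredient names are first mapped to a canonical chemical, then co-occurring target chemicals are counted per product. The dictionaries below are illustrative stand-ins, not the study's curated resources.

    from itertools import combinations
    from collections import Counter

    synonyms = {"vitamin e": "tocopherol", "tocopherol": "tocopherol",
                "methylparaben": "methyl paraben", "methyl paraben": "methyl paraben",
                "2-phenoxyethanol": "2-phenoxyethanol"}
    targets = {"tocopherol", "methyl paraben", "2-phenoxyethanol"}

    products = {"lotion":  ["Vitamin E", "Methylparaben", "2-Phenoxyethanol"],
                "shampoo": ["Tocopherol", "water"]}

    pair_counts = Counter()
    for ingredients in products.values():
        canonical = {synonyms.get(i.lower(), i.lower()) for i in ingredients}
        hits = sorted(canonical & targets)              # target chemicals in this product
        pair_counts.update(combinations(hits, 2))       # count co-occurring target pairs

    print(pair_counts.most_common(3))                   # most frequent pairs across products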
Teaching and Assessing Clinical Reasoning Skills.
Modi, Jyoti Nath; Anshu; Gupta, Piyush; Singh, Tejinder
2015-09-01
Clinical reasoning is a core competency expected to be acquired by all clinicians. It is the ability to integrate and apply different types of knowledge, weigh evidence critically and reflect upon the process used to arrive at a diagnosis. Problems with clinical reasoning often occur because of inadequate knowledge, flaws in data gathering and improper approach to information processing. Some of the educational strategies which can be used to encourage acquisition of clinical reasoning skills are: exposure to a wide variety of clinical cases, activation of previous knowledge, development of illness scripts, sharing expert strategies to arrive at a diagnosis, forcing students to prioritize differential diagnoses, and encouraging reflection, metacognition, deliberate practice and availability of formative feedback. Assessment of clinical reasoning abilities should be done throughout the training course in diverse settings. Use of scenario-based multiple-choice questions, key feature tests and script concordance tests are some ways of theoretically assessing clinical reasoning ability. In the clinical setting, these skills can be tested in most forms of workplace-based assessment. We recommend that clinical reasoning must be taught at all levels of medical training as it improves clinician performance and reduces cognitive errors.
Recent Transmission of Tuberculosis — United States, 2011–2014
Yuen, Courtney M.; Kammerer, J. Steve; Marks, Kala; Navin, Thomas R.; France, Anne Marie
2016-01-01
Tuberculosis is an infectious disease that may result from recent transmission or from an infection acquired many years in the past; there is no diagnostic test to distinguish the two causes. Cases resulting from recent transmission are particularly concerning from a public health standpoint. To describe recent tuberculosis transmission in the United States, we used a field-validated plausible source-case method to estimate cases likely resulting from recent transmission during January 2011–September 2014. We classified cases as resulting from either limited or extensive recent transmission based on transmission cluster size. We used logistic regression to analyze patient characteristics associated with recent transmission. Of 26,586 genotyped cases, 14% were attributable to recent transmission, 39% of which were attributable to extensive recent transmission. The burden of cases attributed to recent transmission was geographically heterogeneous and poorly predicted by tuberculosis incidence. Extensive recent transmission was positively associated with American Indian/Alaska Native (adjusted prevalence ratio [aPR] = 3.6, 95% confidence interval [CI] 2.9–4.4), Native Hawaiian/Pacific Islander (aPR = 3.2, 95% CI 2.3–4.5), and black (aPR = 3.0, 95% CI 2.6–3.5) race, and homelessness (aPR = 2.3, 95% CI 2.0–2.5). Extensive recent transmission was negatively associated with foreign birth (aPR = 0.2, 95% CI 0.2–0.2). Tuberculosis control efforts should prioritize reducing transmission among higher-risk populations. PMID:27082644
Integrated in silico strategy for PBT assessment and prioritization under REACH.
Pizzo, Fabiola; Lombardo, Anna; Manganaro, Alberto; Cappelli, Claudia I; Petoumenou, Maria I; Albanese, Federica; Roncaglioni, Alessandra; Brandt, Marc; Benfenati, Emilio
2016-11-01
Chemicals may persist in the environment, bioaccumulate and be toxic for humans and wildlife, posing great concern. These three properties, persistence (P), bioaccumulation (B), and toxicity (T) are the key targets of the PBT-hazard assessment. The European regulation for the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) requires assessment of PBT-properties for all chemicals that are produced or imported in Europe in amounts exceeding 10 tonnes per year, checking whether the criteria set out in REACH Annex XIII are met, in which case the substance is considered to be of very high concern. Considering how many substances can fall under the REACH regulation, there is a pressing need for new strategies to identify and screen large numbers of substances quickly and inexpensively. An efficient non-testing screening approach to identify PBT candidates is needed as an alternative to costly and time-consuming laboratory tests and as a starting point for prioritization, since few such tools exist (e.g., the PBT Profiler developed by the US EPA). The aim of this work was to offer a conceptual scheme for identifying and prioritizing chemicals for further assessment and, if appropriate, further testing, based on their PBT potential, using a non-testing screening approach. We integrated in silico models (using existing and developing new ones) in a final algorithm for screening and ranking PBT-potential, which uses experimental and predicted values as well as associated uncertainties. The Multi-Criteria Decision-Making (MCDM) theory was used to integrate the different values. Then we compiled a new set of data containing known PBT and non-PBT substances, in order to check how well our approach differentiated compounds labeled as PBT from those labeled as non-PBT. This indicated that the integrated model distinguished PBT from non-PBT compounds. Copyright © 2016 Elsevier Inc. All rights reserved.
Accelerating to Zero: Strategies to Eliminate Malaria in the Peruvian Amazon
Quispe, Antonio M.; Llanos-Cuentas, Alejandro; Rodriguez, Hugo; Clendenes, Martin; Cabezas, Cesar; Leon, Luis M.; Chuquiyauri, Raul; Moreno, Marta; Kaslow, David C.; Grogl, Max; Herrera, Sócrates; Magill, Alan J.; Kosek, Margaret; Vinetz, Joseph M.; Lescano, Andres G.; Gotuzzo, Eduardo
2016-01-01
In February 2014, the Malaria Elimination Working Group, in partnership with the Peruvian Ministry of Health (MoH), hosted its first international conference on malaria elimination in Iquitos, Peru. The 2-day meeting gathered 85 malaria experts, including 18 international panelists, 23 stakeholders from different malaria-endemic regions of Peru, and 11 MoH authorities. The main outcome was consensus that implementing a malaria elimination project in the Amazon region is achievable, but would require: 1) a comprehensive strategic plan, 2) the altering of current programmatic guidelines from control toward elimination by including symptomatic as well as asymptomatic individuals for antimalarial therapy and transmission-blocking interventions, and 3) the prioritization of community-based active case detection with proper rapid diagnostic tests to interrupt transmission. Elimination efforts must involve key stakeholders and experts at every level of government and include integrated research activities to evaluate, implement, and tailor sustainable interventions appropriate to the region.
Automatic health record review to help prioritize gravely ill Social Security disability applicants.
Abbott, Kenneth; Ho, Yen-Yi; Erickson, Jennifer
2017-07-01
Every year, thousands of patients die waiting for disability benefits from the Social Security Administration. Some qualify for expedited service under the Compassionate Allowance (CAL) initiative, but CAL software focuses exclusively on information from a single form field. This paper describes the development of a supplemental process for identifying some overlooked but gravely ill applicants, through automatic annotation of health records accompanying new claims. We explore improved prioritization instead of fully autonomous claims approval. We assembled a sample of claims containing medical records at the moment of their arrival in a single office. A series of tools annotated both patient records and public Web page descriptions of CAL medical conditions. We trained random forests to identify CAL patients and validated each model with 10-fold cross validation. Our main model, a general CAL classifier, had an area under the receiver operating characteristic curve of 0.915. Combining this classifier with existing software improved sensitivity from 0.960 to 0.994, detecting every deceased patient, but reduced positive predictive value to 0.216. Given CAL patient mortality, identifying true positives is the priority; because flagged claims are merely prioritized for review rather than approved, the false positives would not create a meaningful manual-review burden. Death certificate data suggest the presence of truly ill patients among putative false positives. To a limited extent, it is possible to identify gravely ill Social Security disability applicants by analyzing annotations of unstructured electronic health records, and the level of identification is sufficient to be useful in prioritizing case reviews. Published by Oxford University Press on behalf of the American Medical Informatics Association 2017. This work is written by US Government employees and is in the public domain in the US.
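For readers unfamiliar with the evaluation setup, the sketch below trains a random forest and scores it with 10-fold cross-validated ROC AUC on synthetic, imbalanced data; it is a generic stand-in for, not a reproduction of, the paper's CAL claim classifier.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Imbalanced synthetic data, loosely analogous to rare CAL-eligible claims.
    X, y = make_classification(n_samples=1000, n_features=50,
                               weights=[0.9, 0.1], random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    auc_scores = cross_val_score(clf, X, y, cv=10, scoring="roc_auc")
    print(auc_scores.mean())                # mean area under the ROC curve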
Patients' views on priority setting in neurosurgery: A qualitative study.
Gunaratnam, Caroline; Bernstein, Mark
2016-01-01
Accountability for Reasonableness is an ethical framework which has been implemented in various health care systems to improve and evaluate the fairness of priority setting. This framework is grounded on four mandatory conditions: relevance, publicity, appeals, and enforcement. There have been few studies which have evaluated the patient stakeholders' acceptance of this framework; certainly no studies have been done on patients' views on the prioritization system for allocating patients for operating time in a system with pressure on inpatient bed resources. The aim of this study is to examine neurosurgical patients' views on the prioritization of patients for operating theater (OT) time on a daily basis at a tertiary and quaternary referral neurosurgery center. Semi-structured face-to-face interviews were conducted with thirty-seven patients, recruited from the neurosurgery clinic at Toronto Western Hospital. Family members and friends who accompanied the patient to their clinic visit were encouraged to contribute to the discussion. Interviews were audio recorded, transcribed verbatim, and subjected to thematic analysis using open and axial coding. Overall, patients are supportive of the concept of a priority-setting system based on fairness, but felt that a few changes would help to improve the fairness of the current system. These changes include lowering the priority given to volume-funded cases and giving higher priority to previously canceled scheduled surgeries. Good communication, early notification, and rescheduling canceled surgeries as soon as possible were important factors that directly reflected the patients' confidence level in their doctor, the hospital, and the health care system. This study is the first clinical qualitative study of patients' perspective on a prioritization system used for allocating neurosurgical patients for OT time on a daily basis in a socialized not-for-profit health care system with fixed resources.
Balabanova, Yanina; Gilsdorf, Andreas; Buda, Silke; Burger, Reinhard; Eckmanns, Tim; Gärtner, Barbara; Groß, Uwe; Haas, Walter; Hamouda, Osamah; Hübner, Johannes; Jänisch, Thomas; Kist, Manfred; Kramer, Michael H.; Ledig, Thomas; Mielke, Martin; Pulz, Matthias; Stark, Klaus; Suttorp, Norbert; Ulbrich, Uta; Wichmann, Ole; Krause, Gérard
2011-01-01
Introduction To establish strategic priorities for the German national public health institute (RKI) and guide the institute's mid-term strategic decisions, we prioritized infectious pathogens in accordance with their importance for national surveillance and epidemiological research. Methods We used the Delphi process with internal (RKI) and external experts and a metric-consensus approach to score pathogens according to ten three-tiered criteria. Additional experts were invited to weight each criterion, leading to the calculation of a median weight by which each score was multiplied. We ranked the pathogens according to the total weighted score and divided them into four priority groups. Results 127 pathogens were scored. Eighty-six experts participated in the weighting; “Case fatality rate” was rated as the most important criterion. Twenty-six pathogens were ranked in the highest priority group; among those were pathogens with internationally recognised importance (e.g., Human Immunodeficiency Virus, Mycobacterium tuberculosis, Influenza virus, Hepatitis C virus, Neisseria meningitidis), pathogens frequently causing large outbreaks (e.g., Campylobacter spp.), and nosocomial pathogens associated with antimicrobial resistance. Other pathogens in the highest priority group included Helicobacter pylori, Respiratory Syncytial Virus, Varicella zoster virus and Hantavirus. Discussion While several pathogens from the highest priority group already have a high profile in national and international health policy documents, high scores for other pathogens (e.g., Helicobacter pylori, Respiratory syncytial virus or Hantavirus) indicate a possible under-recognised importance within the current German public health framework. A process to strengthen respective surveillance systems and research has been started. The prioritization methodology has worked well; its modular structure makes it potentially useful for other settings. PMID:21991334
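The scoring arithmetic described above reduces to a weighted sum per pathogen: each pathogen's three-tiered criterion scores are multiplied by the median expert weights and summed, and pathogens are ranked by the total. A minimal sketch with invented weights and scores (not the RKI values):

    import numpy as np

    criterion_weights = np.array([3.0, 2.0, 1.5])        # e.g., case fatality weighted highest
    pathogen_scores = {                                   # three-tiered scores (0, 1, 2)
        "Pathogen X": np.array([2, 1, 0]),
        "Pathogen Y": np.array([1, 2, 2]),
        "Pathogen Z": np.array([0, 0, 1]),
    }
    totals = {p: float(s @ criterion_weights) for p, s in pathogen_scores.items()}
    for pathogen, total in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(pathogen, total)                            # highest-priority pathogens first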
ERIC Educational Resources Information Center
Fleming, Allison R.; Boeltzig-Brown, Heike; Foley, Susan M.
2015-01-01
Purpose: We describe a modified Delphi method used to select effective state vocational rehabilitation agency practices to prioritize rehabilitation services for individuals with most significant disabilities within the context of Order of Selection, an area where there is little known and published. Specifically, we describe how we applied the…
MELODI: Mining Enriched Literature Objects to Derive Intermediates.
Elsworth, Benjamin; Dawe, Karen; Vincent, Emma E; Langdon, Ryan; Lynch, Brigid M; Martin, Richard M; Relton, Caroline; Higgins, Julian P T; Gaunt, Tom R
2018-01-12
The scientific literature contains a wealth of information from different fields on potential disease mechanisms. However, identifying and prioritizing mechanisms for further analytical evaluation presents enormous challenges in terms of the quantity and diversity of published research. The application of data mining approaches to the literature offers the potential to identify and prioritize mechanisms for more focused and detailed analysis. Here we present MELODI, a literature mining platform that can identify mechanistic pathways between any two biomedical concepts. Two case studies demonstrate the potential uses of MELODI and how it can generate hypotheses for further investigation. First, an analysis of the ETS-related gene ERG and prostate cancer derives the intermediate transcription factor SP1, recently confirmed to be physically interacting with ERG. Second, examining the relationship between a new potential risk factor and pancreatic cancer identifies possible mechanistic insights which can be studied in vitro. We have demonstrated the possible applications of MELODI, including two case studies. MELODI has been implemented as a Python/Django web application, and is freely available to use at www.melodi.biocompute.org.uk. © The Author(s) 2018. Published by Oxford University Press on behalf of the International Epidemiological Association
Economic and environmental optimization of waste treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Münster, M.; Ravn, H.; Hedegaard, K.
2015-04-15
Highlights: • Optimizing waste treatment by incorporating LCA methodology. • Applying different objectives (minimizing costs or GHG emissions). • Prioritizing multiple objectives given different weights. • Optimum depends on objective and assumed displaced electricity production. - Abstract: This article presents the new systems engineering optimization model, OptiWaste, which incorporates a life cycle assessment (LCA) methodology and captures important characteristics of waste management systems. As part of the optimization, the model identifies the most attractive waste management options. The model renders it possible to apply different optimization objectives such as minimizing costs or greenhouse gas emissions or to prioritize several objectives given different weights. A simple illustrative case is analysed, covering alternative treatments of one tonne of residual household waste: incineration of the full amount or sorting out organic waste for biogas production for either combined heat and power generation or as fuel in vehicles. The case study illustrates that the optimal solution depends on the objective and assumptions regarding the background system – illustrated with different assumptions regarding displaced electricity production. The article shows that it is feasible to combine LCA methodology with optimization. Furthermore, it highlights the need for including the integrated waste and energy system into the model.
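A toy sketch of the weighted multi-objective idea, with invented cost and emission figures rather than OptiWaste's data: a weighted sum over the two objectives selects the preferred treatment, and shifting the weights (or the assumed displaced electricity) shifts the optimum.

    # Hypothetical options: (cost in EUR/tonne, GHG in kg CO2-eq/tonne).
    options = {"incineration":          (80.0, 40.0),
               "biogas_chp":            (95.0, 15.0),
               "biogas_vehicle_fuel":  (110.0,  5.0)}
    w_cost, w_ghg = 0.5, 0.5                 # objective weights chosen by the decision maker

    def weighted_score(name):
        cost, ghg = options[name]
        return w_cost * cost + w_ghg * ghg   # lower is better

    print(min(options, key=weighted_score))  # 'biogas_chp' under these toy numbers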
2009-11-24
production on Air Bases Field the Critical Asset Prioritization Methodology (CAPM) tool Manage costs Provide energy leadership throughout the Air... residing on military installations • Field the Critical Asset Prioritization Methodology (CAPM) tool. This CAPM tool will allow prioritization of Air... fielding of the Critical Asset Prioritization Methodology (CAPM) tool and the adoption of financial standards to enable transparency across Air
Aggressive effects of prioritizing popularity in early adolescence.
Cillessen, Antonius H N; Mayeux, Lara; Ha, Thao; de Bruyn, Eddy H; LaFontana, Kathryn M
2014-01-01
This study examined the moderating effects of prioritizing popularity on the association between early adolescents' popularity and their aggressive, leadership, and prosocial behaviors with peers. Participants were 288 14-year-olds from The Netherlands who completed a sociometric instrument and an assessment of how much they prioritized popularity over other personal goals. Results indicated that prioritizing popularity was distinct from actual popularity in the peer group. Further, prioritizing popularity moderated the association of popularity with aggressive and leadership behaviors, with adolescents who were both popular and who prioritized popularity being particularly aggressive and scoring high on leadership behaviors. This trend was especially true for boys. The same moderating effect was not found for prosocial behaviors. Motivational and social-cognitive factors in the dynamics of peer popularity are highlighted. © 2013 Wiley Periodicals, Inc.
Self-Prioritization Beyond Perception.
Schäfer, Sarah; Wentura, Dirk; Frings, Christian
2015-01-01
Recently, Sui, He, and Humphreys (2012) introduced a new paradigm to measure perceptual self-prioritization processes. It seems that arbitrarily tagging shapes to self-relevant words (I, my, me, and so on) leads to speeded verification times when matching self-relevant word-shape pairings (e.g., me - triangle) as compared to non-self-relevant word-shape pairings (e.g., stranger - circle). To determine the level at which self-prioritization takes place, we analyzed whether the self-prioritization effect is due to a tagging of the self-relevant label and the particular associated shape or due to a tagging of the self with an abstract concept. In two experiments, participants showed standard self-prioritization effects with varying stimulus features or different exemplars of a particular stimulus category, suggesting that self-prioritization also works at a conceptual level.
Childhood lead poisoning prevention activities within Michigan local public health departments.
Kemper, Alex R; Uren, Rebecca L; Hudson, Sharon R
2007-01-01
Local public health departments have a wide array of responsibilities, including coordinating childhood lead poisoning prevention activities. This study was conducted in an effort to understand how local public health officers prioritized lead poisoning prevention activities and the barriers to delivering childhood lead poisoning prevention services through local health departments. A telephone survey was conducted of health officers in Michigan, a state with a high burden of environmental lead. Analysis included Spearman rank correlation and Fisher's exact test. No association was found between the local risk of lead poisoning and the priority placed by local health departments on lead poisoning prevention activities. Similarly, there was no association between the local risk of lead poisoning and the availability of services. Only 60% of local health departments offered blood lead testing, environmental investigation, and case management. Most (74%) believed that lead poisoning is inadequately addressed within the area served by their local health department. New strategies of providing lead poisoning prevention activities are needed to achieve the federal and state goals of eliminating childhood lead poisoning over the next decade.
Stakeholder approach for evaluating organizational change projects.
Peltokorpi, Antti; Alho, Antti; Kujala, Jaakko; Aitamurto, Johanna; Parvinen, Petri
2008-01-01
This paper aims to create a model for evaluating organizational change initiatives from a stakeholder resistance viewpoint. The paper presents a model to evaluate change projects and their expected benefits. Factors affecting the challenge to implement change were defined based on stakeholder theory literature. The authors test the model's practical validity for screening change initiatives to improve operating room productivity. Change initiatives can be evaluated using six factors: the effect of the planned intervention on stakeholders' actions and position; stakeholders' capability to influence the project's implementation; motivation to participate; capability to change; change complexity; and management capability. The model's generalizability should be explored by testing the presented factors against a larger number of historical cases from different healthcare contexts. The link between stakeholders, the change challenge and the outcomes of change projects needs to be empirically tested. The proposed model can be used to prioritize change projects, manage stakeholder resistance and establish a better organizational and professional competence for managing healthcare organization change projects. New insights into existing stakeholder-related understanding of change project successes are provided.
ERIC Educational Resources Information Center
Lewis, Steven
1996-01-01
Disaster recovery planning need not be expensive or exhaustive to be effective. Systematic planning involves several crucial steps, including outlining the final plan, understanding the nature of a disaster's effects and the stages of disaster recovery, prioritizing appropriately, and learning how to test the plan in a practical way for the…
To address the EPA's need to prioritize hundreds to thousands of chemicals for testing, we are developing a rapid, cost-effective in vivo screen for developmental neurotoxicity using zebrafish (Danio rerio), a small freshwater fish with external fertilization. Zebrafish embryos d...
In order to determine the potential toxicological effects, toxicokinetics, and route(s) of exposure for chemicals, their structures and corresponding physicochemical properties are required. With this data, the risk for thousands of environmental chemicals can be prioritized. How...
The ToxCast Chemical Prioritization Program at the US EPA (UCLA Molecular Toxicology Program)
To meet the needs of chemical regulators reviewing large numbers of data-poor chemicals for safety, the EPA's National Center for Computational Toxicology is developing a means of efficiently testing thousands of compounds for potential toxicity. High-throughput bioactivity profi...
The U.S. Environmental Protection Agency is evaluating methods to screen and prioritize large numbers of chemicals for developmental toxicity. We are exploring methods to screen for developmentally neurotoxic chemicals using zebrafish behavior at 6 days of age. The behavioral par...
The Analysis of Genomic Dose-Response Data in the EPA ToxCast™ Program
The U.S. EPA must assess the potential adverse effects of thousands of chemicals, often with limited toxicity information. Accurate toxicity predictions will help prioritize chemicals for further testing, focusing resources on the greater potential hazards or risks. In vitro geno...
Prioritization of Louisiana Parishes based on Industrial Releases of Known or Suspected Carcinogens.
Katner, Adrienne
2015-01-01
This investigation evaluated the geographic distribution of carcinogen releases by Louisiana industries to prioritize areas for regulatory oversight, research and monitoring, and to promote clinician awareness and vigilance. Data on estimated industry releases for the period between 1996 and 2011 were obtained from the US Environmental Protection Agency's Toxics Release Inventory. Chemicals associated with cancers of the prostate, lung, bladder, kidney, breast and non-Hodgkin lymphoma were identified. The Risk Screening Environmental Indicators model was used to derive measures or model scores based on chemical toxicity, fate and transport, and population characteristics. Parishes, chemicals, industries and media generating the highest model scores were identified. Parishes with the highest model scores were East Baton Rouge, Calcasieu, Caddo and St. John the Baptist. Clinicians should carefully monitor cancer cases in these areas, and if patients reside near or work in industry, an occupational and environmental history should be considered.
Hughes, David; Williams-Jones, Bryn
2013-01-01
In the context of scarce public resources, patient interest groups have increasingly turned to private organizations for financing, including the pharmaceutical industry. This practice puts advocacy groups in a situation of potential conflicts between the interests of patients and those of the drug companies. The interests of patients and industry can converge on issues related to the approval and reimbursement of medications. But even on this issue, interests do not always align perfectly. Using the Quebec example of Coalition Priorité Cancer (CPC) as a case study, we examine the ethical issues raised by such financial relationships in the context of drug reimbursement decision-making. We collected, compiled and analyzed publicly available information on the CPC's organization and activities; this approach allowed us to raise and discuss important questions regarding the possible influence exerted on patient groups by donors. We conclude with some recommendations. PMID:23968674
Stakeholder prioritization of zoonoses in Japan with analytic hierarchy process method.
Kadohira, M; Hill, G; Yoshizaki, R; Ota, S; Yoshikawa, Y
2015-05-01
There is an urgent need to develop iterative risk assessment strategies for zoonotic diseases. The aim of this study is to develop a method of prioritizing 98 zoonoses derived from animal pathogens in Japan and to involve four major groups of stakeholders: researchers, physicians, public health officials, and citizens. We used a combination of risk profiling and analytic hierarchy process (AHP). Risk profiling was accomplished with semi-quantitative analysis of existing public health data. AHP data collection was performed by administering questionnaires to the four stakeholder groups. Results showed that researchers and public health officials focused on case fatality as the most important factor, while physicians and citizens placed more weight on diagnosis and prevention, respectively. Most of the six top-ranked diseases were similar among all stakeholders. Transmissible spongiform encephalopathy, severe acute respiratory syndrome, and Ebola fever were ranked first, second, and third, respectively.
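In AHP, criterion weights are commonly derived from a reciprocal pairwise comparison matrix. The sketch below uses an illustrative three-criterion matrix (not the study's questionnaire data) and takes the normalized principal eigenvector as the weight vector.

    import numpy as np

    # Saaty-style reciprocal judgments for, say, case fatality, diagnosis, prevention.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    principal = np.argmax(eigvals.real)           # index of the principal eigenvalue
    weights = np.abs(eigvecs[:, principal].real)
    weights /= weights.sum()
    print(weights)                                # criterion weights summing to 1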
Stockpiling Ventilators for Influenza Pandemics
Araz, Ozgur M.; Morton, David P.; Johnson, Gregory P.; Damien, Paul; Clements, Bruce; Meyers, Lauren Ancel
2017-01-01
In preparing for influenza pandemics, public health agencies stockpile critical medical resources. Determining appropriate quantities and locations for such resources can be challenging, given the considerable uncertainty in the timing and severity of future pandemics. We introduce a method for optimizing stockpiles of mechanical ventilators, which are critical for treating hospitalized influenza patients in respiratory failure. As a case study, we consider the US state of Texas during mild, moderate, and severe pandemics. Optimal allocations prioritize local over central storage, even though the latter can be deployed adaptively, on the basis of real-time needs. This prioritization stems from high geographic correlations and the slightly lower treatment success assumed for centrally stockpiled ventilators. We developed our model and analysis in collaboration with academic researchers and a state public health agency and incorporated it into a Web-based decision-support tool for pandemic preparedness and response. PMID:28518041
Maher, Dermot
2010-01-01
The global financial crisis poses a threat to global health, and may exacerbate diseases of poverty, e.g. HIV, malaria and tuberculosis. Exploring the implications of the global financial crisis for the health sector response to tuberculosis is useful to illustrate the practical problems and propose possible solutions. The response to tuberculosis is considered in the context of health sector development. Problems and solutions are considered in five key areas: financing, prioritization, government regulation, integration and decentralization. Securing health gains in global tuberculosis control depends on protecting expenditure by governments of countries badly affected by tuberculosis and by donors, taking measures to increase efficiencies, prioritizing health expenditures and strengthening government regulation. Lessons learned will be valuable for stakeholders involved in the health sector response to tuberculosis and other diseases of poverty.
Navigating complex patients using an innovative tool: the MTM Spider Web.
Morello, Candis M; Hirsch, Jan D; Lee, Kelly C
2013-01-01
To introduce a teaching tool that can be used to assess the complexity of medication therapy management (MTM) patients, prioritize appropriate interventions, and design patient-centered care plans for each encounter. MTM patients are complex as a result of multiple comorbidities, medications, and socioeconomic and behavioral issues. Pharmacists who provide MTM services are required to synthesize a plethora of information (medical and nonmedical), evaluate and prioritize the clinical problems, and design a comprehensive patient-centered care plan. The MTM Spider Web is a visual tool to facilitate this process. A description is provided regarding how to build the MTM Spider Web using case-based scenarios. This model can be used to teach pharmacists, health professional students, and patients. The MTM Spider Web is an innovative teaching tool that can be used to teach pharmacists and students how to assess complex patients and design a patient-centered care plan to deliver the most appropriate medication therapy.
Awareness and perceived fairness of option B+ in Malawi: a population-level perspective
Yeatman, Sara; Trinitapoli, Jenny
2017-01-01
Introduction: Policies for rationing antiretroviral therapy (ART) have been subject to on-going ethical debates. Introduced in Malawi in 2011, Option B+ prioritized HIV-positive pregnant women for lifelong ART regardless of the underlying state of their immune system, shifting the logic of allocation away from medical eligibility. Despite the rapid expansion of this policy, we know little about how it has been understood and interpreted by the people it affects. Methods: We assessed awareness and perceived fairness of the prioritization system for ART among a population-based sample of young women (n = 1440) and their partners (n = 574) in southern Malawi. We use a card-sort technique to elicit understandings of who gets ART under Option B+ and who should be prioritized, and we compare perceptions to actual ART policy using sequence analysis and optimal matching. We then use ordered logistic regression to identify the factors associated with policy awareness. Results: In 2015, only 30.7% of women and 21.1% of male partners understood how ART was being distributed. There was widespread confusion around whether otherwise healthy HIV-positive pregnant women could access ART under Option B+. Nonetheless, more young adults thought that the fairest policy should prioritize such women than believed the actual policy did. Women who were older, more educated or had recently engaged with the health system through antenatal care or ART had more accurate understandings of Option B+. Among men, policy awareness was lower, and was patterned only by education. Conclusions: Although most respondents were unaware that Option B+ afforded ART access to healthy pregnant women, Malawians support the prioritization of pregnant women. Countries adopting Option B+ or other new ART policies such as universal test-and-treat should communicate the policies and their rationales to the public – such transparency would be more consistent with a fair and ethical process and could additionally serve to clarify confusion and enhance retention. PMID:28362070
Ng, Victoria; Sargeant, Jan M.
2012-01-01
Background Zoonotic diseases account for over 60% of all communicable diseases causing illness in humans and 75% of recently emerging infectious diseases. As limited resources are available for the control and prevention of zoonotic diseases, it is necessary to prioritize diseases in order to direct resources into those with the greatest needs. The selection of criteria for prioritization has traditionally been on the basis of expert opinion; however, details of the methods used to identify criteria from expert opinion often are not published and a full range of criteria may not be captured by expert opinion. Methodology/Principal Findings This study used six focus groups to identify criteria for the prioritization of zoonotic diseases in Canada. Focus groups included people from the public, animal health professionals and human health professionals. A total of 59 criteria were identified for prioritizing zoonotic diseases. Human-related criteria accounted for the highest proportion of criteria identified (55%), followed by animal-related criteria (26%) then pathogen/disease-related criteria (19%). Similarities and differences were observed in the identification and scoring of criteria for disease prioritization between groups; the public groups were strongly influenced by individual-level disease burden, the responsibility of the scientific community in disease prioritization and the experiences of recent events, while the professional groups were influenced by societal- and population-level disease burden and political and public pressure. Conclusions/Significance This was the first study to describe a mixed semi-quantitative and qualitative approach to deriving criteria for disease prioritization. This was also the first study to involve the opinion of the general public regarding disease prioritization. The number of criteria identified highlights the difficulty in prioritizing zoonotic diseases. The method presented in this paper has formulated a comprehensive list of criteria that can be used to inform future disease prioritization studies. PMID:22238648
ACToR - Aggregated Computational Toxicology Resource ...
There are too many uncharacterized environmental chemicals to test with current in vivo protocols, so the goal is to develop predictive in vitro screening assays that can be used to prioritize chemicals for detailed testing. The ToxCast program requires large amounts of data, both in vitro assays (mainly generated by the ToxCast program) and in vivo data, to develop and validate predictive signatures; ACToR is compiling both sets of data for use in predictive algorithms.
Overview of ToxCast™ | Science Inventory | US EPA
In 2007, EPA launched ToxCast™ in order to develop a cost-effective approach for prioritizing the toxicity testing of large numbers of chemicals in a short period of time. Using data from state-of-the-art high throughput screening (HTS) bioassays developed in the pharmaceutical industry, ToxCast™ is building computational models to forecast the potential human toxicity of chemicals. These hazard predictions will provide EPA regulatory programs with science-based information helpful in prioritizing chemicals for more detailed toxicological evaluations, and lead to more efficient use of animal testing. In its first phase, ToxCast™ is profiling over 300 well-characterized chemicals (primarily pesticides) in over 400 HTS endpoints. These endpoints include biochemical assays of protein function, cell-based transcriptional reporter assays, multi-cell interaction assays, transcriptomics on primary cell cultures, and developmental assays in zebrafish embryos. Almost all of the compounds being examined in Phase 1 of ToxCast™ have been tested in traditional toxicology tests, including developmental toxicity, multi-generation studies, and sub-chronic and chronic rodent bioassays. ToxRefDB, a relational database being created to house this information, will contain nearly $1B worth of toxicity studies in animals when completed. ToxRefDB is integrated into a more comprehensive data management system developed by NCCT called ACToR (Aggregated Computational Toxicology Resource).
Galván, Pedro; Cane, Virgilio; Samudio, Margarita; Cabello, Agueda; Cabral, Margarita; Basogain, Xavier; Rivas, Ronald; Hilario, Enrique
2014-01-01
To report preliminary results of the application of the BONIS system in community tele-epidemiological surveillance in Paraguay. This was a feasibility and implementation study carried out in the Family Health Unit located in Bañado Sur in the city of Asunción by the Paraguay River. The system automatically records personal data and symptoms of individuals who make telephone reports, and suspected cases of dengue are classified and prioritized. This information goes to community agents for follow-up and to specialists in charge of epidemiological surveillance. From April 2010 to August 2011, 1,028 calls to the system were logged. Of 157 reported cases of fever, home visits were made to 140 (89.2%); of these, fever and headache or body ache were confirmed in 52 (37.1%) cases, and headache or body ache without fever in 58 (41.4%) cases. Community agents referred 49 (35.0%) of them for medical consultation and blood tests, and took blood samples in the homes of 19; of the samples analyzed, 56 (82.3%) were positive for dengue and 12 (17.4%) for influenza. Paraguay has a low-cost community tele-epidemiological surveillance system based on information and communication technologies and open-source software, which is scalable to other health symptoms and disorders of interest. To enable its acceptance and application, education programs should be developed to strengthen the management and promotion of community health.
Exploration technology prioritization
NASA Technical Reports Server (NTRS)
Dula, Alex
1992-01-01
A series of outlines and graphs describing NASA's Space Exploration Initiative (SEI) technology prioritization is presented. Prioritization criteria and preliminary critical technology priorities are addressed for a first lunar outpost, a Mars mission, and a permanently manned lunar mission.
Definition and applications of a versatile chemical pollution footprint methodology.
Zijp, Michiel C; Posthuma, Leo; van de Meent, Dik
2014-09-16
Because of the great variety in behavior and modes of action of chemicals, impact assessment of multiple substances is complex, as is the communication of its results. Given calls for cumulative impact assessments, we developed a methodology that is aimed at expressing the expected cumulative impacts of mixtures of chemicals on aquatic ecosystems for a region and subsequently allows these results to be presented as a chemical pollution footprint, in short: a chemical footprint. Setting and using a boundary for chemical pollution is part of the methodology. Two case studies were executed to test and illustrate the methodology. The first case illustrates that the production and use of organic substances in Europe, judged against the European water volume, stays within the currently set policy boundaries for chemical pollution. The second case shows that the use of pesticides in Northwestern Europe, judged against the regional water volume, has exceeded the set boundaries, while showing a declining trend over time. The impact of mixtures of substances in the environment could be expressed as a chemical footprint, and the relative contribution of substances to that footprint could be evaluated. These features are a novel type of information to support risk management, helping to prioritize management efforts among chemicals and environmental compartments.
False memory and importance: can we prioritize encoding without consequence?
Bui, Dung C; Friedman, Michael C; McDonough, Ian M; Castel, Alan D
2013-10-01
Given the large amount of information that we encounter, we often must prioritize what information we attempt to remember. Although critical for everyday functioning, relatively little research has focused on how people prioritize the encoding of information. Recent research has shown that people can and do selectively remember information assigned with higher, relative to lower, importance. However, the mechanisms underlying this prioritization process and the consequences of these processes are still not well understood. In the present study, we sought to better understand these prioritization processes and whether implementing these processes comes at the cost of memory accuracy, by increasing false memories. We used a modified form of the Deese/Roediger-McDermott (DRM) paradigm, in which participants studied DRM lists, with each list paired with low, medium, or high point values. In Experiment 1, encoding higher values led to more false memories than did encoding lower values, possibly because prioritizing information enhanced relational processing among high-value words. In Experiment 2, disrupting relational processing selectively reduced false memories for high-value words. Finally, in Experiment 3, facilitating relational processing selectively increased false memories for low-value words. These findings suggest that while prioritizing information can enhance true memory, this process concomitantly increases false memories. Furthermore, the mechanism underlying these prioritization processes depends on the ability to successfully engage in relational processing. Thus, how we prioritize the encoding of incoming information can come at a cost in terms of accurate memory.
An Approach for Integrating the Prioritization of Functional and Nonfunctional Requirements
Dabbagh, Mohammad; Lee, Sai Peck
2014-01-01
Due to budgetary deadlines and time-to-market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with two state-of-the-art approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time consumption while producing results that agree closely with those obtained by the other two approaches. PMID:24982987
Given the minimal developmental neurotoxicity data available for the large number of new and existing chemicals, there is a critical need for alternative methods to identify and prioritize chemicals for further testing. We outline a developmental neurotoxicity screening approach ...
The ability to focus on the most biologically relevant contaminants affecting aquatic ecosystems can be challenging because toxicity-assessment programs have not kept pace with the growing number of contaminants requiring testing. Because it has proven effective at assessing the ...
Many active pharmaceutical ingredients (APIs) have been detected in aquatic systems around the world. These systems typically receive continual municipal sewage inputs, which results in pseudo-persistent exposures of aquatic animals to APIs, thus enhancing their bioaccumulative p...
Pharmaceuticals are increasingly found in aquatic environments near wastewater treatment plant discharge, and may be of particular concern to aquatic life given their pseudo-persistence. The large number of detected pharmaceuticals necessitates a prioritization method for hazard...
20180416 - Understanding the Biology and Technology of ToxCast and Tox21 Assays (SETAC Durham NC)
The ToxCast high-throughput toxicity (HTT) testing methods have been developed to evaluate the hazard potential of diverse environmental, industrial and consumer product chemicals. The main goal is prioritizing the compounds of greatest concern for more detailed toxicological stu...
An Evaluation of 25 Selected ToxCast Chemicals in Medium-Throughput Assays to Detect Genotoxicity
ToxCast is a multi-year effort to develop a cost-effective approach for the US EPA to prioritize chemicals for toxicity testing. Initial evaluation of more than 500 high-throughput (HT) microwell-based assays without metabolic activation showed that most lacked high speci...
INTER-SPECIES COMPARISONS AND SAR MODELLING OF ESTROGENICITY USING RAINBOW TROUT ER BINDING DATA
The U.S. EPA has been mandated to screen industrial chemicals and pesticides for potential endocrine activity. Structure-activity relationships (SARs) to predict receptor binding are being developed as a first step to rank and prioritize chemicals for testing in bioassays. First ...
DEVELOPMENT OF AN OBJECTIVE AND QUANTIFIABLE TERATOLOGICAL SCREEN FOR USE IN ZEBRAFISH LARVAE.
To address EPA’s need to prioritize large numbers of chemicals for testing, a rapid, cost-effective in vivo screen for potential developmental toxicity using an alternative vertebrate species (zebrafish; Danio rerio) has been developed. A component of that screen is the observatio...
Identification and Prioritization of Chemical Mixtures from Environmental Residue Data
High throughput toxicity testing has greatly improved the speed at which single chemicals can be screened using in vitro methods. However, people are not exposed to a single chemical at a time, rather to a mixture of chemicals. Even with the increased speed of these methods, te...
The toxicity-testing paradigm has evolved to include high-throughput (HT) methods for addressing the increasing need to screen hundreds to thousands of chemicals rapidly. Approaches that involve in vitro screening assays, in silico predictions of exposure concentrations, and phar...
Identification, Curation, and Prioritization of Food-Use Chemicals in ToxCast (SOT)
Evaluating the thousands of chemicals that are directly added to or come in contact with food poses a great challenge due to the time, cost, and sheer volume of data necessary to thoroughly conduct comprehensive toxicological testing. This study compiled a list of food-use chemic...
Traditional toxicity testing involves a large investment in resources, often using low-throughput in vivo animal studies for limited numbers of chemicals. An alternative strategy is the emergence of high-throughput (HT) in vitro assays as a rapid, cost-efficient means to screen t...
Various models have been developed to predict the relative binding affinity (RBA) of chemicals to estrogen receptors (ER). These models can be used to prioritize chemicals for further tiered biological testing to assess the potential for endocrine disruption. One shortcoming of mode...
Surrogates and indicator groups have been proposed as useful tools for selecting areas for conservation when the knowledge of species distributions is limited. Tests of these concepts often produce a wide range of results which depend on the surrogates chosen as well as the spat...
In vitro, high-throughput approaches have been widely recommended as an approach to screen chemicals for the potential to cause developmental neurotoxicity and prioritize them for additional testing. The choice of cellular models for such an approach will have important ramificat...
Learn by Doing - Phase I of the ToxCast Research Program
In 2007, the USEPA embarked on a multi-year, multi-million dollar research program to develop and evaluate a new approach to prioritizing the toxicity testing of environmental chemicals. ToxCast was divided into three main phases of effort – a proof of concept, an expansion and ...
There is a need for more efficient and cost-effective methods for identifying, characterizing and prioritizing chemicals which may result in developmental neurotoxicity. One approach is to utilize in vitro test systems which recapitulate the critical processes of nervous system d...
Considerations in Use of the EPA’s ToxCast Data for Environmental Toxicology (SETAC)
The US EPA has developed the ToxCast program to prioritize chemicals for selective toxicity testing. ToxCast relies on extensive bioactivity profiling using a panel of biochemical and cellular assays that measure chemicals' effects on potential molecular initiating events and key ...
Due to their toxicity and persistence in the environment, brominated flame retardants (BFRs) are being phased out of commercial use, leading to the increased use of alternative chemicals such as the organophosphorus flame retardants (OPFRs). Due to the structural similarity of th...
The cost of testing chemicals as reproductive toxicants precludes the possibility of evaluating large chemical inventories without a robust strategic approach for setting priorities. The use of quantitative structure-activity relationships (QSARs) in early hazard identification m...
ERIC Educational Resources Information Center
Ferguson, Gail M.
2013-01-01
The current study tests a prediction of Relational Discrepancy Theory (RDT; i.e., emotional distress will not accompany discrepancies in hierarchical relationships) for family obligations discrepancies among adolescent-parent dyads in Jamaica, a moderately collectivistic and hierarchical society. Ninety-five dyads reported psychological adjustment…
Malki, K; Pain, O; Tosto, M G; Du Rietz, E; Carboni, L; Schalkwyk, L C
2015-01-01
Despite moderate heritability estimates, progress in uncovering the molecular substrate underpinning major depressive disorder (MDD) has been slow. In this study, we used prefrontal cortex (PFC) gene expression from a genetic rat model of MDD to inform probe set prioritization in PFC in a human post-mortem study to uncover genes and gene pathways associated with MDD. Gene expression differences between Flinders sensitive (FSL) and Flinders resistant (FRL) rat lines were statistically evaluated using RankProd, a non-parametric algorithm. Top-ranking probe sets in the rat study were subsequently used to prioritize orthologous selection in human PFC in a case–control post-mortem study on MDD from the Stanley Brain Consortium. Candidate genes in the human post-mortem study were then tested against a matched control sample using the RankProd method. A total of 1767 probe sets were differentially expressed in the PFC between FSL and FRL rat lines at q⩽0.001. A total of 898 orthologous probe sets were found on Affymetrix's HG-U95A chip used in the human study. Correcting for the number of multiple, non-independent tests, 20 probe sets were found to be significantly dysregulated between human cases and controls at q⩽0.05. These probe sets tagged the expression profile of 18 human genes (11 upregulated and seven downregulated). Using an integrative rat–human study, a number of convergent genes that may have a role in the pathogenesis of MDD were uncovered. Eighty percent of these genes were functionally associated with a key stress response signalling cascade, involving NF-κB (nuclear factor kappa-light-chain-enhancer of activated B cells), AP-1 (activator protein 1) and ERK/MAPK, which has been systematically associated with MDD, neuroplasticity and neurogenesis. PMID:25734512
High Throughput Biodegradation-Screening Test To Prioritize and Evaluate Chemical Biodegradability.
Martin, Timothy J; Goodhead, Andrew K; Acharya, Kishor; Head, Ian M; Snape, Jason R; Davenport, Russell J
2017-06-20
Comprehensive assessment of environmental biodegradability of pollutants is limited by the use of low throughput systems. These are epitomized by the Organisation for Economic Cooperation and Development (OECD) Ready Biodegradability Tests (RBTs), where one sample from an environment may be used to assess a chemical's ability to readily biodegrade or persist universally in that environment. This neglects the considerable spatial and temporal microbial variation inherent in any environment. Inaccurate designations of biodegradability or persistence can occur as a result. RBTs are central in assessing the biodegradation fate of chemicals and inferring exposure concentrations in environmental risk assessments. We developed a colorimetric assay for the reliable quantification of suitable aromatic compounds in a high throughput biodegradation screening test (HT-BST). The HT-BST accurately differentiated and prioritized a range of structurally diverse aromatic compounds on the basis of their assigned relative biodegradabilities and quantitative structure-activity relationship (QSAR) model outputs. Approximately 20 000 individual biodegradation tests were performed, returning analogous results to conventional RBTs. The effect of substituent group structure and position on biodegradation potential demonstrated a significant correlation (P < 0.05) with Hammett's constant for substituents on position 3 of the phenol ring. The HT-BST may facilitate the rapid screening of 100 000 chemicals reportedly manufactured in Europe and reduce the need for higher-tier fate and effects tests.
Prioritizing strategies for comprehensive liver cancer control in Asia: a conjoint analysis.
Bridges, John F P; Dong, Liming; Gallego, Gisselle; Blauvelt, Barri M; Joy, Susan M; Pawlik, Timothy M
2012-10-30
Liver cancer is a complex and burdensome disease, with Asia accounting for 75% of known cases. Comprehensive cancer control requires the use of multiple strategies, but various stakeholders may have different views as to which strategies should have the highest priority. This study identified priorities across multiple strategies for comprehensive liver cancer control (CLCC) from the perspective of liver cancer clinical, policy, and advocacy stakeholders in China, Japan, South Korea and Taiwan. Concordance of priorities was assessed across the region and across respondent roles. Priorities for CLCC were examined as part of a cross-sectional survey of liver cancer experts. Respondents completed several conjoint-analysis choice tasks to prioritize 11 strategies. In each task, respondents judged which of two competing CLCC plans, consisting of mutually exclusive and exhaustive subsets of the strategies, would have the greatest impact. The dependent variable was the chosen plan, which was then regressed on the strategies of different plans. The restricted least squares (RLS) method was utilized to compare aggregate and stratified models, and t-tests and Wald tests were used to test for significance and concordance, respectively. Eighty respondents (69.6%) were eligible and completed the survey. Their primary interests were hepatitis (26%), hepatocellular carcinoma (HCC) (58%), metastatic liver cancer (10%) and transplantation (6%). The most preferred strategies were monitoring at-risk populations (p<0.001), clinician education (p<0.001), and national guidelines (p<0.001). Most priorities were concordant across sites except for three strategies: transplantation infrastructure (p=0.009) was valued lower in China, measuring social burden (p=0.037) was valued higher in Taiwan, and national guidelines (p=0.025) was valued higher in China. Priorities did not differ across stakeholder groups (p=0.438). Priorities for CLCC in Asia include monitoring at-risk populations, clinician education, national guidelines, multidisciplinary management, public awareness and centers of excellence. As most priorities are relatively concordant across the region, multilateral approaches to addressing comprehensive liver cancer would be beneficial. However, where priorities are discordant among sites, such as transplantation infrastructure, strategies should be tailored to local needs.
Priority rating : stormwater outfall prioritization scheme
DOT National Transportation Integrated Search
1996-10-01
The prioritization system, which compares the impacts of one outfall to another and makes a determination of their overall impacts, was developed in the Prioritization Method for Retrofitting Highways with Stormwater BMPs, prepared by the Water...
Determination and prioritizing of addiction prevention factors in Delfan City, Iran.
Mirzaei, Davod; Zamani, Bibi Eshrat; Mousavi, Sayyed Hojat
2011-01-01
In recent decades, drug abuse has been one of the most important problems of human societies, imposing enormous costs on them. Addicts' exposure to infectious diseases, the harmful social and economic impacts, and the expense and limited reversibility of treatment have made drug abuse prevention programs both less costly and more effective than treatment. One of the most important steps in drug abuse prevention is therefore the identification and prioritization of prevention methods on a scientific basis. The purpose of this study was to investigate addiction prevention methods among adolescents and teenagers from the viewpoints of addicts, their parents and authorities, and to prioritize the prevention methods using the analytical hierarchy process (AHP) model in Delfan city, Iran. Statistical samples included 17 authorities, 42 addicts, and 23 parents selected through purposive sampling. Data collection instruments involved structured and semi-structured interviews. Data were analyzed using quantitative and qualitative methods, encoding and categorization. In this study, the AHP model was used for prioritizing the prevention methods; this model is one of the most efficient and comprehensive techniques designed for multi-criteria decision making, as it formulates naturally complex problems as a hierarchy. The results indicated that the most important methods of drug abuse prevention were using the media, case studies, planning for leisure time, teaching social skills, integrating drug prevention into religious customs, and showing respect to teenagers. Among these factors, the media and respect for adolescents, with weights of 0.3321 and 0.2389, had the highest preferences for the prevention of drug addiction, respectively. Planning for leisure time, with a weight of 0.1349, was less important than the media and respect factors but had higher priority than the religious customs, dating and learning-from-examples factors. In contrast, integration into religious customs, dating, and the use of case studies, with weights of 0.1145, 0.1114 and 0.0680 respectively, had the lowest preferences and can be considered in later settings. The interviewees mentioned respect for teenagers, religious customs, the media, dating skills, learning lessons from examples and attention to leisure time as the most important addiction prevention methods, among which the media was the most efficient: as a national medium it is available to the public, is not dedicated to a special group or class of people, and can be used by everyone regardless of literacy and knowledge level.
Chapter 3: choosing the important outcomes for a systematic review of a medical test.
Segal, Jodi B
2012-06-01
In this chapter of the Evidence-based Practice Centers Methods Guide for Medical Tests, we describe how the decision to use a medical test generates a broad range of outcomes and that each of these outcomes should be considered for inclusion in a systematic review. Awareness of these varied outcomes affects how a decision maker balances the benefits and risks of the test; therefore, a systematic review should present the evidence on these diverse outcomes. The key outcome categories include clinical management outcomes and direct health effects; emotional, social, cognitive, and behavioral responses to testing; legal and ethical outcomes; and costs. We describe the challenges of incorporating these outcomes in a systematic review, suggest a framework for generating potential outcomes for inclusion, and describe the role of stakeholders in choosing the outcomes for study. Finally, we give examples of systematic reviews that either included a range of outcomes or might have done so. The following are the key messages in this chapter: Consider both the outcomes that are relevant to the process of testing and those that are relevant to the results of the test. Consider inclusion of outcomes in all five domains: clinical management effects; direct test effects; emotional, social, cognitive, and behavioral effects; legal and ethical effects; and costs. Consider to which group the outcomes of testing are most relevant. Given resource limitations, prioritize which outcomes to include. This decision depends on the needs of the stakeholder(s), who should be assisted in prioritizing the outcomes for inclusion.
Feasibility of Including Green Tea Products for an Analytically Verified Dietary Supplement Database
Saldanha, Leila; Dwyer, Johanna; Andrews, Karen; Betz, Joseph; Harnely, James; Pehrsson, Pamela; Rimmer, Catherine; Savarala, Sushma
2015-01-01
The Dietary Supplement Ingredient Database (DSID) is a federally funded, publicly accessible dietary supplement database that currently contains analytically derived information on micronutrients in selected adult and children’s multivitamin and mineral (MVM) supplements. Other constituents in dietary supplement products, such as botanicals, are also of interest and thus are being considered for inclusion in the DSID. Thirty-eight constituents, mainly botanicals, were identified and prioritized by a federal interagency committee. Green tea was selected from this list as the botanical for expansion of the DSID. This paper describes the process for prioritizing dietary ingredients in the DSID. It also discusses the criteria for inclusion of these ingredients, and the approach for selecting and testing products for the green tea pilot study. PMID:25817236
Johnson, Emily J.; Won, Christina S.; Köck, Kathleen; Paine, Mary F.
2017-01-01
Natural products, including botanical dietary supplements and exotic drinks, represent an ever-increasing share of the health care market. The parallel ever-increasing popularity of self-medicating with natural products increases the likelihood of co-consumption with conventional drugs, raising concerns for unwanted natural product-drug interactions. Assessing the drug interaction liability of natural products is challenging due to the complex and variable chemical composition inherent to these products, necessitating a streamlined preclinical testing approach to prioritize individual precipitant constituents for further investigation. Such an approach was evaluated in the current work to prioritize constituents in the model natural product, grapefruit juice, as inhibitors of intestinal organic anion-transporting polypeptide (OATP)-mediated uptake. Using OATP2B1-expressing MDCKII cells and the probe substrate estrone 3-sulfate, IC50s were determined for constituents representative of the flavanone (naringin, naringenin, hesperidin), furanocoumarin (bergamottin, 6′,7′-dihydroxybergamottin), and polymethoxyflavone (nobiletin and tangeretin) classes contained in grapefruit juice. Nobiletin was the most potent (IC50, 3.7 μM); 6′,7′-dihydroxybergamottin, naringin, naringenin, and tangeretin were moderately potent (IC50, 20–50 μM); and bergamottin and hesperidin were the least potent (IC50, >300 μM) OATP2B1 inhibitors. Intestinal absorption simulations based on physiochemical properties were used to determine ratios of unbound concentration to IC50 for each constituent within enterocytes and to prioritize in order of pre-defined cut-off values. This streamlined approach could be applied to other natural products that contain multiple precipitants of natural product-drug interactions. PMID:28032362
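A minimal sketch of the prioritization arithmetic described above: each constituent is ranked by the ratio of an unbound concentration estimate to its IC50, and constituents above a cut-off are carried forward. The IC50 values are the approximate figures quoted in the abstract; the unbound enterocyte concentrations and the cut-off are placeholders, not values from the study.

```python
# IC50 values (uM) approximated from the abstract; >300 entries floored at 300.
ic50_uM = {"nobiletin": 3.7, "dihydroxybergamottin": 30.0, "naringin": 35.0,
           "naringenin": 40.0, "tangeretin": 45.0,
           "bergamottin": 300.0, "hesperidin": 300.0}
unbound_uM = {k: 10.0 for k in ic50_uM}   # placeholder exposure estimates

ratios = {k: unbound_uM[k] / ic50_uM[k] for k in ic50_uM}
cutoff = 0.1                              # hypothetical prioritization cut-off
prioritized = sorted((k for k, r in ratios.items() if r >= cutoff),
                     key=lambda k: -ratios[k])
print(prioritized)
```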
ERIC Educational Resources Information Center
Nadeau, Kacie M.
2017-01-01
The most recent phase of curriculum reform in the era of accountability is the Common Core State Standards (CCSS), which have essentially reshaped the landscape of public education. Their objective of preparing K-12 students for college and career upon high school graduation has prioritized English language arts, mathematics, and science over social…
ERIC Educational Resources Information Center
Rodriguez, Daniela Cristina
2011-01-01
In Mexico, as in many other countries, HIV/AIDS strategies are developed at the federal level and implemented at the state level. Local programs are expected to use data, in particular surveillance data, to drive their decisions on programmatic activities and prioritize populations with which the program will engage. Since the early 1980s Mexico…
ERIC Educational Resources Information Center
Jessup-Anger, Jody E.; Wawrzynski, Matthew R.; Yao, Christina W.
2011-01-01
This qualitative study employed a constructivist, case study approach to explore how faculty made meaning of their experiences in a newly developed residential college at a large, land-grant research university in the Midwest. Findings revealed that faculty focused on determining how to prioritize the numerous opportunities for involvement while…
Predictive features of breast cancer on Mexican screening mammography patients
NASA Astrophysics Data System (ADS)
Rodriguez-Rojas, Juan; Garza-Montemayor, Margarita; Trevino-Alvarado, Victor; Tamez-Pena, José Gerardo
2013-02-01
Breast cancer is the most common type of cancer worldwide. In response, breast cancer screening programs are becoming common around the world, and public programs now serve millions of women. These programs are expensive, requiring many specialized radiologists to examine all images. Nevertheless, there is a lack of trained radiologists in many countries, as in Mexico, which is a barrier to decreasing breast cancer mortality, pointing to the need for a triage system that prioritizes high-risk cases for prompt interpretation. Therefore, we explored in an image database of Mexican patients whether high-risk cases can be distinguished using image features. We collected a set of 200 digital screening mammography cases from a hospital in Mexico and assigned low- or high-risk labels according to their BIRADS scores. Breast tissue segmentation was performed using an automatic procedure. Image features were obtained considering only the segmented region on each view and comparing the bilateral differences of the obtained features. Predictive combinations of features were chosen using a genetic-algorithm-based feature selection procedure. The best model found was able to classify low-risk and high-risk cases with an area under the ROC curve of 0.88 on a 150-fold cross-validation test. The features selected were associated with the differences of signal distribution and tissue shape on bilateral views. The model found can be used to automatically identify high-risk cases and trigger the necessary measures to provide prompt treatment.
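The core step of a wrapper-style feature search such as the genetic-algorithm selection mentioned above is scoring a candidate feature subset by cross-validated performance. The sketch below shows only that inner step on synthetic data, assuming scikit-learn is available; it is not the study's pipeline, and logistic regression stands in for whatever classifier was actually used.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))            # e.g. bilateral-difference features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=200) > 0).astype(int)

def subset_auc(feature_idx):
    """Mean cross-validated ROC AUC for one candidate feature subset."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, feature_idx], y, cv=5,
                           scoring="roc_auc").mean()

print(subset_auc([0, 3]), subset_auc([5, 7]))  # informative vs. noise features
```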
Locus Coeruleus Activity Strengthens Prioritized Memories Under Arousal.
Clewett, David V; Huang, Ringo; Velasco, Rico; Lee, Tae-Ho; Mather, Mara
2018-02-07
Recent models posit that bursts of locus ceruleus (LC) activity amplify neural gain such that limited attention and encoding resources focus even more on prioritized mental representations under arousal. Here, we tested this hypothesis in human males and females using fMRI, neuromelanin MRI, and pupil dilation, a biomarker of arousal and LC activity. During scanning, participants performed a monetary incentive encoding task in which threat of punishment motivated them to prioritize encoding of scene images over superimposed objects. Threat of punishment elicited arousal and selectively enhanced memory for goal-relevant scenes. Furthermore, trial-level pupil dilations predicted better scene memory under threat, but were not related to object memory outcomes. fMRI analyses revealed that greater threat-evoked pupil dilations were positively associated with greater scene encoding activity in LC and parahippocampal cortex, a region specialized to process scene information. Across participants, this pattern of LC engagement for goal-relevant encoding was correlated with neuromelanin signal intensity, providing the first evidence that LC structure relates to its activation pattern during cognitive processing. Threat also reduced dynamic functional connectivity between high-priority (parahippocampal place area) and lower-priority (lateral occipital cortex) category-selective visual cortex in ways that predicted increased memory selectivity. Together, these findings support the idea that, under arousal, LC activity selectively strengthens prioritized memory representations by modulating local and functional network-level patterns of information processing. SIGNIFICANCE STATEMENT Adaptive behavior relies on the ability to select and store important information amid distraction. Prioritizing encoding of task-relevant inputs is especially critical in threatening or arousing situations, when forming these memories is essential for avoiding danger in the future. However, little is known about the arousal mechanisms that support such memory selectivity. Using fMRI, neuromelanin MRI, and pupil measures, we demonstrate that locus ceruleus (LC) activity amplifies neural gain such that limited encoding resources focus even more on prioritized mental representations under arousal. For the first time, we also show that LC structure relates to its involvement in threat-related encoding processes. These results shed new light on the brain mechanisms by which we process important information when it is most needed. Copyright © 2018 the authors 0270-6474/18/381558-17$15.00/0.
Prioritizing pharmaceuticals in municipal wastewater
Oral presentation at SETAC North America 32nd annual meeting, describing our prioritization of active pharmaceutical ingredients (APIs), based on estimates of risks posed by API residues originating from municipal wastewater. Goals of this project include prioritization of APIs f...
Brown, Andrew D; Marotta, Thomas R
2017-02-01
Incorrect imaging protocol selection can contribute to increased healthcare cost and waste. To help healthcare providers improve the quality and safety of medical imaging services, we developed and evaluated three natural language processing (NLP) models to determine whether NLP techniques could be employed to aid in clinical decision support for protocoling and prioritization of magnetic resonance imaging (MRI) brain examinations. To test the feasibility of using an NLP model to support clinical decision making for MRI brain examinations, we designed three different medical imaging prediction tasks, each with a unique outcome: selecting an examination protocol, evaluating the need for contrast administration, and determining priority. We created three models for each prediction task, each using a different classification algorithm (random forest, support vector machine, or k-nearest neighbor) to predict outcomes based on the narrative clinical indications and demographic data associated with 13,982 MRI brain examinations performed from January 1, 2013 to June 30, 2015. Test datasets were used to calculate the accuracy, sensitivity and specificity, predictive values, and the area under the curve. Our optimal results show an accuracy of 82.9%, 83.0%, and 88.2% for the protocol selection, contrast administration, and prioritization tasks, respectively, demonstrating that predictive algorithms can be used to aid in clinical decision support for examination protocoling. NLP models developed from the narrative clinical information provided by referring clinicians and demographic data are feasible methods to predict the protocol and priority of MRI brain examinations. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
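As a rough illustration of the kind of model described (not the authors' implementation), the sketch below pairs a TF-IDF representation of the narrative indication with a random forest classifier, assuming scikit-learn is available; the indications, labels and parameters are invented.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier

indications = ["new onset seizure, rule out mass",
               "chronic headache, follow-up of known aneurysm",
               "acute stroke symptoms, left-sided weakness",
               "memory loss, evaluate for atrophy"]
priority = ["routine", "routine", "stat", "routine"]   # invented labels

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      RandomForestClassifier(n_estimators=200, random_state=0))
model.fit(indications, priority)
print(model.predict(["sudden weakness and slurred speech"]))
```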
Characterizing tuberculosis genotype clusters along the United States-Mexico border.
Baker, B J; Moonan, P K
2014-03-01
We examined the growth of tuberculosis (TB) genotype clusters during 2005-2010 in the United States, categorized by country of origin and ethnicity of the index case and geographic proximity to the US-Mexico border at the time of TB diagnosis. Nationwide, 38.9% of cases subsequent to Mexico-born index cases were US-born. Among clusters following US-born Hispanic and US-born non-Hispanic index cases, respectively 29.2% and 5.3% of subsequent cluster members were Mexico-born. In border areas, the majority of subsequent cases were Mexico-born following US-born Hispanic (56.4%) and US-born non-Hispanic (55.6%) index cases. These findings suggest that TB transmission commonly occurs between US-born and Mexico-born persons. Along the US-Mexico border, prioritizing TB genotype clusters following US-born index cases for investigation may prevent subsequent cases among both US-born and Mexico-born persons.
47 CFR 10.410 - Prioritization.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 47 (Telecommunication), Federal Communications Commission, General: Commercial Mobile Alert System, Alert Message Requirements, § 10.410 Prioritization. A Participating CMS Provider is required to transmit Presidential Alerts...
Bird conservation would complement landslide prevention in the Central Andes of Colombia
Ocampo-Peñuela, Natalia
2015-01-01
Conservation and restoration priorities often focus on separate ecosystem problems. Inspired by the November 11th (2011) landslide event near Manizales, and the current poor results of Colombia’s Article 111 of Law 99 of 1993 as a conservation measure in this country, we set out to prioritize conservation and restoration areas where landslide prevention would complement bird conservation in the Central Andes. This area is one of the most biodiverse places on Earth, but also one of the most threatened. Using the case of the Rio Blanco Reserve, near Manizales, we identified areas for conservation where endemic and small-range bird diversity was high, and where landslide risk was also high. We further prioritized restoration areas by overlapping these conservation priorities with a forest cover map. Restoring forests in bare areas of high landslide risk and important bird diversity yields benefits for both biodiversity and people. We developed a simple landslide susceptibility model using slope, forest cover, aspect, and stream proximity. Using publicly available bird range maps, refined by elevation, we mapped concentrations of endemic and small-range bird species. We identified 1.54 km2 of potential restoration areas in the Rio Blanco Reserve, and 886 km2 in the Central Andes region. By prioritizing these areas, we facilitate the application of Article 111 which requires local and regional governments to invest in land purchases for the conservation of watersheds. PMID:25737819
Le, Duc-Hau; Pham, Van-Huy
2017-06-15
Finding gene-disease and disease-disease associations plays an important role in the biomedical area, and many prioritization methods have been proposed for this goal. Among them, approaches based on a heterogeneous network of genes and diseases are considered state-of-the-art, achieving high prediction performance and being applicable to diseases with or without a known molecular basis. Here, we developed a Cytoscape app, HGPEC, based on a random walk with restart algorithm on a heterogeneous network of genes and diseases. This app can prioritize candidate genes and diseases by employing a heterogeneous network consisting of a network of genes/proteins and a phenotypic disease similarity network. Based on the rankings, novel disease-gene and disease-disease associations can be identified. These associations can be supported with network- and rank-based visualization as well as evidence and annotations from biomedical data. A case study on prediction of novel breast cancer-associated genes and diseases shows the abilities of HGPEC. In addition, HGPEC showed superior performance compared to other tools for prioritization of candidate disease genes. Taken together, our app is expected to effectively predict novel disease-gene and disease-disease associations and to support network- and rank-based visualization as well as biomedical evidence for such associations.
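The random walk with restart at the heart of this kind of prioritizer can be written in a few lines. The sketch below is a generic, self-contained version on a toy adjacency matrix, not the HGPEC code; the restart probability and the network are illustrative.

```python
import numpy as np

def random_walk_with_restart(adj, seeds, restart=0.7, tol=1e-8):
    """Iterate p <- (1-r) W p + r p0 on a column-normalized network until
    convergence; higher steady-state scores mean closer to the seed nodes."""
    W = adj / adj.sum(axis=0, keepdims=True)
    p0 = np.zeros(adj.shape[0]); p0[seeds] = 1.0 / len(seeds)
    p = p0.copy()
    while True:
        p_next = (1 - restart) * W @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next

# Toy 4-node network; nodes 0 and 1 are the known disease genes (seeds).
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
scores = random_walk_with_restart(adj, seeds=[0, 1])
print(np.argsort(-scores))   # candidate ranking by proximity to the seeds
```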
Bird conservation would complement landslide prevention in the Central Andes of Colombia.
Ocampo-Peñuela, Natalia; Pimm, Stuart L
2015-01-01
Conservation and restoration priorities often focus on separate ecosystem problems. Inspired by the November 11th (2011) landslide event near Manizales, and the current poor results of Colombia's Article 111 of Law 99 of 1993 as a conservation measure in this country, we set out to prioritize conservation and restoration areas where landslide prevention would complement bird conservation in the Central Andes. This area is one of the most biodiverse places on Earth, but also one of the most threatened. Using the case of the Rio Blanco Reserve, near Manizales, we identified areas for conservation where endemic and small-range bird diversity was high, and where landslide risk was also high. We further prioritized restoration areas by overlapping these conservation priorities with a forest cover map. Restoring forests in bare areas of high landslide risk and important bird diversity yields benefits for both biodiversity and people. We developed a simple landslide susceptibility model using slope, forest cover, aspect, and stream proximity. Using publicly available bird range maps, refined by elevation, we mapped concentrations of endemic and small-range bird species. We identified 1.54 km(2) of potential restoration areas in the Rio Blanco Reserve, and 886 km(2) in the Central Andes region. By prioritizing these areas, we facilitate the application of Article 111 which requires local and regional governments to invest in land purchases for the conservation of watersheds.
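The study's susceptibility model combines slope, forest cover, aspect and stream proximity; one plausible way to express such a model is a weighted raster overlay, sketched below with invented weights and random stand-in rasters rather than the paper's actual data or coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)               # stand-in rasters scaled to 0-1
slope       = rng.random((100, 100))         # steeper -> higher value
bare_ground = rng.random((100, 100))         # 1 - forest cover fraction
aspect_risk = rng.random((100, 100))         # rain-facing aspects scored higher
stream_prox = rng.random((100, 100))         # closer to streams -> higher value

w = {"slope": 0.4, "bare": 0.3, "aspect": 0.15, "stream": 0.15}  # invented
susceptibility = (w["slope"] * slope + w["bare"] * bare_ground
                  + w["aspect"] * aspect_risk + w["stream"] * stream_prox)

# Restoration candidates: high susceptibility and currently unforested cells.
restore_mask = (susceptibility > 0.7) & (bare_ground > 0.5)
print(restore_mask.sum(), "cells flagged for restoration")
```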
76 FR 21936 - Aviation Rulemaking Advisory Committee-New Task
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-19
... ARAC activity and solicits membership for the new Rulemaking Prioritization Working Group. FOR FURTHER... Rulemaking Prioritization Working Group will specifically address, in part, Recommendation 22: ``The... available * * *.'' The objective of the Rulemaking Prioritization Working Group is to provide advice and...
DOT National Transportation Integrated Search
1998-07-01
A prioritization process has been prepared by the University of California, Davis, for use by the Oregon Department of Transportation (ODOT) in selecting multimodal mobility improvement projects to fund, given a budget constraint. The process involve...
Performance Optimization of Priority Assisted CSMA/CA Mechanism of 802.15.6 under Saturation Regime
Shakir, Mustafa; Rehman, Obaid Ur; Rahim, Mudassir; Alrajeh, Nabil; Khan, Zahoor Ali; Khan, Mahmood Ashraf; Niaz, Iftikhar Azim; Javaid, Nadeem
2016-01-01
Due to recent developments in the field of Wireless Sensor Networks (WSNs), Wireless Body Area Networks (WBANs) have become a major area of interest for developers and researchers. The human body exhibits postural mobility, due to which distance variation occurs and the status of connections amongst sensors changes from time to time. One of the major requirements of WBAN is to prolong the network lifetime without compromising other performance measures, i.e., delay, throughput and bandwidth efficiency. Node prioritization is one of the possible solutions to obtain optimum performance in WBAN. The IEEE 802.15.6 CSMA/CA standard splits nodes with different user priorities based on Contention Window (CW) size: a smaller CW size is assigned to higher-priority nodes. This standard helps to reduce delay; however, it is not energy efficient. In this paper, we propose a hybrid node prioritization scheme based on IEEE 802.15.6 CSMA/CA to reduce energy consumption and maximize network lifetime. In this scheme, optimum performance is achieved by node prioritization based on CW size as well as power within each user priority. Our proposed scheme reduces the average backoff time for channel access due to CW-based prioritization. Additionally, power-based prioritization within a user priority helps to minimize the required number of retransmissions. Furthermore, we also compare our scheme with the IEEE 802.15.6 CSMA/CA standard (CW-assisted node prioritization) and power-assisted node prioritization under postural mobility in WBAN. Mathematical expressions are derived to determine an accurate analytical model for throughput, delay, bandwidth efficiency, energy consumption and lifetime for each node prioritization scheme. To validate the analytical model, we performed simulations in the OMNET++/MIXIM framework. Analytical and simulation results show that our proposed hybrid node prioritization scheme outperforms the other node prioritization schemes in terms of average network delay, average throughput, average bandwidth efficiency and network lifetime. PMID:27598167
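The effect of assigning smaller contention windows to higher user priorities can be seen in a few lines of simulation. The sketch below follows the general 802.15.6 pattern (uniform backoff in [1, CW], window doubling on every second failed attempt), but the CWmin/CWmax pairs are illustrative and should be checked against the standard; it does not reproduce the paper's hybrid power-based scheme.

```python
import random

# Illustrative CWmin/CWmax per user priority (UP0 lowest .. UP7 highest).
CW_BOUNDS = {0: (16, 64), 3: (8, 16), 5: (4, 8), 7: (1, 4)}

def backoff_slots(user_priority, collisions=0):
    """Draw a backoff counter: start at CWmin, double the window on every
    second failed attempt, and never exceed CWmax."""
    cw_min, cw_max = CW_BOUNDS[user_priority]
    cw = min(cw_min * (2 ** (collisions // 2)), cw_max)
    return random.randint(1, cw)

# Higher-priority traffic waits fewer slots on average.
for up in (0, 3, 5, 7):
    mean = sum(backoff_slots(up) for _ in range(10000)) / 10000
    print(f"UP{up}: mean backoff about {mean:.1f} slots")
```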
A large-scale benchmark of gene prioritization methods.
Guala, Dimitri; Sonnhammer, Erik L L
2017-04-21
In order to maximize the use of results from high-throughput experimental studies, e.g. GWAS, for identification and diagnostics of new disease-associated genes, it is important to have properly analyzed and benchmarked gene prioritization tools. While prospective benchmarks are underpowered to provide statistically significant results in their attempt to differentiate the performance of gene prioritization tools, a strategy for retrospective benchmarking has been missing, and new tools usually only provide internal validations. The Gene Ontology (GO) contains genes clustered around annotation terms. This intrinsic property of GO can be utilized in the construction of robust benchmarks that are objective with respect to the problem domain. We demonstrate how this can be achieved for network-based gene prioritization tools, utilizing the FunCoup network. We use cross-validation and a set of appropriate performance measures to compare state-of-the-art gene prioritization algorithms: three based on network diffusion (NetRank and two implementations of Random Walk with Restart), and MaxLink, which utilizes the network neighborhood. Our benchmark suite provides a systematic and objective way to compare the multitude of available and future gene prioritization tools, enabling researchers to select the best gene prioritization tool for the task at hand, and helping to guide the development of more accurate methods.
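One way to read the GO-based benchmark idea is as a leave-one-out ranking test: hide one member of an annotation-derived gene set, ask the prioritizer to rank all genes from the remaining members, and record where the hidden gene lands. The sketch below shows that evaluation loop with a trivial placeholder prioritizer; it is not the FunCoup-based benchmark itself.

```python
import numpy as np

def benchmark(gene_set, all_genes, prioritizer):
    """Median rank of each held-out gene when prioritizing from the rest."""
    ranks = []
    for held_out in gene_set:
        seeds = [g for g in gene_set if g != held_out]
        scores = prioritizer(seeds, all_genes)
        order = sorted(all_genes, key=lambda g: -scores[g])
        ranks.append(order.index(held_out) + 1)
    return np.median(ranks)

all_genes = [f"g{i}" for i in range(100)]
gene_set = ["g1", "g2", "g3", "g4", "g5"]     # e.g. genes sharing a GO term

def dummy_prioritizer(seeds, genes):          # placeholder, not a real method
    rng = np.random.default_rng(0)
    return {g: rng.random() for g in genes}

print("median held-out rank:", benchmark(gene_set, all_genes, dummy_prioritizer))
```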
Epidemiology of Tuberculosis in Young Children in the United States
Pang, Jenny; Teeter, Larry D.; Katz, Dolly J.; Davidow, Amy L.; Miranda, Wilson; Wall, Kirsten; Ghosh, Smita; Stein-Hart, Trudy; Restrepo, Blanca I.; Reves, Randall; Graviss, Edward A.
2016-01-01
OBJECTIVES To estimate tuberculosis (TB) rates among young children in the United States by children’s and parents’ birth origins and describe the epidemiology of TB among young children who are foreign-born or have at least 1 foreign-born parent. METHODS Study subjects were children <5 years old diagnosed with TB in 20 US jurisdictions during 2005–2006. TB rates were calculated from jurisdictions’ TB case counts and American Community Survey population estimates. An observational study collected demographics, immigration and travel histories, and clinical and source case details from parental interviews and health department and TB surveillance records. RESULTS Compared with TB rates among US-born children with US-born parents, rates were 32 times higher in foreign-born children and 6 times higher in US-born children with foreign-born parents. Most TB cases (53%) were among the 29% of children who were US born with foreign-born parents. In the observational study, US-born children with foreign-born parents were more likely than foreign-born children to be infants (30% vs 7%), Hispanic (73% vs 37%), diagnosed through contact tracing (40% vs 7%), and have an identified source case (61% vs 19%); two-thirds of children were exposed in the United States. CONCLUSIONS Young children who are US born of foreign-born parents have relatively high rates of TB and account for most cases in this age group. Prompt diagnosis and treatment of adult source cases, effective contact investigations prioritizing young contacts, and targeted testing and treatment of latent TB infection are necessary to reduce TB morbidity in this population. PMID:24515517
The synoptic approach is a landscape-level assessment tool for geographic prioritization of wetland protection and restoration efforts. Prioritization becomes necessary when effort, including time and money, is limited, forcing managers to select a subset of locations. The ap...
The Toxicological Prioritization Index (ToxPi) decision support framework was previously developed to facilitate incorporation of diverse data to prioritize chemicals based on potential hazard. This ToxPi index was demonstrated by considering results of bioprofiling related to po...
North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg
2007-10-11
Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.
Inter-Individual Variability in High-Throughput Risk Prioritization of Environmental Chemicals (Sot)
We incorporate realistic human variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which have...
We incorporate inter-individual variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which hav...
Increasing awareness about endocrine disrupting chemicals (EDCs) in the environment has driven concern about their potential impact on human health and wildlife. Tens of thousands of natural and synthetic xenobiotics are presently in commerce with little to no toxicity data and t...
The need to assess large numbers of chemicals for their potential toxicities has resulted in increased emphasis on medium- and high-throughput in vitro screening approaches. For such approaches to be useful, efficient and reliable data analysis and hit detection methods are also ...
Over time, toxicity-testing paradigms have progressed from low-throughput in vivo animal studies for limited numbers of chemicals to high-throughput (HT) in vitro screening assays for thousands of chemicals. Such HT in vitro methods, along with HT in silico predictions of popula...
ERIC Educational Resources Information Center
Foran, Christine A.; Mannion, Cynthia; Rutherford, Gayle
2017-01-01
The aim of our study was to explore the perceptions of elementary teachers who routinely prioritized physical activity in their classrooms. Researchers are reporting improved student academic test results following physical activity sessions, however, classroom teachers are challenged in balancing curricular and other expectations. Hence, teachers…
Abstract: There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, a...
There are a number of risk management decisions, which range from prioritization for testing to quantitative risk assessments. The utility of in vitro studies in these decisions depends on how well the results of such data can be qualitatively and quantitatively extrapolated to i...
ERIC Educational Resources Information Center
Martinez, Ron; Schmitt, Norbert
2012-01-01
There is little dispute that formulaic sequences form an important part of the lexicon, but to date there has been no principled way to prioritize the inclusion of such items in pedagogic materials, such as ESL/EFL textbooks or tests of vocabulary knowledge. While wordlists have been used for decades, they have only provided information about…
ERIC Educational Resources Information Center
Gobert, Janice D.; Koedinger, Kenneth R.
2011-01-01
The National frameworks for science emphasize inquiry skills (NRC, 1996), however, in typical classroom practice, science learning often focuses on rote learning in part because science process skills are difficult to assess (Fadel, Honey, & Pasnick, 2007) and rote knowledge is prioritized on high-stakes tests. Short answer assessments of…
Babaoglu, Kerim; Simeonov, Anton; Irwin, John J.; Nelson, Michael E.; Feng, Brian; Thomas, Craig J.; Cancian, Laura; Costi, M. Paola; Maltby, David A.; Jadhav, Ajit; Inglese, James; Austin, Christopher P.; Shoichet, Brian K.
2009-01-01
High-throughput screening (HTS) is widely used in drug discovery. Especially for screens of unbiased libraries, false positives can dominate “hit lists”; their origins are much debated. Here we determine the mechanism of every active hit from a screen of 70,563 unbiased molecules against β-lactamase using quantitative HTS (qHTS). Of the 1274 initial inhibitors, 95% were detergent-sensitive and were classified as aggregators. Among the 70 remaining were 25 potent, covalent-acting β-lactams. Mass spectra, counter-screens, and crystallography identified 12 as promiscuous covalent inhibitors. The remaining 33 were either aggregators or irreproducible. No specific reversible inhibitors were found. We turned to molecular docking to prioritize molecules from the same library for testing at higher concentrations. Of 16 tested, 2 were modest inhibitors. Subsequent X-ray structures corresponded to the docking prediction. Analog synthesis improved affinity to 8 µM. These results suggest that it may be the physical behavior of organic molecules, not their reactivity, that accounts for most screening artifacts. Structure-based methods may prioritize weak-but-novel chemotypes in unbiased library screens. PMID:18333608
Csiszar, Susan A; Meyer, David E; Dionisio, Kathie L; Egeghy, Peter; Isaacs, Kristin K; Price, Paul S; Scanlon, Kelly A; Tan, Yu-Mei; Thomas, Kent; Vallero, Daniel; Bare, Jane C
2016-11-01
Life Cycle Assessment (LCA) is a decision-making tool that accounts for multiple impacts across the life cycle of a product or service. This paper presents a conceptual framework to integrate human health impact assessment with risk screening approaches to extend LCA to include near-field chemical sources (e.g., those originating from consumer products and building materials) that have traditionally been excluded from LCA. A new generation of rapid human exposure modeling and high-throughput toxicity testing is transforming chemical risk prioritization and provides an opportunity for integration of screening-level risk assessment (RA) with LCA. The combined LCA and RA approach considers environmental impacts of products alongside risks to human health, which is consistent with regulatory frameworks addressing RA within a sustainability mindset. A case study is presented to juxtapose LCA and risk screening approaches for a chemical used in a consumer product. The case study demonstrates how these new risk screening tools can be used to inform toxicity impact estimates in LCA and highlights needs for future research. The framework provides a basis for developing tools and methods to support decision making on the use of chemicals in products.
Teaching and evaluating critical thinking in respiratory care.
Mishoe, Shelley C; Hernlen, Kitty
2005-09-01
The capacity to perform critical thinking in respiratory care may be enhanced through awareness and education to improve skills, abilities, and opportunities. The essential skills for critical thinking in respiratory care include prioritizing, anticipating, troubleshooting, communicating, negotiating, decision making, and reflecting. In addition to these skills, critical thinkers exhibit certain characteristics such as critical evaluation, judgment, insight, motivation, and lifelong learning. The teaching of critical thinking may be accomplished through problem-based learning using an evidence-based approach to solve clinical problems similar to those encountered in professional practice. Other traditional strategies such as discussion, debate, case study, and case presentations can be used. Web-based curricula and technologic advances have created opportunities such as bulletin boards, real-time chats, and interactive media tools that can incorporate critical thinking. Many concerns and controversies surround the assessment of critical thinking, and individuals who administer critical thinking tests must be aware of the strengths and limitations of these assessment tools, as well as their relevance to the workplace. The foundational works reported in this article summarize the current status of assessment of critical thinking and can stimulate further investigation and application of the skills, characteristics, educational strategies, and measurement of critical thinking in respiratory care.
Single Operator Control of Multiple UAS: A Supervisory Delegation Approach
NASA Technical Reports Server (NTRS)
Shively, Jay
2017-01-01
This presentation will be given as part of the UAS EXCOM Science and Research Panel's (SARP) workshop on multiple UAS controlled by a single operator. Participants were asked to identify public use cases for multiple Unmanned Aircraft Systems (UAS) control and identify research, policy, and technical gaps in those operations. The purpose of this workshop is to brainstorm, categorize, and prioritize those use cases and gaps. Here, I will discuss research performed on this topic when I worked for the Army and on-going work within the division and a NATO working group on Human-Autonomy Teaming.
Wolf, Douglas C.; Bachman, Ammie; Barrett, Gordon; Bellin, Cheryl; Goodman, Jay I.; Jensen, Elke; Moretto, Angelo; McMullin, Tami; Pastoor, Timothy P.; Schoeny, Rita; Slezak, Brian; Wend, Korinna; Embry, Michelle R.
2016-01-01
The HESI-led RISK21 effort has developed a framework supporting the use of twenty-first century technology in obtaining and using information for chemical risk assessment. This framework represents a problem formulation-based, exposure-driven, tiered data acquisition approach that leads to an informed decision on human health safety to be made when sufficient evidence is available. It provides a transparent and consistent approach to evaluate information in order to maximize the ability of assessments to inform decisions and to optimize the use of resources. To demonstrate the application of the framework’s roadmap and matrix, this case study evaluates a large number of chemicals that could be present in drinking water. The focus is to prioritize which of these should be considered for human health risk as individual contaminants. The example evaluates 20 potential drinking water contaminants, using the tiered RISK21 approach in combination with graphical representation of information at each step, using the RISK21 matrix. Utilizing the framework, 11 of the 20 chemicals were assigned low priority based on available exposure data alone, which demonstrated that exposure was extremely low. The remaining nine chemicals were further evaluated, using refined estimates of toxicity based on readily available data, with three deemed high priority for further evaluation. In the present case study, it was determined that the greatest value of additional information would be from improved exposure models and not from additional hazard characterization. PMID:26451723
A traits-based approach for prioritizing species for monitoring and surrogacy selection
Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.; ...
2016-11-28
The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.
Wolf, Douglas C; Bachman, Ammie; Barrett, Gordon; Bellin, Cheryl; Goodman, Jay I; Jensen, Elke; Moretto, Angelo; McMullin, Tami; Pastoor, Timothy P; Schoeny, Rita; Slezak, Brian; Wend, Korinna; Embry, Michelle R
2016-01-01
The HESI-led RISK21 effort has developed a framework supporting the use of twenty-first century technology in obtaining and using information for chemical risk assessment. This framework represents a problem formulation-based, exposure-driven, tiered data acquisition approach that leads to an informed decision on human health safety to be made when sufficient evidence is available. It provides a transparent and consistent approach to evaluate information in order to maximize the ability of assessments to inform decisions and to optimize the use of resources. To demonstrate the application of the framework's roadmap and matrix, this case study evaluates a large number of chemicals that could be present in drinking water. The focus is to prioritize which of these should be considered for human health risk as individual contaminants. The example evaluates 20 potential drinking water contaminants, using the tiered RISK21 approach in combination with graphical representation of information at each step, using the RISK21 matrix. Utilizing the framework, 11 of the 20 chemicals were assigned low priority based on available exposure data alone, which demonstrated that exposure was extremely low. The remaining nine chemicals were further evaluated, using refined estimates of toxicity based on readily available data, with three deemed high priority for further evaluation. In the present case study, it was determined that the greatest value of additional information would be from improved exposure models and not from additional hazard characterization.
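A minimal sketch of the exposure-driven triage idea, assuming invented numbers: chemicals whose upper-bound exposure estimate sits far below a provisional toxicity value are set aside before any refined hazard work. The margin threshold, units and values below are placeholders, not the RISK21 case-study data.

```python
# All numbers are invented placeholders (mg/kg-bw/day).
chemicals = {
    "chem_A": {"exposure_upper": 1e-6, "provisional_tox": 1e-1},
    "chem_B": {"exposure_upper": 5e-3, "provisional_tox": 1e-2},
    "chem_C": {"exposure_upper": 2e-2, "provisional_tox": 1e-2},
}
MARGIN = 1000   # hypothetical screening margin of exposure

for name, c in chemicals.items():
    moe = c["provisional_tox"] / c["exposure_upper"]
    tier = "low priority" if moe >= MARGIN else "refine further"
    print(f"{name}: margin of exposure {moe:.0f} -> {tier}")
```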
A traits-based approach for prioritizing species for monitoring and surrogacy selection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.
The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.
Patient-centered prioritization of bladder cancer research.
Smith, Angela B; Chisolm, Stephanie; Deal, Allison; Spangler, Alejandra; Quale, Diane Z; Bangs, Rick; Jones, J Michael; Gore, John L
2018-05-04
Patient-centered research requires the meaningful involvement of patients and caregivers throughout the research process. The objective of this study was to create a process for sustainable engagement for research prioritization within oncology. From December 2014 to 2016, a network of engaged patients for research prioritization was created in partnership with the Bladder Cancer Advocacy Network (BCAN): the BCAN Patient Survey Network (PSN). The PSN leveraged an online bladder cancer community with additional recruitment through print advertisements and social media campaigns. Prioritized research questions were developed through a modified Delphi process and were iterated through multidisciplinary working groups and a repeat survey. In year 1 of the PSN, 354 patients and caregivers responded to the research prioritization survey; the number of responses increased to 1034 in year 2. The majority of respondents had non-muscle-invasive bladder cancer (NMIBC), and the mean time since diagnosis was 5 years. Stakeholder-identified questions for noninvasive, invasive, and metastatic disease were prioritized by the PSN. Free-text questions were sorted with thematic mapping. Several questions submitted by respondents were among the prioritized research questions. A final prioritized list of research questions was disseminated to various funding agencies, and a highly ranked NMIBC research question was included as a priority area in the 2017 Patient-Centered Outcomes Research Institute announcement of pragmatic trial funding. Patient engagement is needed to identify high-priority research questions in oncology. The BCAN PSN provides a successful example of an engagement infrastructure for annual research prioritization in bladder cancer. The creation of an engagement network sets the groundwork for additional phases of engagement, including design, conduct, and dissemination. Cancer 2018. © 2018 American Cancer Society.
From Pivot to Symmetry Integrating Africa in the Rebalance to Asia
2014-02-13
point of view, the competition is already lost, as a natural evolution of the global balance of powers. In that case, the American laissez-faire... prioritize its involvements across the globe in an environment of finite resources. In this regard, the subsequent international leadership void... capitalizing on burden sharing and distributive leadership among allies and African partners, avoiding the pitfalls of bluntly and unilaterally imposed
Optimizing the Prioritization of Natural Disaster Recovery Projects
2007-03-01
collection, and basic utility and infrastructure restoration. The restoration of utilities can include temporary bridges, temporary water and sewage lines... interrupted such as in the case of the 9/11 disaster. Perhaps next time our enemies may target our power grid or water systems. It is the duty of... Transportation: the amount and type of transportation infrastructure damage a repair project addresses; Water: the amount and type of water
Integrated Risk Index of Chemical Aquatic Pollution (IRICAP): case studies in Iberian rivers.
Fàbrega, Francesc; Marquès, Montse; Ginebreda, Antoni; Kuzmanovic, Maja; Barceló, Damià; Schuhmacher, Marta; Domingo, José L; Nadal, Martí
2013-12-15
The hazard of chemical compounds can be prioritized according to their PBT (persistence, bioaccumulation, toxicity) properties by using Self-Organizing Maps (SOM). The objective of the present study was to develop an Integrated Risk Index of Chemical Aquatic Pollution (IRICAP), useful to evaluate the risk associated to the exposure of chemical mixtures contained in river waters. Four Spanish river basins were considered as case-studies: Llobregat, Ebro, Jucar and Guadalquivir. A SOM-based hazard index (HI) was estimated for 205 organic compounds. IRICAP was calculated as the product of the HI by the concentration of each pollutant, and the results of all substances were aggregated. Finally, Pareto distribution was applied to the ranked lists of compounds in each site to prioritize those chemicals with the most significant incidence on the IRICAP. According to the HI outcomes, perfluoroalkyl substances, as well as specific illicit drugs and UV filters, were among the most hazardous compounds. Xylazine was identified as one of the chemicals with the highest contribution to the total IRICAP value in the different river basins, together with other pharmaceutical products such as loratadine and azaperol. These organic compounds should be proposed as target chemicals in the implementation of monitoring programs by regulatory organizations. Copyright © 2013 Elsevier B.V. All rights reserved.
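The index construction described in this abstract lends itself to a compact illustration: a per-site index built as the concentration-weighted sum of per-chemical hazard scores, followed by a Pareto-style cut to flag the chemicals driving most of the index. The sketch below is illustrative only; hazard scores and concentrations are invented, and the published IRICAP uses SOM-derived hazard indices.

def iricap(site_concentrations, hazard_index):
    # Both arguments are dicts keyed by chemical name; the index is the sum of
    # per-chemical contributions (hazard score times measured concentration).
    contributions = {chem: hazard_index[chem] * conc
                     for chem, conc in site_concentrations.items() if chem in hazard_index}
    return sum(contributions.values()), contributions

def pareto_priority(contributions, cumulative_share=0.8):
    # Smallest set of chemicals accounting for `cumulative_share` of the index.
    ranked = sorted(contributions.items(), key=lambda kv: kv[1], reverse=True)
    total, running, priority = sum(contributions.values()), 0.0, []
    for chem, value in ranked:
        priority.append(chem)
        running += value
        if running / total >= cumulative_share:
            break
    return priority

hazard = {"xylazine": 0.9, "loratadine": 0.7, "azaperol": 0.6, "compound_X": 0.2}
site = {"xylazine": 12.0, "loratadine": 3.5, "azaperol": 2.0, "compound_X": 40.0}
total_index, contributions = iricap(site, hazard)
print(round(total_index, 2), pareto_priority(contributions))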
We incorporate inter-individual variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which hav...
Prioritization of Disease Susceptibility Genes Using LSM/SVD.
Gong, Lejun; Yang, Ronggen; Yan, Qin; Sun, Xiao
2013-12-01
Understanding the role of genetics in disease is one of the most important tasks of the postgenome era. It is generally too expensive and time-consuming to perform experimental validation for all candidate genes related to a disease, so computational methods play an important role in prioritizing these candidates. Herein, we propose an approach to prioritize disease genes using latent semantic mapping based on singular value decomposition. Our hypothesis is that functionally similar genes are likely to cause similar diseases, so new disease susceptibility genes can be predicted by measuring the functional similarity between known susceptibility genes and candidate genes. Taking autism as an example, analysis of the top ten prioritized genes suggests they are plausible autism susceptibility genes, indicating that the approach can discover new disease susceptibility genes as well as latent disease-gene relations. The prioritized results can also provide computational evidence to support interpretation and experimental follow-up by disease researchers.
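A minimal sketch of the kind of SVD-based latent semantic similarity this abstract describes is given below: genes are rows of an annotation matrix, a truncated SVD projects them into a latent space, and candidates are ranked by their average cosine similarity to known disease genes. The matrix, gene labels and number of latent dimensions are toy assumptions, not the published data or parameters.

import numpy as np

# Rows = genes, columns = functional annotation terms (toy binary matrix).
A = np.array([
    [1, 1, 0, 0, 1],   # known disease gene 1
    [1, 0, 1, 0, 1],   # known disease gene 2
    [0, 1, 0, 1, 0],   # candidate gene_A
    [1, 1, 1, 0, 1],   # candidate gene_B
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                               # number of latent dimensions retained
G = U[:, :k] * s[:k]                # gene coordinates in the latent space

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

known, candidates = [0, 1], {"gene_A": 2, "gene_B": 3}
scores = {name: np.mean([cosine(G[i], G[j]) for j in known])
          for name, i in candidates.items()}
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))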
Martínez García, Laura; Pardo-Hernandez, Hector; Superchi, Cecilia; Niño de Guzman, Ena; Ballesteros, Monica; Ibargoyen Roteta, Nora; McFarlane, Emma; Posso, Margarita; Roqué I Figuls, Marta; Rotaeche Del Campo, Rafael; Sanabria, Andrea Juliana; Selva, Anna; Solà, Ivan; Vernooij, Robin W M; Alonso-Coello, Pablo
2017-06-01
The aim of the study was to identify and describe strategies to prioritize the updating of systematic reviews (SRs), health technology assessments (HTAs), or clinical guidelines (CGs). We conducted an SR of studies describing one or more methods to prioritize SRs, HTAs, or CGs for updating. We searched MEDLINE (PubMed, from 1966 to August 2016) and The Cochrane Methodology Register (The Cochrane Library, Issue 8 2016). We hand searched abstract books, reviewed reference lists, and contacted experts. Two reviewers independently screened the references and extracted data. We included 14 studies. Six studies were classified as descriptive (6 of 14, 42.9%) and eight as implementation studies (8 of 14, 57.1%). Six studies reported an updating strategy (6 of 14, 42.9%), six a prioritization process (6 of 14, 42.9%), and two a prioritization criterion (2 of 14, 14.2%). Eight studies focused on SRs (8 of 14, 57.1%), six studies focused on CGs (6 of 14, 42.9%), and none were about HTAs. We identified 76 prioritization criteria that can be applied when prioritizing documents for updating. The most frequently cited criteria were as follows: available evidence (19 of 76, 25.0%), clinical relevance (10 of 76; 13.2%), and users' interest (10 of 76; 13.2%). There is wide variability and suboptimal reporting of the methods used to develop and implement processes to prioritize updating of SRs, HTAs, and CGs. Copyright © 2017 Elsevier Inc. All rights reserved.
[Implementation of quality of care indicators for third-level public hospitals in Mexico].
Saturno-Hernández, Pedro Jesús; Martínez-Nicolás, Ismael; Poblano-Verástegui, Ofelia; Vértiz-Ramírez, José de Jesús; Suárez-Ortiz, Erasto Cosme; Magaña-Izquierdo, Manuel; Kawa-Karasik, Simón
2017-01-01
To select, pilot test and implement a set of indicators for tertiary public hospitals. Quali-quantitative study in four stages: identification of indicators used internationally; selection and prioritization by utility, feasibility and reliability; exploration of the quality of information sources in six hospitals; pilot testing of feasibility and reliability; and follow-up measurement. From 143 indicators, 64 were selected and eight were prioritized. The exploration revealed deficient information sources. In the pilot, three indicators were feasible but with limited reliability. Workshops were conducted to improve records and information sources, and nine hospitals reported measurements for one quarter. The eight priority indicators could not be measured immediately because of limitations in the data sources needed for their construction. Mechanisms for recording and processing data in this group of hospitals need to be improved.
FY11 Facility Assessment Study for Aeronautics Test Program
NASA Technical Reports Server (NTRS)
Loboda, John A.; Sydnor, George H.
2013-01-01
This paper presents the approach and results for the Aeronautics Test Program (ATP) FY11 Facility Assessment Project. ATP commissioned assessments in FY07 and FY11 to aid in the understanding of the current condition and reliability of its facilities and their ability to meet current and future (five-year horizon) test requirements. The principal output of the assessment was a database of facility-specific, prioritized investment projects with budgetary cost estimates. This database was also used to identify trends in the condition of facility systems.
Dare, Anna J.; Lee, Katherine C.; Bleicher, Josh; Elobu, Alex E.; Kamara, Thaim B.; Liko, Osborne; Luboga, Samuel; Danlop, Akule; Kune, Gabriel; Hagander, Lars; Leather, Andrew J. M.; Yamey, Gavin
2016-01-01
Background: Little is known about the social and political factors that influence priority setting for different health services in low- and middle-income countries (LMICs), yet these factors are integral to understanding how national health agendas are established. We investigated factors that facilitate or prevent surgical care from being prioritized in LMICs. Methods and Findings: We undertook country case studies in Papua New Guinea, Uganda, and Sierra Leone, using a qualitative process-tracing method. We conducted 74 semi-structured interviews with stakeholders involved in health agenda setting and surgical care in these countries. Interviews were triangulated with published academic literature, country reports, national health plans, and policies. Data were analyzed using a conceptual framework based on four components (actor power, ideas, political contexts, issue characteristics) to assess national factors influencing priority for surgery. Political priority for surgical care in the three countries varies. Priority was highest in Papua New Guinea, where surgical care is firmly embedded within national health plans and receives significant domestic and international resources, and much lower in Uganda and Sierra Leone. Factors influencing whether surgical care was prioritized were the degree of sustained and effective domestic advocacy by the local surgical community, the national political and economic environment in which health policy setting occurs, and the influence of international actors, particularly donors, on national agenda setting. The results from Papua New Guinea show that a strong surgical community can generate priority from the ground up, even where other factors are unfavorable. Conclusions: National health agenda setting is a complex social and political process. To embed surgical care within national health policy, sustained advocacy efforts, effective framing of the problem and solutions, and country-specific data are required. Political, technical, and financial support from regional and international partners is also important. PMID:27186645
Kreitler, Jason R.; Schloss, Carrie A.; Soong, Oliver; Hannah, Lee; Davis, Frank W.
2015-01-01
Balancing society’s competing needs of development and conservation requires careful consideration of tradeoffs. Renewable energy development and biodiversity conservation are often considered beneficial environmental goals. The direct footprint and disturbance of renewable energy, however, can displace species’ habitat and negatively impact populations and natural communities if sited without ecological consideration. Offsets have emerged as a potentially useful tool to mitigate residual impacts after trying to avoid, minimize, or restore affected sites. Yet the problem of efficiently designing a set of offset sites becomes increasingly complex where many species or many sites are involved. Spatial conservation prioritization tools are designed to handle this problem, but have seen little application to offset siting and analysis. To address this need we designed an offset siting support tool for the Desert Renewable Energy Conservation Plan (DRECP) of California, and present a case study of hypothetical impacts from solar development in the Western Mojave subsection. We compare two offset scenarios designed to mitigate a hypothetical 15,331 ha derived from proposed utility-scale solar energy development (USSED) projects. The first scenario prioritizes offsets based precisely on impacted features, while the second scenario offsets impacts to maximize biodiversity conservation gains in the region. The two methods only agree on 28% of their prioritized sites and differ in meeting species-specific offset goals. Differences between the two scenarios highlight the importance of clearly specifying choices and priorities for offset siting and mitigation in general. Similarly, the effects of background climate and land use change may lessen the durability or effectiveness of offsets if not considered. Our offset siting support tool was designed specifically for the DRECP area, but with minor code modification could work well in other offset analyses, and could provide continuing support for a potentially innovative mitigation solution to environmental impacts.
Timbre Brownfield Prioritization Tool to support effective brownfield regeneration.
Pizzol, Lisa; Zabeo, Alex; Klusáček, Petr; Giubilato, Elisa; Critto, Andrea; Frantál, Bohumil; Martinát, Standa; Kunc, Josef; Osman, Robert; Bartke, Stephan
2016-01-15
In the last decade, the regeneration of derelict or underused sites, fully or partly located in urban areas (or so-called "brownfields"), has become more common, since free developable land (or so-called "greenfields") has increasingly become a scarce and, hence, more expensive resource, especially in densely populated areas. Although the regeneration of brownfield sites can offer development potential, the complexity of these sites requires considerable efforts to successfully complete their revitalization projects, and the proper selection of promising sites is a pre-requisite to efficiently allocate the limited financial resources. The identification and analysis of success factors for brownfield site regeneration can support investors and decision makers in selecting those sites which are the most advantageous for successful regeneration. The objective of this paper is to present the Timbre Brownfield Prioritization Tool (TBPT), developed as a web-based solution to assist stakeholders responsible for wider territories or clusters of brownfield sites (portfolios) to identify which brownfield sites should be preferably considered for redevelopment or further investigation. The prioritization approach is based on a set of success factors properly identified through a systematic stakeholder engagement procedure. Within the TBPT these success factors are integrated by means of a Multi Criteria Decision Analysis (MCDA) methodology, which includes stakeholders' requalification objectives and perspectives related to the brownfield regeneration process and takes into account the three pillars of sustainability (economic, social and environmental dimensions). The tool has been applied to the South Moravia case study (Czech Republic), considering two different requalification objectives identified by local stakeholders, namely the selection of suitable locations for the development of a shopping centre and a solar power plant, respectively. The application of the TBPT to the case study showed that it is flexible and easy to adapt to different local contexts, allowing the assessors to introduce locally relevant parameters identified according to their expertise and considering the availability of local data. Copyright © 2015 Elsevier Ltd. All rights reserved.
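The core arithmetic of this kind of portfolio prioritization can be illustrated with a simple weighted multi-criteria aggregation: success-factor scores are normalized per factor, weighted by stakeholder-defined importance, and summed per site. The sketch below is only a minimal illustration with invented factor names, weights and scores (all scored so that higher is more favorable); the actual TBPT uses a richer MCDA methodology.

def normalize(values):
    # Min-max normalization of a factor column to the 0-1 range.
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.5 for v in values]

def rank_sites(sites, weights):
    # sites: {site: {factor: raw score}}; weights: {factor: importance}, summing to 1.
    factors = list(weights)
    names = list(sites)
    columns = {f: normalize([sites[s][f] for s in names]) for f in factors}
    totals = {name: sum(weights[f] * columns[f][i] for f in factors)
              for i, name in enumerate(names)}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

sites = {
    "site_1": {"accessibility": 7, "site_condition": 3, "market_demand": 8},
    "site_2": {"accessibility": 4, "site_condition": 8, "market_demand": 5},
    "site_3": {"accessibility": 9, "site_condition": 5, "market_demand": 6},
}
weights = {"accessibility": 0.4, "site_condition": 0.3, "market_demand": 0.3}
print(rank_sites(sites, weights))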
Johnson, Emily J; Won, Christina S; Köck, Kathleen; Paine, Mary F
2017-04-01
Natural products, including botanical dietary supplements and exotic drinks, represent an ever-increasing share of the health-care market. The parallel ever-increasing popularity of self-medicating with natural products increases the likelihood of co-consumption with conventional drugs, raising concerns for unwanted natural product-drug interactions. Assessing the drug interaction liability of natural products is challenging due to the complex and variable chemical composition inherent to these products, necessitating a streamlined preclinical testing approach to prioritize individual precipitant constituents for further investigation. Such an approach was evaluated in the current work to prioritize constituents in the model natural product, grapefruit juice, as inhibitors of intestinal organic anion-transporting polypeptide (OATP)-mediated uptake. Using OATP2B1-expressing MDCKII (Madin-Darby canine kidney type II) cells and the probe substrate estrone 3-sulfate, IC50 values were determined for constituents representative of the flavanone (naringin, naringenin, hesperidin), furanocoumarin (bergamottin, 6',7'-dihydroxybergamottin) and polymethoxyflavone (nobiletin and tangeretin) classes contained in grapefruit juice. Nobiletin was the most potent (IC50, 3.7 μM); 6',7'-dihydroxybergamottin, naringin, naringenin and tangeretin were moderately potent (IC50, 20-50 μM); and bergamottin and hesperidin were the least potent (IC50, >300 μM) OATP2B1 inhibitors. Intestinal absorption simulations based on physicochemical properties were used to determine the ratios of unbound concentration to IC50 for each constituent within enterocytes and to prioritize them relative to pre-defined cut-off values. This streamlined approach could be applied to other natural products that contain multiple precipitants of natural product-drug interactions. Copyright © 2016 John Wiley & Sons, Ltd.
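The prioritization arithmetic described here is essentially a ratio of estimated enterocyte concentration to measured IC50, with constituents above a cutoff flagged for follow-up. The sketch below illustrates that step only; the IC50 values are rounded approximations of those reported in the abstract (with 300 μM used as a floor for the ">300 μM" inhibitors), while the enterocyte concentrations and the cutoff are hypothetical placeholders rather than outputs of the study's absorption simulations.

ic50_uM = {
    "nobiletin": 3.7, "naringin": 35.0, "naringenin": 30.0, "tangeretin": 45.0,
    "dihydroxybergamottin": 25.0, "bergamottin": 300.0, "hesperidin": 300.0,
}
enterocyte_conc_uM = {   # hypothetical unbound enterocyte concentrations
    "nobiletin": 15.0, "naringin": 400.0, "naringenin": 120.0, "tangeretin": 10.0,
    "dihydroxybergamottin": 30.0, "bergamottin": 5.0, "hesperidin": 50.0,
}

# Ratio of estimated unbound concentration to IC50 for each constituent.
ratios = {c: enterocyte_conc_uM[c] / ic50_uM[c] for c in ic50_uM}
prioritized = [c for c, r in sorted(ratios.items(), key=lambda kv: kv[1], reverse=True)
               if r >= 0.1]   # illustrative cutoff, not the study's pre-defined value
print(prioritized)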
A ranking system for prescribed burn prioritization in Table Mountain National Park, South Africa.
Cowell, Carly Ruth; Cheney, Chad
2017-04-01
To aid prescribed burn decision making in Table Mountain National Park, South Africa, a priority ranking system was tested. Historically a wildfire suppression strategy was adopted due to wildfires threatening urban areas close to the park, with few prescribed burns conducted. A large percentage of vegetation across the park exceeded the ecological threshold of 15 years. We held a multidisciplinary workshop to prioritize areas for prescribed burning. Fire Management Blocks were mapped and assessed using the following seven categories: (1) ecological, (2) management, (3) tourism, (4) infrastructure, (5) invasive alien vegetation, (6) wildland-urban interface and (7) heritage. A priority ranking system was used to score each block. The oldest or most threatened vegetation types were not necessarily the top priority blocks. Selected blocks were burnt, and burning fewer large blocks proved more effective economically, ecologically and practically due to the limited burning days permitted. The prioritization process was efficient as it could be updated annually following prescribed burns and wildfire incidents. Integration of prescribed burn planning and wildfire suppression strategies resulted in a reduction in operational costs. We recommend protected areas make use of a priority ranking system developed with expert knowledge and stakeholder engagement to determine objective prescribed burn plans. Copyright © 2017 Elsevier Ltd. All rights reserved.
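A category-based ranking of this kind can be sketched as a weighted score per fire management block, summed over the seven categories named in the abstract. All block names, scores and weights below are illustrative placeholders, not values from Table Mountain National Park.

CATEGORIES = ["ecological", "management", "tourism", "infrastructure",
              "invasive_alien_vegetation", "wildland_urban_interface", "heritage"]

def rank_blocks(blocks, weights):
    # blocks: {block: {category: 0-5 score}}; weights: {category: weight}.
    totals = {block: sum(weights[c] * scores.get(c, 0) for c in CATEGORIES)
              for block, scores in blocks.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

blocks = {
    "block_12": {"ecological": 5, "management": 3, "tourism": 1, "infrastructure": 2,
                 "invasive_alien_vegetation": 4, "wildland_urban_interface": 1, "heritage": 0},
    "block_07": {"ecological": 3, "management": 4, "tourism": 3, "infrastructure": 4,
                 "invasive_alien_vegetation": 2, "wildland_urban_interface": 5, "heritage": 2},
}
weights = {c: 1.0 for c in CATEGORIES}   # equal weights, purely for illustration
print(rank_blocks(blocks, weights))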
Nallani, Gopinath; Venables, Barney; Constantine, Lisa; Huggett, Duane
2016-05-01
Evaluation of the environmental risk of human pharmaceuticals is now a mandatory component in all new drug applications submitted for approval in the EU. With >3000 drugs currently in use, it is not feasible to test each active ingredient, so prioritization is key. A recent review has listed nine prioritization approaches, including the fish plasma model (FPM). The present paper focuses on comparison of measured and predicted fish plasma bioconcentration factors (BCFs) of four common over-the-counter/prescribed pharmaceuticals: norethindrone (NET), ibuprofen (IBU), verapamil (VER) and clozapine (CLZ). The measured data were obtained from earlier published fish BCF studies. The measured BCF estimates of NET, IBU, VER and CLZ were 13.4, 1.4, 0.7 and 31.2, while the corresponding predicted BCFs (based on log Kow at pH 7) were 19, 1.0, 7.6 and 30, respectively. These results indicate that the predicted BCFs matched the measured values well. The BCF estimates were used to calculate the human:fish plasma concentration ratios of each drug to predict potential risk to fish. The plasma ratio results show the following order of risk potential for fish: NET > CLZ > VER > IBU. The FPM has value in prioritizing pharmaceutical products for ecotoxicological assessments.
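The fish plasma model arithmetic can be sketched in a few lines: predict a plasma bioconcentration factor from log Kow using a commonly cited empirical relation (log Pblood:water = 0.73 × log Kow − 0.88), estimate the fish plasma concentration from the water concentration, and compare it with the human therapeutic plasma concentration. The relation and all input values below are assumptions for illustration, not data or equations taken from this study.

def predicted_plasma_bcf(log_kow_ph7):
    # Empirical blood:water partitioning estimate from log Kow (assumed relation).
    return 10 ** (0.73 * log_kow_ph7 - 0.88)

def plasma_ratio(human_therapeutic_plasma_ng_ml, water_conc_ng_l, log_kow_ph7):
    # Fish plasma concentration in ng/mL from a water concentration in ng/L.
    fish_plasma_ng_ml = water_conc_ng_l * predicted_plasma_bcf(log_kow_ph7) / 1000.0
    return human_therapeutic_plasma_ng_ml / fish_plasma_ng_ml

# Hypothetical screening inputs for a single drug; a lower ratio suggests higher concern.
print(round(predicted_plasma_bcf(2.97), 1))
print(round(plasma_ratio(5.0, 10.0, 2.97), 1))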
Effects-based chemical category approach for prioritization of low affinity estrogenic chemicals.
Hornung, M W; Tapper, M A; Denny, J S; Kolanczyk, R C; Sheedy, B R; Hartig, P C; Aladjov, H; Henry, T R; Schmieder, P K
2014-01-01
Regulatory agencies are charged with addressing the endocrine disrupting potential of large numbers of chemicals for which there is often little or no data on which to make decisions. Prioritizing the chemicals of greatest concern for further screening for potential hazard to humans and wildlife is an initial step in the process. This paper presents the collection of in vitro data using assays optimized to detect low affinity estrogen receptor (ER) binding chemicals and the use of that data to build effects-based chemical categories following QSAR approaches and principles pioneered by Gilman Veith and colleagues for application to environmental regulatory challenges. Effects-based chemical categories were built using these QSAR principles focused on the types of chemicals in the specific regulatory domain of concern, i.e. non-steroidal industrial chemicals, and based upon a mechanistic hypothesis of how these non-steroidal chemicals of seemingly dissimilar structure to 17β-estradiol (E2) could interact with the ER via two distinct binding types. Chemicals were also tested up to their solubility limits, thereby minimizing false negatives and providing confidence in designating chemicals as inactive. The high-quality data collected in this manner were used to build an ER expert system for chemical prioritization described in a companion article in this journal.
NASA Astrophysics Data System (ADS)
Jiao, Lihong; Amunugama, Kaushalya; Hayes, Matthew B.; Jennings, Michael; Domingo, Azriel; Hou, Chen
2015-08-01
Growing animals must alter their energy budget in the face of environmental changes and prioritize between energy allocated to metabolism for life-sustaining requirements and energy deposited in new biomass growth. We hypothesize that when food availability is low, larvae of holometabolic insects with a short development stage (relative to the low food availability period) prioritize biomass growth at the expense of metabolism. Driven by this hypothesis, we develop a simple theoretical model, based on conservation of energy and allometric scaling laws, for understanding the dynamic energy budget of growing larvae under food restriction. We test the hypothesis by manipulative experiments on fifth-instar hornworms at three temperatures. At each temperature, food restriction increases the scaling power of growth rate but decreases that of metabolic rate, as predicted by the hypothesis. During the fifth instar, the energy budgets of larvae change dynamically. The free-feeding larvae slightly decrease the energy allocated to growth as body mass increases and increase the energy allocated to life-sustaining metabolism. The opposite trends were observed in food-restricted larvae, indicating the predicted prioritization in the energy budget under food restriction. We compare the energy budgets of a few endothermic and ectothermic species and discuss how different life histories lead to the differences in the energy budgets under food restriction.
regSNPs: a strategy for prioritizing regulatory single nucleotide substitutions
Teng, Mingxiang; Ichikawa, Shoji; Padgett, Leah R.; Wang, Yadong; Mort, Matthew; Cooper, David N.; Koller, Daniel L.; Foroud, Tatiana; Edenberg, Howard J.; Econs, Michael J.; Liu, Yunlong
2012-01-01
Motivation: One of the fundamental questions in genetic studies is to identify functional DNA variants that are responsible for a disease or phenotype of interest. Results from large-scale genetics studies, such as genome-wide association studies (GWAS), and the availability of high-throughput sequencing technologies provide opportunities for identifying causal variants. Despite the technical advances, informatics methodologies need to be developed to prioritize thousands of variants for potential causative effects. Results: We present regSNPs, an informatics strategy that integrates several established bioinformatics tools, for prioritizing regulatory SNPs, i.e. the SNPs in promoter regions that potentially affect phenotype through changing transcription of downstream genes. Compared with existing tools, regSNPs has two distinct features. It considers degenerative features of binding motifs by calculating the differences in binding affinity caused by the candidate variants, and integrates potential phenotypic effects of various transcription factors. When tested using the disease-causing variants documented in the Human Gene Mutation Database, regSNPs showed mixed performance on various diseases. regSNPs predicted three SNPs that can potentially affect bone density in a region detected in an earlier linkage study. Potential effects of one of the variants were validated using a luciferase reporter assay. Contact: yunliu@iupui.edu Supplementary information: Supplementary data are available at Bioinformatics online PMID:22611130
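The "change in binding affinity" step described above can be illustrated with a toy calculation: score the reference and alternate promoter sequences against a transcription-factor position weight matrix (as a log-odds score against background) and take the difference. The motif and sequences below are invented; regSNPs itself combines such motif scores with additional phenotype-related annotations.

import math

BACKGROUND = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}
PWM = [  # per-position base probabilities for a hypothetical 4-bp motif
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
    {"A": 0.1, "C": 0.7, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.1, "T": 0.7},
]

def pwm_score(seq):
    # Log-odds score of the sequence under the motif versus the background model.
    return sum(math.log2(PWM[i][base] / BACKGROUND[base]) for i, base in enumerate(seq))

ref, alt = "AGCT", "AGAT"          # the SNP changes the third base C -> A
delta = pwm_score(alt) - pwm_score(ref)
print(round(pwm_score(ref), 2), round(pwm_score(alt), 2), round(delta, 2))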
Grieger, Khara D; Hansen, Steffen F; Sørensen, Peter B; Baun, Anders
2011-09-01
Conducting environmental risk assessment of engineered nanomaterials has been an extremely challenging endeavor thus far. Moreover, recent findings from the nano-risk scientific community indicate that it is unlikely that many of these challenges will be easily resolved in the near future, especially given the vast variety and complexity of nanomaterials and their applications. As an approach to help optimize environmental risk assessments of nanomaterials, we apply the Worst-Case Definition (WCD) model to identify best estimates for worst-case conditions of environmental risks of two case studies which use engineered nanoparticles, namely nZVI in soil and groundwater remediation and C(60) in an engine oil lubricant. Results generated from this analysis may ultimately help prioritize research areas for environmental risk assessments of nZVI and C(60) in these applications as well as demonstrate the use of worst-case conditions to optimize future research efforts for other nanomaterials. Through the application of the WCD model, we find that the most probable worst-case conditions for both case studies include i) active uptake mechanisms, ii) accumulation in organisms, iii) ecotoxicological response mechanisms such as reactive oxygen species (ROS) production and cell membrane damage or disruption, iv) surface properties of nZVI and C(60), and v) acute exposure tolerance of organisms. Additional estimates of worst-case conditions for C(60) also include the physical location of C(60) in the environment from surface run-off, cellular exposure routes for heterotrophic organisms, and the presence of light to amplify adverse effects. Based on results of this analysis, we recommend the prioritization of research for the selected applications within the following areas: organism active uptake ability of nZVI and C(60) and ecotoxicological response end-points and response mechanisms including ROS production and cell membrane damage, full nanomaterial characterization taking into account detailed information on nanomaterial surface properties, and investigations of dose-response relationships for a variety of organisms. Copyright © 2011 Elsevier B.V. All rights reserved.
Prioritizing individual genetic variants after kernel machine testing using variable selection.
He, Qianchuan; Cai, Tianxi; Liu, Yang; Zhao, Ni; Harmon, Quaker E; Almli, Lynn M; Binder, Elisabeth B; Engel, Stephanie M; Ressler, Kerry J; Conneely, Karen N; Lin, Xihong; Wu, Michael C
2016-12-01
Kernel machine learning methods, such as the SNP-set kernel association test (SKAT), have been widely used to test associations between traits and genetic polymorphisms. In contrast to traditional single-SNP analysis methods, these methods are designed to examine the joint effect of a set of related SNPs (such as a group of SNPs within a gene or a pathway) and are able to identify sets of SNPs that are associated with the trait of interest. However, as with many multi-SNP testing approaches, kernel machine testing can draw conclusions only at the SNP-set level and does not directly indicate which SNP(s) in the identified set are actually driving the associations. A recently proposed procedure, KerNel Iterative Feature Extraction (KNIFE), provides a general framework for incorporating variable selection into kernel machine methods. In this article, we focus on quantitative traits and relatively common SNPs, adapt the KNIFE procedure to genetic association studies, and propose an approach to identify driver SNPs after the application of SKAT to gene set analysis. Our approach accommodates several kernels that are widely used in SNP analysis, such as the linear kernel and the Identity by State (IBS) kernel. The proposed approach provides practically useful utilities to prioritize SNPs, and fills the gap between SNP set analysis and biological functional studies. Both simulation studies and a real data application are used to demonstrate the proposed approach. © 2016 WILEY PERIODICALS, INC.
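For readers unfamiliar with the kernels mentioned above, the sketch below builds the linear and identity-by-state (IBS) kernels from a toy genotype matrix coded 0/1/2 (rows = subjects, columns = SNPs). The data are invented, and the testing and variable-selection steps (SKAT, KNIFE) are not reproduced here.

import numpy as np

G = np.array([
    [0, 1, 2, 0],
    [1, 1, 0, 0],
    [2, 0, 1, 1],
], dtype=float)

def linear_kernel(G):
    # Similarity as the inner product of genotype vectors.
    return G @ G.T

def ibs_kernel(G):
    # Average number of alleles shared identical-by-state, scaled to [0, 1].
    n, m = G.shape
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = np.sum(2.0 - np.abs(G[i] - G[j])) / (2.0 * m)
    return K

print(linear_kernel(G))
print(ibs_kernel(G))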
77 FR 69637 - Development of Prioritized Therapeutic Area Data Standards; Request for Comments
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-20
... develop the data standards in collaboration with CDISC and other open standards organizations. FDA is...] Development of Prioritized Therapeutic Area Data Standards; Request for Comments AGENCY: Food and Drug... announcing the intent to prioritize and develop therapeutic area data standards to facilitate the conduct of...
Decision making in prioritization of required operational capabilities
NASA Astrophysics Data System (ADS)
Andreeva, P.; Karev, M.; Kovacheva, Ts.
2015-10-01
The paper describes an expert heuristic approach to the prioritization of required operational capabilities in the field of defense. Based on expert assessment and application of the Analytic Hierarchy Process (AHP), a methodology for their prioritization has been developed and applied to practical simulation decision-making games.
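The core AHP step can be illustrated briefly: priority weights for a set of capabilities are derived from a reciprocal pairwise-comparison matrix via its principal eigenvector, and a consistency index checks the judgments. The capability names and comparison values below are invented for illustration and are not taken from the paper.

import numpy as np

capabilities = ["capability_A", "capability_B", "capability_C"]
# A[i, j] = how much more important capability i is judged to be than capability j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, principal].real)
weights = w / w.sum()                      # normalized priority weights

lambda_max = eigvals.real[principal]
consistency_index = (lambda_max - len(A)) / (len(A) - 1)   # small values indicate consistent judgments

for name, weight in sorted(zip(capabilities, weights), key=lambda kv: -kv[1]):
    print(f"{name}: {weight:.3f}")
print(round(consistency_index, 3))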
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-19
... America Invents Act includes provisions for prioritized examination of patent applications. The United States Patent and Trademark Office (Office) implemented the Leahy-Smith America Invents Act prioritized... claims were found allowable, or add new claims, subject only to the limitations applicable to any...
Parsing the life-shortening effects of dietary protein: effects of individual amino acids.
Arganda, Sara; Bouchebti, Sofia; Bazazi, Sepideh; Le Hesran, Sophie; Puga, Camille; Latil, Gérard; Simpson, Stephen J; Dussutour, Audrey
2017-01-11
High-protein diets shorten lifespan in many organisms. Is it because protein digestion is energetically costly or because the final products (the amino acids) are harmful? To answer this question while circumventing the life-history trade-off between reproduction and longevity, we fed sterile ant workers on diets based on whole proteins or free amino acids. We found that (i) free amino acids shortened lifespan even more than proteins; (ii) the higher the amino acid-to-carbohydrate ratio, the shorter ants lived and the lower their lipid reserves; (iii) for the same amino acid-to-carbohydrate ratio, ants eating free amino acids had more lipid reserves than those eating whole proteins; and (iv) on whole protein diets, ants seem to regulate food intake by prioritizing sugar, while on free amino acid diets, they seem to prioritize amino acids. To test the effect of the amino acid profile, we tested diets containing proportions of each amino acid that matched the ant's exome; surprisingly, longevity was unaffected by this change. We further tested diets with all amino acids under-represented except one, finding that methionine, serine, threonine and phenylalanine are especially harmful. Altogether, our results show that certain amino acids are key elements behind the high-protein diet reduction in lifespan. © 2017 The Author(s).
NASA Technical Reports Server (NTRS)
Van Baalen, Mary; Mason, Sara; Foy, Millennia; Wear, Mary; Taiym, Wafa; Moynihan, Shannan; Alexander, David; Hart, Steve; Tarver, William
2015-01-01
Due to recently identified vision changes associated with space flight, JSC Space and Clinical Operations (SCO) implemented broad mission-related vision testing starting in 2009. Optical Coherence Tomography (OCT), 3 Tesla Brain and Orbit MRIs, and Optical Biometry were implemented terrestrially for clinical monitoring. While no in-flight vision testing was initially in place, already available on-orbit technology was leveraged to facilitate in-flight clinical monitoring, including visual acuity, Amsler grid, tonometry, and ultrasonography. In 2013, on-orbit testing capabilities were expanded to include contrast sensitivity testing and OCT. As these additional testing capabilities have been added, prioritization of resources, particularly crew time, is under evaluation.
Hajrahimi, Nafiseh; Dehaghani, Sayed Mehdi Hejazi; Hajrahimi, Nargess; Sarmadi, Sima
2014-01-01
Context: Implementing information technology in the best possible way can bring many advantages, such as providing electronic services and facilitating tasks. Assessment of service-providing systems is therefore a way to improve the quality of, and elevate, such systems, including e-commerce, e-government, e-banking, and e-learning. Aims: This study aimed to evaluate the electronic services on the website of Isfahan University of Medical Sciences in order to propose solutions to improve them. Furthermore, we aimed to rank the solutions based on the factors that enhance the quality of electronic services, using the analytic hierarchy process (AHP) method. Materials and Methods: A non-parametric test was used to assess the quality of the electronic services. The assessment of propositions was based on the Aqual model, and the propositions were prioritized using the AHP approach. The AHP approach was used because it directly incorporates experts’ judgments in the model and leads to more objective results in analyzing and prioritizing the risks. After evaluating the quality of the electronic services, a multi-criteria decision-making framework was used to prioritize the proposed solutions. Statistical Analysis Used: Non-parametric tests and the AHP approach using Expert Choice software. Results: The results showed that students were satisfied with most of the indicators. Only a few indicators received low satisfaction from students, including design attractiveness, the amount of explanation and detail of information, honesty and responsiveness of authorities, and the role of e-services in the user's relationship with the university. After interviews with Information and Communications Technology (ICT) experts at the university, measurement criteria and solutions to improve quality were collected. The best solutions were selected using the Expert Choice software. According to the results, the solution “controlling and improving the process of handling users’ complaints” is of the utmost importance, and authorities should implement it on the website and place great importance on updating this process. Conclusions: Although 4 of the 22 indicators used to test the hypotheses were not confirmed, the results show that these assumptions are accepted at the 95% confidence level. To improve the quality of electronic services, special attention should be paid to “services interaction.” The results showed that “controlling and improving the process of handling users’ complaints” on the website is the first and most important solution, and the process of “changing brand/factory name/address in the text of the factory license/renewal or modification of manufacturing license/changing the formula” is the least important one. PMID:25540790
Deokar, Angela J.; Dellapenna, Alan; DeFiore-Hyrmer, Jolene; Laidler, Matt; Millet, Lisa; Morman, Sara; Myers, Lindsey
2018-01-01
The Centers for Disease Control and Prevention’s (CDC’s) Core Violence and Injury Prevention Program (Core) supports capacity of state violence and injury prevention programs to implement evidence-based interventions. Several Core-funded states prioritized prescription drug overdose (PDO) and leveraged their systems to identify and respond to the epidemic before specific PDO prevention funding was available through CDC. This article describes activities employed by Core-funded states early in the epidemic. Four case examples illustrate states’ approaches within the context of their systems and partners. While Core funding is not sufficient to support a comprehensive PDO prevention program, having Core in place at the beginning of the emerging epidemic had critical implications for identifying the problem and developing systems that were later expanded as additional resources became available. Important components included staffing support to bolster programmatic and epidemiological capacity; diverse and collaborative partnerships; and use of surveillance and evidence-informed best practices to prioritize decision-making. PMID:29189501
Hughes, David; Williams-Jones, Bryn
2013-08-01
In the context of scarce public resources, patient interest groups have increasingly turned to private organizations for financing, including the pharmaceutical industry. This practice puts advocacy groups in a situation of potential conflicts between the interests of patients and those of the drug companies. The interests of patients and industry can converge on issues related to the approval and reimbursement of medications. But even on this issue, interests do not always align perfectly. Using the Quebec example of Coalition Priorité Cancer (CPC) as a case study, we examine the ethical issues raised by such financial relationships in the context of drug reimbursement decision-making. We collected, compiled and analyzed publicly available information on the CPC's organization and activities; this approach allowed us to raise and discuss important questions regarding the possible influence exerted on patient groups by donors. We conclude with some recommendations. Copyright © 2013 Longwoods Publishing.
Image-based tracking and sensor resource management for UAVs in an urban environment
NASA Astrophysics Data System (ADS)
Samant, Ashwin; Chang, K. C.
2010-04-01
Coordination and deployment of multiple unmanned air vehicles (UAVs) requires substantial human resources to carry out a successful mission. The complexity of such a surveillance mission is significantly increased in an urban environment, where targets can easily escape from the UAV's field of view (FOV) due to intervening buildings and line-of-sight obstructions. In the proposed methodology, we focus on the control and coordination of multiple UAVs with onboard gimbaled video sensors for tracking multiple targets in an urban environment. We developed optimal path planning algorithms with emphasis on dynamic target prioritization and persistent target updates. The command center is responsible for target prioritization and autonomous control of multiple UAVs, enabling a single operator to monitor and control a team of UAVs from a remote location. The results are obtained from extensive 3D simulations in Google Earth using tangent-plus-Lyapunov vector field guidance for target tracking.
NASA Astrophysics Data System (ADS)
Chakroun, Mahmoud; Gogu, Grigore; Pacaud, Thomas; Thirion, François
2014-09-01
This study proposes an eco-innovative design process taking into consideration quality and environmental aspects in prioritizing and solving technical engineering problems. This approach provides a synergy between Life Cycle Assessment (LCA), the non-quality matrix, the Theory of Inventive Problem Solving (TRIZ), morphological analysis and the Analytic Hierarchy Process (AHP). In the sequence of these tools, LCA assesses the environmental impacts generated by the system. Then, for a better consideration of environmental aspects, a new tool is developed, the non-quality matrix, which defines the problem to be solved first from an environmental point of view. The TRIZ method allows the generation of new concepts and contradiction resolution. Then, the morphological analysis offers the possibility of extending the search space of solutions in a design problem in a systematic way. Finally, the AHP identifies the promising solution(s) by providing a clear logic for the choice made. The usefulness of these tools is demonstrated through their application to a case study involving a centrifugal spreader with spinning discs.
Risk-based decision-making framework for the selection of sediment dredging option.
Manap, Norpadzlihatun; Voulvoulis, Nikolaos
2014-10-15
The aim of this study was to develop a risk-based decision-making framework for the selection of a sediment dredging option. The newly integrated, holistic and staged framework is described using case studies. The first stage feeds historical dredging monitoring data and contamination levels in environmental media into Ecological Risk Assessment phases that have been adapted for benefits in cost, time and simplicity. The next stage describes how Multi-Criteria Decision Analysis (MCDA) can be used to analyze and prioritize dredging areas based on environmental, socio-economic and managerial criteria. The results from MCDA are then integrated into the Ecological Risk Assessment to characterize the degree of contamination in the prioritized areas. The last stage builds on these findings and applies MCDA to identify the best sediment dredging option, accounting for the economic, environmental and technical aspects of dredging, which is beneficial for the dredging and sediment management industries. Copyright © 2014 Elsevier B.V. All rights reserved.
Masino, Aaron J; Dechene, Elizabeth T; Dulik, Matthew C; Wilkens, Alisha; Spinner, Nancy B; Krantz, Ian D; Pennington, Jeffrey W; Robinson, Peter N; White, Peter S
2014-07-21
Exome sequencing is a promising method for diagnosing patients with a complex phenotype. However, variant interpretation relative to patient phenotype can be challenging in some scenarios, particularly clinical assessment of rare complex phenotypes. Each patient's sequence reveals many possibly damaging variants that must be individually assessed to establish clear association with patient phenotype. To assist interpretation, we implemented an algorithm that ranks a given set of genes relative to patient phenotype. The algorithm orders genes by the semantic similarity computed between phenotypic descriptors associated with each gene and those describing the patient. Phenotypic descriptor terms are taken from the Human Phenotype Ontology (HPO) and semantic similarity is derived from each term's information content. Model validation was performed via simulation and with clinical data. We simulated 33 Mendelian diseases with 100 patients per disease. We modeled clinical conditions by adding noise and imprecision, i.e. phenotypic terms unrelated to the disease and terms less specific than the actual disease terms. We ranked the causative gene against all 2488 HPO annotated genes. The median causative gene rank was 1 for the optimal and noise cases, 12 for the imprecision case, and 60 for the imprecision with noise case. Additionally, we examined a clinical cohort of subjects with hearing impairment. The disease gene median rank was 22. However, when also considering the patient's exome data and filtering non-exomic and common variants, the median rank improved to 3. Semantic similarity can rank a causative gene highly within a gene list relative to patient phenotype characteristics, provided that imprecision is mitigated. The clinical case results suggest that phenotype rank combined with variant analysis provides significant improvement over the individual approaches. We expect that this combined prioritization approach may increase accuracy and decrease effort for clinical genetic diagnosis.
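A highly simplified sketch of the phenotype-driven ranking idea described above is given below: phenotype terms get an information content from their annotation frequency, and each gene is scored by best-match similarity between its annotated terms and the patient's terms. The gene annotations, term frequencies and patient profile are toy assumptions; the published approach uses HPO term information content with the ontology's most informative common ancestors, which this toy version replaces with exact term matches.

import math

gene_terms = {                       # invented gene-to-phenotype annotations
    "GENE1": {"HP:HEARING_LOSS", "HP:ATAXIA"},
    "GENE2": {"HP:HEARING_LOSS", "HP:SEIZURE", "HP:SHORT_STATURE"},
    "GENE3": {"HP:SHORT_STATURE"},
}
term_freq = {"HP:HEARING_LOSS": 0.02, "HP:ATAXIA": 0.05,
             "HP:SEIZURE": 0.10, "HP:SHORT_STATURE": 0.20}

def info_content(term):
    # Rarer terms carry more information.
    return -math.log(term_freq[term])

def gene_score(patient_terms, annotations):
    # Best-match average: for each patient term, take the best matching annotation.
    best = [max((info_content(t) for t in annotations if t == p), default=0.0)
            for p in patient_terms]
    return sum(best) / len(best)

patient = {"HP:HEARING_LOSS", "HP:ATAXIA"}
ranking = sorted(gene_terms, key=lambda g: gene_score(patient, gene_terms[g]), reverse=True)
print(ranking)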
Reprioritizing government spending on health: pushing an elephant up the stairs?
Tandon, Ajay; Fleisher, Lisa; Li, Rong; Yap, Wei Aun
2014-01-01
Countries vary widely with respect to the share of government spending on health, a metric that can serve as a proxy for the extent to which health is prioritized by governments. World Health Organization (WHO) data estimate that, in 2011, health's share of aggregate government expenditure averaged 12% in the 170 countries for which data were available. However, country differences were striking: ranging from a low of 1% in Myanmar to a high of 28% in Costa Rica. Some of the observed differences in health's share of government spending across countries are unsurprisingly related to differences in national income. However, significant variations exist in health's share of government spending even after controlling for national income. This paper provides a global overview of health's share of government spending and summarizes some of the key theoretical and empirical perspectives on allocation of public resources to health vis-à-vis other sectors from the perspective of reprioritization, one of the modalities for realizing fiscal space for health. The paper argues that theory and cross-country empirical analyses do not provide clear-cut explanations for the observed variations in government prioritization of health. Standard economic theory arguments that are often used to justify public financing for health are equally applicable to many other sectors including defence, education and infrastructure. To date, empirical work on prioritization has been sparse: available cross-country econometric analyses suggest that factors such as democratization, lower levels of corruption, ethnolinguistic homogeneity and more women in public office are correlated with higher shares of public spending on health; however, these findings are not robust and are sensitive to model specification. Evidence from case studies suggests that country-specific political economy considerations are key, and that results-focused reform efforts - in particular efforts to explicitly expand the breadth and depth of health coverage as opposed to efforts focused only on government budgetary benchmarking targets - are more likely to result in sustained and politically feasible prioritization of health from a fiscal space perspective.
Connection and Community: Diné College Emphasizes Real-World Experience in Public Health
ERIC Educational Resources Information Center
Bauer, Mark
2016-01-01
The Summer Research Enhancement Program (SREP) at Diné College provides students with a solid foundation of public health research methods and includes a hands-on internship in their home community to test their newly acquired skills while enhancing the communities' health. Focusing on health issues prioritized by Navajo health leaders, from…
Multiple drivers shape the types of human-health assessments performed on chemicals by the U.S. EPA, resulting in chemical assessments that are "fit-for-purpose," ranging from prioritization for further testing to full risk assessments. Layered on top of the diverse assessment needs are the...
ERIC Educational Resources Information Center
Gerlinger, Julie; Wo, James C.
2016-01-01
A common response to school violence features the use of security measures to deter serious and violent incidents. However, a second approach, based on school climate theory, suggests that schools exhibiting authoritative school discipline (i.e., high structure and support) might more effectively reduce school disorder. We tested these approaches…
Graham, Jennifer E; Heatley, J Jill
2007-05-01
Raptors may present with a variety of conditions, such as trauma, debilitation, and disease, that necessitate emergency care. Emergency treatment should prioritize stabilization of the patient. Diagnostic testing should be delayed until feasible based on patient status. This article reviews emergency medicine in raptors, including appropriate handling and restraint, hospitalization, triage and patient assessment, sample collection, supportive care, and common emergency presentations.
Political Economy and the NCLB Regime: Accountability, Standards, and High-Stakes Testing
ERIC Educational Resources Information Center
Parkison, Paul
2009-01-01
Focus and institutional policy under the No Child Left Behind Act [NCLB] (U.S. Department of Education 2001) have prioritized the individualistic, market-driven agenda. The NCLB regime has gained hegemony over the political space of public education, and the value and effectiveness of the educational process have become subject to the fetishism of…
Prioritizing Threats to Patient Safety in Rural Primary Care
ERIC Educational Resources Information Center
Singh, Ranjit; Singh, Ashok; Servoss, Timothy J.; Singh, Gurdev
2007-01-01
Context: Rural primary care is a complex environment in which multiple patient safety challenges can arise. To make progress in improving safety with limited resources, each practice needs to identify those safety problems that pose the greatest threat to patients and focus efforts on these. Purpose: To describe and field-test a novel approach to…
ERIC Educational Resources Information Center
Hamlin, Daniel
2017-01-01
Families in deindustrialized cities with high crime rates report prioritizing school safety when opting for charter schools. Yet, very little research has investigated whether charter schools are safer than traditional public schools. This study compares charter and traditional public schools in Detroit, Michigan, on perceived school safety by…
US EPA’s ToxCast program has screened thousands of chemicals in hundreds of mammalian-based HTS assays for biological activity suggestive of potential toxic effects. These data are being used to prioritize toxicity testing to focus on chemicals likely to lead to adverse health ef...
Brief Report: Reduced Prioritization of Facial Threat in Adults with Autism
ERIC Educational Resources Information Center
Sasson, Noah J.; Shasteen, Jonathon R.; Pinkham, Amy E.
2016-01-01
Typically-developing (TD) adults detect angry faces more efficiently within a crowd than non-threatening faces. Prior studies of this social threat superiority effect (TSE) in ASD using tasks consisting of schematic faces and homogeneous crowds have produced mixed results. Here, we employ a more ecologically-valid test of the social TSE and find…
Abstract: There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, a...
Determining procedures for simulation-based training in radiology: a nationwide needs assessment.
Nayahangan, Leizl Joy; Nielsen, Kristina Rue; Albrecht-Beste, Elisabeth; Bachmann Nielsen, Michael; Paltved, Charlotte; Lindorff-Larsen, Karen Gilboe; Nielsen, Bjørn Ulrik; Konge, Lars
2018-06-01
New training modalities such as simulation are widely accepted in radiology; however, development of effective simulation-based training programs is challenging. They are often unstructured and based on convenience or coincidence. The study objective was to perform a nationwide needs assessment to identify and prioritize technical procedures that should be included in a simulation-based curriculum. A needs assessment using the Delphi method was completed among 91 key leaders in radiology. Round 1 identified technical procedures that radiologists should learn. Round 2 explored frequency of the procedure, number of radiologists performing the procedure, risk and/or discomfort for patients, and feasibility for simulation. Round 3 was elimination and prioritization of procedures. Response rates were 67%, 70% and 66%, respectively. In Round 1, 22 technical procedures were included. Round 2 resulted in pre-prioritization of procedures. In Round 3, 13 procedures were included in the final prioritized list. The three most highly prioritized procedures were ultrasound-guided (US) histological biopsy and fine-needle aspiration, US-guided needle puncture and catheter drainage, and basic abdominal ultrasound. A needs assessment identified and prioritized 13 technical procedures to include in a simulation-based curriculum. The list may be used as a guide for the development of training programs. • Simulation-based training can supplement training on patients in radiology. • Development of simulation-based training should follow a structured approach. • The CAMES Needs Assessment Formula explores needs for simulation training. • A national Delphi study identified and prioritized procedures suitable for simulation training. • The prioritized list serves as a guide for the development of courses in radiology.
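The abstract names the CAMES Needs Assessment Formula but does not specify it, so the sketch below is only a hypothetical weighted aggregation of the round-2 criteria (frequency, number of performing radiologists, patient risk/discomfort, and simulation feasibility) into a single priority score; the weights and example ratings are invented for illustration and do not reproduce the study's actual calculation.

```python
from dataclasses import dataclass

# Hypothetical aggregation of Delphi round-2 ratings into a priority score.
# This is NOT the CAMES Needs Assessment Formula (named but not specified in
# the abstract); criterion names, weights, and ratings are illustrative only.

@dataclass
class ProcedureRatings:
    name: str
    frequency: float      # how often the procedure is performed (1-5)
    n_performers: float   # how many radiologists perform it (1-5)
    patient_risk: float   # risk/discomfort for patients (1-5)
    feasibility: float    # feasibility for simulation (1-5)

WEIGHTS = {"frequency": 0.3, "n_performers": 0.2,
           "patient_risk": 0.3, "feasibility": 0.2}

def priority_score(r: ProcedureRatings) -> float:
    """Weighted sum of the four round-2 criteria (higher = higher priority)."""
    return (WEIGHTS["frequency"] * r.frequency
            + WEIGHTS["n_performers"] * r.n_performers
            + WEIGHTS["patient_risk"] * r.patient_risk
            + WEIGHTS["feasibility"] * r.feasibility)

# Made-up example ratings, purely to show how the ranking step would work.
ratings = [
    ProcedureRatings("US-guided histological biopsy / FNA", 4.6, 4.2, 3.8, 4.5),
    ProcedureRatings("US-guided needle puncture and drainage", 4.1, 3.9, 4.0, 4.3),
    ProcedureRatings("Basic abdominal ultrasound", 4.8, 4.7, 2.5, 4.8),
]

for r in sorted(ratings, key=priority_score, reverse=True):
    print(f"{r.name}: {priority_score(r):.2f}")
```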
Booth, Chelsea L
2014-09-01
The Research Prioritization Task Force of the National Action Alliance for Suicide Prevention conducted a stakeholder survey including 716 respondents from 49 U.S. states and 18 foreign countries. The aim was to conduct a qualitative analysis of responses from individuals representing four main stakeholder groups: attempt and loss survivors, researchers, providers, and policy/administrators. This article focuses on a qualitative analysis of the early-round, open-ended responses collected in a modified online Delphi process and, as an illustration of the research method, on respondents' views of the role of life and emotional skills in suicide prevention. Content analysis was performed using both inductive and deductive development of codes and categories, together with systematic qualitative methods. After the inductive coding was completed, the same data set was re-coded using the 12 Aspirational Goals (AGs) identified by the Delphi process. Codes and thematic categories produced from the inductive coding process were, in some cases, very similar or identical to the 12 AGs (i.e., those dealing with risk and protective factors, provider training, preventing reattempts, and stigma). Other codes highlighted areas that were not identified as important in the Delphi process (e.g., cultural/social factors of suicide, substance use). Qualitative and mixed-methods research are essential to the future of suicide prevention work. By design, qualitative research is explorative and appropriate for complex, culturally embedded social issues such as suicide. Such research can be used to generate hypotheses for testing and, as in this analysis, illuminate areas that would be missed in an approach that imposed predetermined categories on data. Published by Elsevier Inc.
Evaluation of genome-wide association study results through development of ontology fingerprints
Tsoi, Lam C.; Boehnke, Michael; Klein, Richard L.; Zheng, W. Jim
2009-01-01
Motivation: Genome-wide association (GWA) studies may identify multiple variants that are associated with a disease or trait. To narrow down candidates for further validation, quantitatively assessing how identified genes relate to a phenotype of interest is important. Results: We describe an approach to characterize genes or biological concepts (phenotypes, pathways, diseases, etc.) by ontology fingerprint—the set of Gene Ontology (GO) terms that are overrepresented among the PubMed abstracts discussing the gene or biological concept together with the enrichment p-value of these terms generated from a hypergeometric enrichment test. We then quantify the relevance of genes to the trait from a GWA study by calculating similarity scores between their ontology fingerprints using enrichment p-values. We validate this approach by correctly identifying corresponding genes for biological pathways with a 90% average area under the ROC curve (AUC). We applied this approach to rank genes identified through a GWA study that are associated with the lipid concentrations in plasma as well as to prioritize genes within linkage disequilibrium (LD) block. We found that the genes with highest scores were: ABCA1, lipoprotein lipase (LPL) and cholesterol ester transfer protein, plasma for high-density lipoprotein; low-density lipoprotein receptor, APOE and APOB for low-density lipoprotein; and LPL, APOA1 and APOB for triglyceride. In addition, we identified genes relevant to lipid metabolism from the literature even in cases where such knowledge was not reflected in current annotation of these genes. These results demonstrate that ontology fingerprints can be used effectively to prioritize genes from GWA studies for experimental validation. Contact: zhengw@musc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19349285
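A rough Python sketch of the ontology-fingerprint idea is given below, assuming a fingerprint is a map from GO terms to hypergeometric enrichment p-values and that two fingerprints are compared by summing -log p contributions over shared terms; the exact scoring formula used by the authors may differ, and all inputs here are placeholders.

```python
import math
from scipy.stats import hypergeom

# Minimal sketch of the ontology-fingerprint idea (assumed details, not the
# authors' exact formulas): a fingerprint maps GO terms to hypergeometric
# enrichment p-values computed over PubMed abstracts mentioning an entity.

def enrichment_pvalue(k, n_entity, K, N):
    """P(X >= k) under a hypergeometric model, where
    k        = entity abstracts mentioning the GO term
    n_entity = abstracts about the entity
    K        = corpus abstracts mentioning the GO term
    N        = abstracts in the whole corpus
    """
    return hypergeom.sf(k - 1, N, K, n_entity)

def build_fingerprint(term_counts, n_entity, corpus_counts, N, alpha=0.05):
    """Keep GO terms whose enrichment p-value falls below alpha."""
    fingerprint = {}
    for term, k in term_counts.items():
        p = enrichment_pvalue(k, n_entity, corpus_counts[term], N)
        if p < alpha:
            fingerprint[term] = p
    return fingerprint

def fingerprint_similarity(fp_a, fp_b):
    """Illustrative score: sum of -log p over GO terms shared by both fingerprints."""
    shared = fp_a.keys() & fp_b.keys()
    return sum(-math.log(max(fp_a[t], 1e-300)) - math.log(max(fp_b[t], 1e-300))
               for t in shared)

def rank_candidates(trait_fp, gene_fps):
    """Rank candidate genes by similarity of their fingerprints to the trait's."""
    return sorted(gene_fps,
                  key=lambda g: fingerprint_similarity(trait_fp, gene_fps[g]),
                  reverse=True)
```

Used this way, genes within an LD block can be ordered by how strongly their literature-derived fingerprints overlap with the fingerprint of the trait of interest, which is the prioritization step the abstract describes.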
Arts and Humanities Research, Redefining Public Benefit, and Research Prioritization in Ireland
ERIC Educational Resources Information Center
Gibson, Andrew G.; Hazelkorn, Ellen
2017-01-01
This article looks at the effects of a national policy of research prioritization in the years following Ireland's economic crisis. A national research prioritization exercise initiated by policymakers redefined the purpose of higher education research, and designed policies in line with this approach. Placing research for enterprise to the fore,…
Multiple chemical regulatory bodies (US EPA, ECHA, OECD, Health Canada) are currently tasked with prioritizing chemicals for in-depth risk assessments. These prioritization efforts are driven by the fact that there are many chemicals in commerce, or in the environment for which d...
From Prioritizing Objects to Prioritizing Cues: A Developmental Shift for Cognitive Control
ERIC Educational Resources Information Center
Chevalier, Nicolas; Dauvier, Bruno; Blaye, Agnès
2018-01-01
Emerging cognitive control supports increasingly adaptive behaviors and predicts life success, while low cognitive control is a major risk factor during childhood. It is therefore essential to understand how it develops. The present study provides evidence for an age-related shift in the type of information that children prioritize in their…
The Effect of a Workload-Preview on Task-Prioritization and Task-Performance
ERIC Educational Resources Information Center
Minotra, Dev
2012-01-01
With the increased volume and sophistication of cyber attacks in recent years, maintaining situation awareness and an effective task-prioritization strategy is critical for cybersecurity analysts. However, the high mental workload associated with the analyst's task limits their ability to prioritize tasks.…
32 CFR Appendix A to Part 179 - Tables of the Munitions Response Site Prioritization Protocol
Code of Federal Regulations, 2010 CFR
2010-07-01
32 National Defense 1 (2010-07-01): Department of Defense, Closures and Realignment, Munitions Response Site Prioritization Protocol (MRSPP), Pt. 179, App. A. Appendix A to Part 179—Tables of the Munitions Response Site Prioritization Protocol. The tables...
32 CFR Appendix A to Part 179 - Tables of the Munitions Response Site Prioritization Protocol
Code of Federal Regulations, 2012 CFR
2012-07-01
32 National Defense 1 (2012-07-01): Department of Defense, Closures and Realignment, Munitions Response Site Prioritization Protocol (MRSPP), Pt. 179, App. A. Appendix A to Part 179—Tables of the Munitions Response Site Prioritization Protocol. The tables...
32 CFR Appendix A to Part 179 - Tables of the Munitions Response Site Prioritization Protocol
Code of Federal Regulations, 2014 CFR
2014-07-01
32 National Defense 1 (2014-07-01): Department of Defense, Closures and Realignment, Munitions Response Site Prioritization Protocol (MRSPP), Pt. 179, App. A. Appendix A to Part 179—Tables of the Munitions Response Site Prioritization Protocol. The tables...
32 CFR Appendix A to Part 179 - Tables of the Munitions Response Site Prioritization Protocol
Code of Federal Regulations, 2011 CFR
2011-07-01
32 National Defense 1 (2011-07-01): Department of Defense, Closures and Realignment, Munitions Response Site Prioritization Protocol (MRSPP), Pt. 179, App. A. Appendix A to Part 179—Tables of the Munitions Response Site Prioritization Protocol. The tables...
32 CFR Appendix A to Part 179 - Tables of the Munitions Response Site Prioritization Protocol
Code of Federal Regulations, 2013 CFR
2013-07-01
32 National Defense 1 (2013-07-01): Department of Defense, Closures and Realignment, Munitions Response Site Prioritization Protocol (MRSPP), Pt. 179, App. A. Appendix A to Part 179—Tables of the Munitions Response Site Prioritization Protocol. The tables...