Science.gov

Sample records for applying risk analysis

  1. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    PubMed

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-01-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares four risk analysis tools developed for confined spaces by applying them to three hazardous scenarios. The tools were: (1) a checklist without risk estimation (Tool A); (2) a checklist with a risk scale (Tool B); (3) a risk calculation without a formal hazard identification stage (Tool C); and (4) a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of the more analytic tools in less time. Their main limitations were a lack of contextual information for the identified hazards and a greater dependency on the user's expertise and ability to tackle hazards of different natures. Tools C and D used more systematic approaches than Tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of (1) its comprehensive structure with respect to the steps suggested in risk management, (2) its dynamic approach to hazard identification, and (3) its use of data resulting from the risk analysis.
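    A questionnaire followed by a risk matrix (Tool D) ultimately reduces to mapping likelihood and severity estimates onto a risk band. A minimal sketch of that estimation step is below; the 3x3 scale and the banding thresholds are illustrative assumptions, not taken from any of the four tools in the study.

```python
# Qualitative levels mapped to ordinal scores; purely illustrative.
LEVELS = {"low": 1, "medium": 2, "high": 3}

def risk_rating(likelihood: str, severity: str) -> str:
    """Map qualitative likelihood/severity estimates to a risk band
    via a simple product-score risk matrix."""
    score = LEVELS[likelihood] * LEVELS[severity]
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"
```

With this banding, a hazard rated high likelihood but low severity lands in the "medium" band, which is the kind of contextual output a bare checklist (Tool A) cannot provide.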

  2. A Hygrothermal Risk Analysis Applied to Residential Unvented Attics

    SciTech Connect

    Pallin, Simon B; Kehrer, Manfred

    2013-01-01

    An unvented attic is a common roof assembly in residential buildings in the United States. The expected hygrothermal performance and service life of such a roof are difficult to estimate because of a number of varying parameters. Parameters typically expected to vary are the climate, the direction and slope of the roof, and the radiation properties of the surface material. Further influential parameters are indoor moisture excess, air leakage through the attic floor, and leakage from the air-handling unit and ventilation ducts. In addition, the choice of building materials, such as the insulation material and closed- or open-cell spray polyurethane foam, will influence the future performance of the roof. Developing a simulation model of the roof assembly enables a risk and sensitivity analysis in which the varying parameters most important to hygrothermal performance can be determined. The model is designed to perform probabilistic simulations using mathematical and hygrothermal calculation tools. The varying input parameters can be drawn from existing measurements, simulations, or standards. An analysis is applied to determine the risk of consequences such as mold growth, rot, or increased energy demand of the HVAC unit. Furthermore, the future performance of the roof can be simulated in different climates to facilitate the design of an efficient and reliable roof construction with the most suitable technical solution, and to determine the most appropriate building materials for a given climate.

  3. Downside Risk analysis applied to the Hedge Funds universe

    NASA Astrophysics Data System (ADS)

    Perelló, Josep

    2007-09-01

    Hedge funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal hedge fund management requires appropriate risk metrics. Classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of hedge fund statistics. A possible way out of this problem, while keeping the simplicity of the CAPM, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures on the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
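    The distinction between good and bad returns relative to an investor's goal can be made concrete with two standard downside indicators, downside deviation and the Sortino ratio. This is a generic textbook sketch, not a reproduction of the paper's analytical results.

```python
import math

def downside_deviation(returns, target=0.0):
    """Root-mean-square of shortfalls below the investor's target return;
    returns above the target contribute nothing (they are 'good' returns)."""
    shortfalls = [min(r - target, 0.0) ** 2 for r in returns]
    return math.sqrt(sum(shortfalls) / len(returns))

def sortino_ratio(returns, target=0.0):
    """Mean excess return over the target, scaled by downside deviation
    instead of the full standard deviation used by the Sharpe ratio."""
    mean_excess = sum(r - target for r in returns) / len(returns)
    return mean_excess / downside_deviation(returns, target)
```

Because only sub-target returns enter the denominator, two funds with identical Sharpe ratios can have very different Sortino ratios when their return distributions are skewed, which is precisely the non-Gaussian effect the abstract highlights.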

  4. An advanced method for flood risk analysis in river deltas, applied to societal flood fatality risks in the Netherlands

    NASA Astrophysics Data System (ADS)

    de Bruijn, K. M.; Diermanse, F. L. M.; Beckers, J. V. L.

    2014-02-01

    This paper discusses a new method developed to analyse flood risks in river deltas. Risk analysis of river deltas is complex, because both storm surges and river discharges may cause flooding, and because the effect of upstream breaches on downstream water levels and flood risks must be taken into account. A Monte Carlo based flood risk analysis framework for policy making was developed, which considers both storm surges and river flood waves and includes the effects of hydrodynamic interaction on flood risks. It was applied to analyse societal flood fatality risks (the probability of events with more than N fatalities) in the Rhine-Meuse delta.

  5. An advanced method for flood risk analysis in river deltas, applied to societal flood fatality risk in the Netherlands

    NASA Astrophysics Data System (ADS)

    de Bruijn, K. M.; Diermanse, F. L. M.; Beckers, J. V. L.

    2014-10-01

    This paper discusses a new method for flood risk assessment in river deltas. Flood risk analysis of river deltas is complex, because both storm surges and river discharges may cause flooding and the effect of upstream breaches on downstream water levels and flood risk must be taken into account. This paper presents a Monte Carlo-based flood risk analysis framework for policy making, which considers both storm surges and river flood waves and includes effects from hydrodynamic interaction on flood risk. It was applied to analyse societal flood fatality risk in the Rhine-Meuse delta.
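    The Monte Carlo skeleton of such a framework can be caricatured in a few lines: simulate many synthetic years of the delta system and count how often the fatality total exceeds each threshold N. The event model below is a toy stand-in with made-up probabilities, not the hydrodynamic simulation used in the paper.

```python
import random

def societal_risk_curve(simulate_year, n_runs, thresholds, seed=0):
    """Estimate P(fatalities > N) per year by Monte Carlo.

    `simulate_year` stands in for one synthetic year of the delta system
    (surge, river wave, breach interaction); it returns that year's
    fatality count. The result is an FN-curve sampled at `thresholds`.
    """
    rng = random.Random(seed)
    counts = [simulate_year(rng) for _ in range(n_runs)]
    return {n: sum(c > n for c in counts) / n_runs for n in thresholds}

def toy_year(rng):
    """Toy stand-in: most years see no flood fatalities; a rare flood
    year (assumed 1% here) causes 100-1000 fatalities."""
    return rng.randint(100, 1000) if rng.random() < 0.01 else 0
```

The estimated exceedance probability is necessarily non-increasing in N, which is the defining property of a societal-risk (FN) curve.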

  6. Hazard analysis and critical control point systems applied to public health risks: the example of seafood.

    PubMed

    Williams, R A; Zorn, D J

    1997-08-01

    The authors describe the way in which the two components of risk analysis--risk assessment and risk management--can be used in conjunction with the hazard analysis and critical control points concept to determine the allocation of resources at potential critical control points. This approach is examined in the context of risks to human health associated with seafood, and in particular with regard to ciguatera poisoning.

  7. Synthesis of semantic modelling and risk analysis methodology applied to animal welfare.

    PubMed

    Bracke, M B M; Edwards, S A; Metz, J H M; Noordhuizen, J P T M; Algers, B

    2008-07-01

    Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally applied. Recently, a qualitative Risk Assessment approach has been published by the European Food Safety Authority (EFSA) for the first time, concerning the welfare of intensively reared calves. This paper reports on a critical analysis of this Risk Assessment (RA) approach from a semantic-modelling (SM) perspective, emphasizing the importance of several seemingly self-evident principles, including the definition of concepts, application of explicit methodological procedures and specification of how underlying values and scientific information lead to the RA output. In addition, the need to include positive aspects of welfare and overall welfare assessments are emphasized. The analysis shows that the RA approach for animal welfare could benefit from SM methodology to support transparent and science-based decision-making.

  8. Risk-informed criticality analysis as applied to waste packages subject to a subsurface igneous intrusion

    NASA Astrophysics Data System (ADS)

    Kimball, Darby Suzan

    Practitioners of many branches of nuclear facility safety use probabilistic risk assessment (PRA) methodology, which evaluates the reliability of a system along with the consequences of various failure states. One important exception is nuclear criticality safety, which traditionally produces binary results (critical or subcritical, based upon the value of the effective multiplication factor, keff). For complex systems, criticality safety can benefit from application of the more flexible PRA techniques. A new risk-based technique in criticality safety analysis is detailed. In addition to identifying the most reactive configuration(s) and determining subcriticality, it yields more information about the relative reactivity contributions of various factors. By analyzing a more complete system, confidence that the system will remain subcritical is increased and areas where additional safety features would be most effective are indicated. The first step in the method is to create a criticality event tree (a specialized form of event tree where multiple outcomes stemming from a single event are acceptable). The tree lists events that impact reactivity by changing a system parameter. Next, the value of keff is calculated for the end states using traditional methods such as the MCNP code. As calculations progress, the criticality event tree is modified; event branches demonstrated to have little effect on reactivity may be collapsed (thus reducing the total number of criticality runs), and branches may be added if more information is needed to characterize the system. When the criticality event tree is mature, critical limits are determined according to traditional validation techniques. Finally, results are evaluated. Criticality for the system is determined by comparing the value of keff for each end state to the critical limit derived for those cases. The relative contributions of various events to criticality are identified by comparing end states resulting from different events.

  9. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  10. Environmental-sanitary risk analysis procedure applied to artificial turf sports fields.

    PubMed

    Ruffino, Barbara; Fiore, Silvia; Zanetti, Maria Chiara

    2013-07-01

    Owing to the extensive use of artificial turfs worldwide, over the past 10 years there has been much discussion about the possible health and environmental problems originating from styrene-butadiene recycled rubber. In this paper, the authors performed a Tier 2 environmental-sanitary risk analysis on five artificial turf sports fields located in the city of Turin (Italy) with the aid of RISC4 software. Two receptors (adult player and child player) and three routes of exposure (direct contact with crumb rubber, contact with rainwater soaking the rubber mat, inhalation of dusts and gases from the artificial turf fields) were considered in the conceptual model. For all the fields and for all the routes, the cumulative carcinogenic risk proved to be lower than 10^-6 and the cumulative non-carcinogenic risk lower than 1. The outdoor inhalation of dusts and gases was the main route of exposure for both carcinogenic and non-carcinogenic substances. The results given by the inhalation pathway were compared with those of a risk assessment carried out on citizens breathing gases and dusts from traffic emissions every day in Turin. For both classes of substances and for both receptors, the inhalation of atmospheric dusts and gases from vehicular traffic gave risk values one order of magnitude higher than those due to playing soccer on an artificial field.
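    The screening comparison reported above boils down to multiplying a chronic exposure concentration by a toxicity factor per route and substance, then checking cumulative totals against the carcinogenic (10^-6) and non-carcinogenic (hazard index of 1) acceptability thresholds. This is a generic sketch of that arithmetic with unspecified units, not the RISC4 implementation.

```python
def carcinogenic_risk(concentration, unit_risk):
    """Incremental lifetime cancer risk for one substance and route:
    chronic air concentration times its unit risk factor."""
    return concentration * unit_risk

def hazard_quotient(concentration, reference_concentration):
    """Non-carcinogenic hazard quotient: exposure divided by the
    reference concentration for that substance."""
    return concentration / reference_concentration

def acceptable(cancer_risks, hazard_quotients):
    """Screening check against the thresholds used in the paper:
    cumulative cancer risk below 1e-6 and hazard index below 1."""
    return sum(cancer_risks) < 1e-6 and sum(hazard_quotients) < 1.0
```

Summing per-substance contributions before comparing against the thresholds is what makes the risk "cumulative" in the abstract's sense.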

  11. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    SciTech Connect

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights into the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
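    For context, the original Beta Factor method that this work extends splits a train's total failure probability into an independent fraction and a common-cause fraction that fails all trains at once. A textbook sketch of that baseline model (not the authors' improved model) for a one-out-of-n system:

```python
def beta_factor_unavailability(q_total, beta, n_trains):
    """One-out-of-n system failure probability under the classic
    Beta Factor model.

    q_total  - total failure probability of a single train
    beta     - fraction of failures assumed common-cause
    n_trains - number of redundant trains (1-out-of-n success logic)

    Independent failures must defeat all n trains; the common-cause
    contribution defeats them together, so it adds in directly.
    """
    q_independent = (1.0 - beta) * q_total
    q_common = beta * q_total
    return q_independent ** n_trains + q_common
```

Even a small beta dominates the result for redundant systems: with q_total = 1e-2 and beta = 0.1, the common-cause term (1e-3) is three orders of magnitude larger than the independent term, which is why the abstract calls single-parameter models conservative and motivates a more refined treatment.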

  12. A review of dendrogeomorphological research applied to flood risk analysis in Spain

    NASA Astrophysics Data System (ADS)

    Díez-Herrero, A.; Ballesteros, J. A.; Ruiz-Villanueva, V.; Bodoque, J. M.

    2013-08-01

    Over the last forty years, applying dendrogeomorphology to palaeoflood analysis has improved estimates of the frequency and magnitude of past floods worldwide. This paper reviews the main results obtained by applying dendrogeomorphology to flood research in several case studies in Central Spain. These dendrogeomorphological studies focused on the following topics: (1) anatomical analysis to understand the physiological response of trees to flood damage and improve sampling efficiency; (2) compiling robust flood chronologies in ungauged mountain streams; (3) determining flow depth and estimating flood discharge using two-dimensional hydraulic modelling, and comparing them with other palaeostage indicators; (4) calibrating hydraulic model parameters (i.e. Manning roughness); and (5) implementing stochastic-based, cost-benefit analysis to select optimal mitigation measures. The progress made in these areas is presented with suggestions for further research to improve the applicability of dendrogeomorphology to palaeoflood studies. Further developments will include new methods for better identification of the causes of specific types of flood damage to trees (e.g. tilted trees) and stable isotope analysis of tree rings to identify the climatic conditions associated with periods of increasing flood magnitude or frequency.

  13. Applying Multiple Criteria Decision Analysis to Comparative Benefit-Risk Assessment: Choosing among Statins in Primary Prevention.

    PubMed

    Tervonen, Tommi; Naci, Huseyin; van Valkenhoef, Gert; Ades, Anthony E; Angelis, Aris; Hillege, Hans L; Postmus, Douwe

    2015-10-01

    Decision makers in different health care settings need to weigh the benefits and harms of alternative treatment strategies. Such health care decisions include marketing authorization by regulatory agencies, practice guideline formulation by clinical groups, and treatment selection by prescribers and patients in clinical practice. Multiple criteria decision analysis (MCDA) is a family of formal methods that help make explicit the tradeoffs that decision makers accept between the benefit and risk outcomes of different treatment options. Despite the recent interest in MCDA, certain methodological aspects are poorly understood. This paper presents 7 guidelines for applying MCDA in benefit-risk assessment and illustrates their use in the selection of a statin drug for the primary prevention of cardiovascular disease. We provide guidance on the key methodological issues of how to define the decision problem, how to select a set of nonoverlapping evaluation criteria, how to synthesize and summarize the evidence, how to translate relative measures to absolute ones that permit comparisons between the criteria, how to define suitable scale ranges, how to elicit partial preference information from the decision makers, and how to incorporate uncertainty in the analysis. Our example on statins indicates that fluvastatin is likely to be the most preferred drug by our decision maker and that this result is insensitive to the amount of preference information incorporated in the analysis.
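    The tradeoff-weighting step at the core of MCDA is often a linear additive value model: each criterion's raw performance is mapped onto a 0-1 partial value scale and combined with elicited weights. The two criteria, value functions, and weights below are hypothetical placeholders, not the scales or preferences of the statin case study.

```python
def mcda_score(performance, weights, value_fns):
    """Linear additive MCDA value: sum over criteria of
    weight x partial value of the alternative's performance."""
    return sum(weights[c] * value_fns[c](performance[c]) for c in weights)

# Hypothetical benefit-risk criteria for one treatment alternative:
# 'benefit' counts events avoided per 1000 patients (0-100 scale),
# 'harm' counts adverse events per 1000 patients (0-10 scale, inverted).
value_fns = {
    "benefit": lambda x: x / 100.0,       # more avoided events -> higher value
    "harm":    lambda x: 1.0 - x / 10.0,  # more adverse events -> lower value
}
weights = {"benefit": 0.7, "harm": 0.3}   # elicited from the decision maker

score = mcda_score({"benefit": 60, "harm": 2}, weights, value_fns)
```

Translating relative effect measures onto absolute 0-1 scales like this is exactly the step the guidelines above flag as methodologically delicate, since the chosen scale ranges determine how much the weights actually matter.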

  14. Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  15. Analysis of agreement between cardiac risk stratification protocols applied to participants of a center for cardiac rehabilitation

    PubMed Central

    Santos, Ana A. S.; Silva, Anne K. F.; Vanderlei, Franciele M.; Christofaro, Diego G. D.; Gonçalves, Aline F. L.; Vanderlei, Luiz C. M.

    2016-01-01

    Background: Cardiac risk stratification is related to the risk of the occurrence of events induced by exercise. Despite the existence of several protocols for calculating risk stratification, studies indicating whether these protocols agree are still lacking. Objective: To evaluate the agreement between existing protocols for cardiac risk rating in cardiac patients. Method: The records of 50 patients from a cardiac rehabilitation program were analyzed, from which the following information was extracted: age, sex, weight, height, clinical diagnosis, medical history, risk factors, associated diseases, and the results of the most recent laboratory and complementary tests. This information was used to stratify the patients' risk according to the protocols of the American College of Sports Medicine, the Brazilian Society of Cardiology, the American Heart Association, the protocol designed by Frederic J. Pashkow, the American Association of Cardiovascular and Pulmonary Rehabilitation, the Société Française de Cardiologie, and the Sociedad Española de Cardiología. Descriptive statistics were used to characterize the sample, and agreement between the protocols was assessed using the Kappa coefficient, with a significance level of 5%. Results: Of the 21 agreement analyses, 12 showed significant agreement between the protocols used for risk classification, with nine classified as moderate and three as low. No agreements were classified as excellent. Different proportions were observed in each risk category, with significant differences between the protocols for all risk categories. Conclusion: Agreement between the protocols was low to moderate, and the risk proportions differed between protocols. PMID:27556385
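    The Kappa coefficient used for the pairwise agreement analyses can be computed directly from two protocols' risk classifications of the same patients. A minimal sketch of Cohen's kappa for two raters (the study itself compared seven protocols pairwise):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters (here, two risk
    stratification protocols) over the same set of patients."""
    n = len(ratings_a)
    # Observed proportion of patients classified identically.
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Agreement expected by chance from each rater's marginal frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1.0 - expected)
```

Values near 1 indicate excellent agreement and values near 0 indicate agreement no better than chance, which is the scale behind the study's "low" and "moderate" labels.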

  16. Municipal solid waste management health risk assessment from air emissions for China by applying life cycle analysis.

    PubMed

    Li, Hua; Nitivattananon, Vilas; Li, Peng

    2015-05-01

    This study quantifies and objectively evaluates the extent of the environmental health risks of three waste treatment options suggested by the national municipal solid waste management enhancing strategy (No. [2011] 9 of the State Council, promulgated on 19 April 2011), namely sanitary landfill, waste-to-energy incineration, and compost, together with a material recovery facility, through a case study in Zhangqiu City, China. It addresses potential chronic health risks from air emissions to residential receptors in the impacted area. It combines field survey, analogue survey, design document, and life cycle inventory methods in defining the source strength of chemicals of potential concern. The life cycle inventory and air dispersion modelling were performed with integrated waste management (IWM)-2 and the Screening Air Dispersion Model, Version 3.0 (SCREEN3). The health risk assessment follows the United States Environmental Protection Agency guidance Risk Assessment Guidance for Superfund (RAGS), Volume I: Human Health Evaluation Manual (Part F, Supplemental Guidance for Inhalation Risk Assessment). The exposure concentration is based on long-term exposure to the maximum ground-level contaminant concentration in air under 'reasonable worst situation' emissions, and is directly compared with the reference concentration and unit risk factor/cancer slope factor derived from the national air quality standard (for a conventional pollutant) and toxicological studies (for a specific pollutant). Results from this study suggest that compost with material recovery facility treatment may pose fewer negative health impacts than the other options; the sensitivity analysis shows that the landfill collection rate in the integrated waste management model has a great influence on the impact results. Further investigation is needed to validate or challenge the findings of this study.

  17. Risk analysis and international trade principles applied to the importation into Canada of caprine embryos from South Africa.

    PubMed

    Evans, B; Faul, A; Bielanski, A; Renwick, S; Van Derlinden, I

    1997-04-01

    Between November 1994 and February 1995 over nine thousand Boer goat embryos were imported into Canada from the Republic of South Africa. This substantial international movement of animal genetics via embryos was achieved through the application of the risk analysis principles prescribed in Section 1.4. of the International Animal Health Code of the Office International des Epizooties (OIE). Integral to the development of the health certification procedures was the application of the fundamental principles of non-discrimination, harmonisation, equivalence and transparency defined in the World Trade Organisation Agreement on Sanitary and Phytosanitary measures. Risk mitigation interventions were founded upon full consideration of the potential for disease transmission by animal embryos as espoused by the International Embryo Transfer Society and the relevant standards contained in Appendix 4.2.3.3. of the OIE International Animal Health Code. All the embryos imported into Canada were implanted into synchronised recipients on arrival. Twenty months later there has been no evidence of disease in either the recipient animals or the resulting animals born in Canada.

  18. Risk analysis before launch

    NASA Astrophysics Data System (ADS)

    Behlert, Rene

    1988-08-01

    A quality methodology based on risk analysis and the observation of technical facts is proposed. The procedures for quantifying a risk are described and examples are given. A closed-loop quality analysis is described, along with overall mission safety goals. The concept of maintenance is extended to evolutionary maintenance. It is shown that a large amount of data must be processed to apply the proposed methods, so the use of computer data processing is required.

  19. Applied Surface Analysis Workshop.

    DTIC Science & Technology

    1979-10-01

    Researchers in the field of surface analysis attended the Workshop; the list of participants follows. [Garbled list of participant names and affiliations, including Case Western Reserve University and the University of Dayton.]

  20. Exploring Students at Risk for Reading Comprehension Difficulties in South Korea: The RTI Approach Applying Latent Class Growth Analysis

    ERIC Educational Resources Information Center

    Kim, Dongil; Kim, Woori; Koh, Hyejung; Lee, Jaeho; Shin, Jaehyun; Kim, Heeju

    2014-01-01

    The purpose of this study was to identify students at risk of reading comprehension difficulties by using the responsiveness to intervention (RTI) approach. The participants were 177 students in Grades 1-3 in three elementary schools in South Korea. The students received Tier 1 instruction of RTI from March to May 2011, and their performance was…

  1. Risk Analysis

    NASA Technical Reports Server (NTRS)

    Morring, Frank, Jr.

    2004-01-01

    A National Academies panel says the Hubble Space Telescope is too valuable for gambling on a long-shot robotic mission to extend its service life, and urges a human servicing mission instead. Directly contradicting Administrator Sean O'Keefe, who killed a planned fifth shuttle servicing mission to the telescope on the grounds that it was too dangerous for a human crew in the post-Columbia environment, the expert committee found that upgrades to shuttle safety actually should make it less hazardous to fly to the telescope than it was before Columbia was lost. Risks of a telescope-servicing mission are only marginally greater than those of the planned missions to the International Space Station (ISS) that O'Keefe has authorized, the panel found. After comparing those risks to the dangers inherent in trying to develop a complex space robot in the 39 months remaining in the Hubble's estimated service life, the panel opted for the human mission to save "one of the major achievements of the American space program," in the words of Louis J. Lanzerotti, its chairman.

  2. The Relative Importance of the Vadose Zone in Multimedia Risk Assessment Modeling Applied at a National Scale: An Analysis of Benzene Using 3MRA

    NASA Astrophysics Data System (ADS)

    Babendreier, J. E.

    2002-05-01

    Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidable one, particularly in regulatory settings applied on a national scale. Quantitative assessment of uncertainty and sensitivity within integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a systematic, comparative approach coupled with sufficient computational power. The Multimedia, Multipathway, and Multireceptor Risk Assessment Model (3MRA) is an important code being developed by the United States Environmental Protection Agency for use in site-scale risk assessment (e.g. hazardous waste management facilities). The model currently entails over 700 variables, 185 of which are explicitly stochastic. The 3MRA starts with a chemical concentration in a waste management unit (WMU). It estimates the release and transport of the chemical throughout the environment, and predicts associated exposure and risk. The 3MRA simulates multimedia (air, water, soil, sediments) pollutant fate and transport, multipathway exposure routes (food ingestion, water ingestion, soil ingestion, air inhalation, etc.), multireceptor exposures (resident, gardener, farmer, fisher, ecological habitats and populations), and resulting risk (human cancer and non-cancer effects, ecological population and community effects). The 3MRA collates the output for an overall national risk assessment, offering a probabilistic strategy as a basis for regulatory decisions. To facilitate model execution of 3MRA for purposes of conducting uncertainty and sensitivity analysis, a PC-based supercomputer cluster was constructed. The design of SuperMUSE, a 125 GHz Windows-based supercomputer for Model Uncertainty and Sensitivity Evaluation, is described.

  3. Applying Decision Tree Analysis to Risk Factors Associated with Pressure Ulcers in Long-Term Care Facilities

    PubMed Central

    Moon, Mikyung

    2017-01-01

    Objectives The purpose of this study was to use decision tree analysis to explore the factors associated with pressure ulcers (PUs) among elderly people admitted to Korean long-term care facilities. Methods The data were extracted from the 2014 National Inpatient Sample (NIS)—data of Health Insurance Review and Assessment Service (HIRA). A MapReduce-based program was implemented to join and filter 5 tables of the NIS. The outcome predicted by the decision tree model was the prevalence of PUs as defined by the Korean Standard Classification of Disease-7 (KCD-7; code L89*). Using R 3.3.1, a decision tree was generated with the finalized 15,856 cases and 830 variables. Results The decision tree displayed 15 subgroups with 8 variables showing 0.804 accuracy, 0.820 sensitivity, and 0.787 specificity. The most significant primary predictor of PUs was length of stay less than 0.5 day. Other predictors were the presence of an infectious wound dressing, followed by having diagnoses numbering less than 3.5 and the presence of a simple dressing. Among diagnoses, “injuries to the hip and thigh” was the top predictor ranking 5th overall. Total hospital cost exceeding 2,200,000 Korean won (US $2,000) rounded out the top 7. Conclusions These results support previous studies that showed length of stay, comorbidity, and total hospital cost were associated with PUs. Moreover, wound dressings were commonly used to treat PUs. They also show that machine learning, such as a decision tree, could effectively predict PUs using big data. PMID:28261530
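    The core operation a decision tree like this one repeats recursively is a search for the feature/threshold split that most reduces class impurity. A minimal sketch of that single-split step using Gini impurity is below; the study itself fitted the tree in R 3.3.1 over 830 variables, and this toy learner is only meant to show the mechanism.

```python
def gini(labels):
    """Gini impurity of a set of binary labels (0 = no PU, 1 = PU)."""
    n = len(labels)
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(rows, labels):
    """Exhaustive search for the single (feature, threshold) split that
    minimises weighted Gini impurity; a full tree applies this step
    recursively to each resulting subgroup."""
    best = None
    for f in range(len(rows[0])):
        for t in sorted({r[f] for r in rows}):
            left = [l for r, l in zip(rows, labels) if r[f] < t]
            right = [l for r, l in zip(rows, labels) if r[f] >= t]
            if not left or not right:
                continue  # degenerate split, skip
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(rows)
            if best is None or score < best[0]:
                best = (score, f, t)
    return best
```

Numeric cut points like "length of stay less than 0.5 day" and "diagnoses numbering less than 3.5" in the results above are exactly the thresholds this kind of search produces.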

  4. Women in applied behavior analysis

    PubMed Central

    McSweeney, Frances K.; Donahoe, Patricia; Swindell, Samantha

    2000-01-01

    The status of women in applied behavior analysis was examined by comparing the participation of women in the Journal of Applied Behavior Analysis (JABA) to their participation in three similar journals. For all journals, the percentage of articles with at least one female author, the percentage of authors who are female, and the percentage of articles with a female first author increased from 1978 to 1997. Participation by women in JABA was equal to or greater than participation by women in the comparison journals. However, women appeared as authors on papers in special sections of Behavior Modification substantially more often when the editor was female than when the editor was male. In addition, female membership on the editorial boards of JABA, Behavior Modification, and Behaviour Research and Therapy failed to increase from 1978 to 1997. We conclude that a “glass ceiling” reduces the participation of women at the highest levels of applied behavior analysis and related fields. PMID:22478351

  5. Applying risk adjusted cost-effectiveness (RAC-E) analysis to hospitals: estimating the costs and consequences of variation in clinical practice.

    PubMed

    Karnon, Jonathan; Caffrey, Orla; Pham, Clarabelle; Grieve, Richard; Ben-Tovim, David; Hakendorf, Paul; Crotty, Maria

    2013-06-01

    Cost-effectiveness analysis is well established for pharmaceuticals and medical technologies but not for evaluating variations in clinical practice. This paper describes a novel methodology--risk adjusted cost-effectiveness (RAC-E)--that facilitates the comparative evaluation of applied clinical practice processes. In this application, risk adjustment is undertaken with a multivariate matching algorithm that balances the baseline characteristics of patients attending different settings (e.g., hospitals). Linked, routinely collected data are used to analyse patient-level costs and outcomes over a 2-year period, as well as to extrapolate costs and survival over patient lifetimes. The study reports the relative cost-effectiveness of alternative forms of clinical practice, including a full representation of the statistical uncertainty around the mean estimates. The methodology is illustrated by a case study that evaluates the relative cost-effectiveness of services for patients presenting with acute chest pain across the four main public hospitals in South Australia. The evaluation finds that services provided at two hospitals were dominated, and of the remaining services, the more effective hospital gained life years at a low mean additional cost and had an 80% probability of being the most cost-effective hospital at realistic cost-effectiveness thresholds. Potential determinants of the estimated variation in costs and effects were identified, although more detailed analyses to identify specific areas of variation in clinical practice are required to inform improvements at the less cost-effective institutions.

  6. Simulation Assisted Risk Assessment Applied to Launch Vehicle Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Go, Susie; Gee, Ken; Lawrence, Scott

    2008-01-01

    A simulation-based risk assessment approach is presented and is applied to the analysis of abort during the ascent phase of a space exploration mission. The approach utilizes groupings of launch vehicle failures, referred to as failure bins, which are mapped to corresponding failure environments. Physical models are used to characterize the failure environments in terms of the risk due to blast overpressure, resulting debris field, and the thermal radiation due to a fireball. The resulting risk to the crew is dynamically modeled by combining the likelihood of each failure, the severity of the failure environments as a function of initiator and time of the failure, the robustness of the crew module, and the warning time available due to early detection. The approach is shown to support the launch vehicle design process by characterizing the risk drivers and identifying regions where failure detection would significantly reduce the risk to the crew.
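
The aggregation step described above can be sketched as follows. This is an illustrative toy model, not NASA's actual simulation: each failure bin carries a hypothetical failure probability, a conditional loss probability given its failure environment, and a probability that early detection provides enough warning to abort.

```python
# Illustrative sketch: crew risk aggregated over failure bins, each mapped
# to a failure-environment severity and a detection/warning benefit.
def crew_risk(bins):
    """P(loss of crew) ~= sum over bins of
    P(failure) * P(loss | environment) * P(no timely warning)."""
    total = 0.0
    for b in bins:
        # Timely detection reduces exposure to the failure environment.
        mitigated = b["p_loss_given_failure"] * (1.0 - b["p_detect_in_time"])
        total += b["p_failure"] * mitigated
    return total

failure_bins = [  # hypothetical numbers for illustration only
    {"name": "blast overpressure", "p_failure": 0.005,
     "p_loss_given_failure": 0.8, "p_detect_in_time": 0.9},
    {"name": "debris field", "p_failure": 0.003,
     "p_loss_given_failure": 0.5, "p_detect_in_time": 0.7},
    {"name": "fireball radiation", "p_failure": 0.002,
     "p_loss_given_failure": 0.3, "p_detect_in_time": 0.5},
]
print(f"{crew_risk(failure_bins):.6f}")
```

Comparing this total across design variants is what lets such an analysis identify where failure detection most reduces risk.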

  7. FOOD RISK ANALYSIS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  8. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  9. Arctic Risk Management (ARMNet) Network: Linking Risk Management Practitioners and Researchers Across the Arctic Regions of Canada and Alaska To Improve Risk, Emergency and Disaster Preparedness and Mitigation Through Comparative Analysis and Applied Research

    NASA Astrophysics Data System (ADS)

    Garland, A.

    2015-12-01

    The Arctic Risk Management Network (ARMNet) was conceived as a trans-disciplinary hub to encourage and facilitate greater cooperation, communication and exchange among American and Canadian academics and practitioners actively engaged in the research, management and mitigation of risks, emergencies and disasters in the Arctic regions. Its aim is to assist regional decision-makers through the sharing of applied research and best practices and to support greater inter-operability and bilateral collaboration through improved networking, joint exercises, workshops, teleconferences, radio programs, and virtual communications (e.g., webinars). Most importantly, ARMNet is a clearinghouse for all information related to the management of the frequent hazards of Arctic climate and geography in North America, including new and emerging challenges arising from climate change, increased maritime polar traffic and expanding economic development in the region. ARMNet is an outcome of the Arctic Observing Network (AON) for Long Term Observations, Governance, and Management Discussions, www.arcus.org/search-program. The AON goals continue with CRIOS (www.ariesnonprofit.com/ARIESprojects.php) and coastal erosion research (www.ariesnonprofit.com/webinarCoastalErosion.php) led by the North Slope Borough Risk Management Office with assistance from ARIES (Applied Research in Environmental Sciences Nonprofit, Inc.). The constituency for ARMNet will include all northern academics and researchers, Arctic-based corporations, First Responders (FRs), Emergency Management Offices (EMOs) and Risk Management Offices (RMOs), military, Coast Guard, northern police forces, Search and Rescue (SAR) associations, boroughs, territories and communities throughout the Arctic. This presentation will be of interest to all those engaged in Arctic affairs; it will describe the genesis of ARMNet and present the results of stakeholder meetings and webinars designed to guide the next stages of the project.


  10. Budget Risk & Prioritization Analysis Tool

    SciTech Connect

    Carlos Castillo, Jerel Nelson

    2010-12-31

    BRPAtool performs the following:
    • Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate to meet constrained budgets, based on multiple risk factors
    • Enables analysis of different budget scenarios
    • Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks
    • Provides real-time analysis
    • Enables managers to determine the multipliers and where funding is best applied
    • Promotes solid budget defense
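
One simple way a tool of this kind might rank scope items under a constrained budget is a greedy risk-per-dollar ordering. This is a hypothetical sketch for illustration, not BRPAtool's documented algorithm, and the activity names, costs, and risk scores are invented.

```python
def prioritize(activities, budget):
    """Greedy sketch: fund activities with the highest risk-reduction per
    dollar until the budget is spent; unfunded items become cut candidates."""
    ranked = sorted(activities, key=lambda a: a["risk"] / a["cost"], reverse=True)
    funded, remaining = [], budget
    for act in ranked:
        if act["cost"] <= remaining:
            funded.append(act["name"])
            remaining -= act["cost"]
    return funded

acts = [  # hypothetical scope items with management-assigned risk scores
    {"name": "maintenance", "cost": 40, "risk": 80},
    {"name": "training", "cost": 20, "risk": 30},
    {"name": "upgrade", "cost": 50, "risk": 45},
]
print(prioritize(acts, budget=70))
```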

  11. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  12. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  13. SU-E-T-128: Applying Failure Modes and Effects Analysis to a Risk-Based Quality Management for Stereotactic Radiosurgery in Brazil

    SciTech Connect

    Teixeira, F; Almeida, C de; Huq, M

    2015-06-15

    Purpose: The goal of the present work was to evaluate the process maps for stereotactic radiosurgery (SRS) treatment at three radiotherapy centers in Brazil and apply the FMEA technique to evaluate similarities and differences, if any, of the hazards and risks associated with these processes. Methods: A team, consisting of professionals from different disciplines involved in SRS treatment, was formed at each center. Each team was responsible for the development of the process map and the performance of FMEA and FTA. A facilitator knowledgeable in these techniques led the work at each center. The TG-100 recommended scales were used for the evaluation of hazard and severity for each step of the major process, "treatment planning". Results: The hazard index given by the Risk Priority Number (RPN) was found to range from 4–270 for the various processes, and the severity (S) index was found to range from 1–10. RPN values > 100 and severity values ≥ 7 were chosen to flag safety improvement interventions. The numbers of steps with RPN ≥ 100 were found to be 6, 59 and 45 for the three centers. The corresponding values for S ≥ 7 were 24, 21 and 25, respectively. The ranges of RPN and S values for each center belong to different process steps and failure modes. Conclusion: These results show that the interventions needed to improve safety differ for each center and are associated with the skill level of the professional team as well as the technology used to provide radiosurgery treatment. The present study will very likely be a model for the implementation of a risk-based prospective quality management program for SRS treatment in Brazil, where currently 28 radiotherapy centers perform SRS. A complete FMEA for SRS at these three radiotherapy centers is currently under development.
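
The RPN arithmetic behind these numbers is straightforward: severity, occurrence, and detectability are each scored on a 1–10 scale and multiplied. The sketch below uses the flagging thresholds quoted in the abstract (RPN above 100 or severity at least 7, treated here as either/or); the two failure modes are hypothetical examples, not taken from the study.

```python
def rpn(severity, occurrence, detectability):
    """TG-100-style Risk Priority Number: RPN = S * O * D,
    each factor scored on a 1-10 scale."""
    return severity * occurrence * detectability

def needs_intervention(s, o, d, rpn_cut=100, sev_cut=7):
    """Flag a failure mode when RPN > 100 or severity >= 7,
    the thresholds used in the study above."""
    return rpn(s, o, d) > rpn_cut or s >= sev_cut

# Hypothetical failure modes from an SRS treatment-planning process map
modes = [("wrong CT dataset", 8, 2, 4), ("beam misconfigured", 6, 3, 5)]
for name, s, o, d in modes:
    print(name, rpn(s, o, d), needs_intervention(s, o, d))
```

Note that a mode can be flagged on severity alone even with a modest RPN, which is why the counts for S ≥ 7 and RPN ≥ 100 differ at each center.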

  14. Evaluation of Cardiovascular Risk Scores Applied to NASA's Astronaut Corps

    NASA Technical Reports Server (NTRS)

    Jain, I.; Charvat, J. M.; VanBaalen, M.; Lee, L.; Wear, M. L.

    2014-01-01

    In an effort to improve cardiovascular disease (CVD) risk prediction, this analysis evaluates and compares the applicability of multiple CVD risk scores to the NASA Astronaut Corps which is extremely healthy at selection.

  15. Applying a weed risk assessment approach to GM crops.

    PubMed

    Keese, Paul K; Robold, Andrea V; Myers, Ruth C; Weisman, Sarah; Smith, Joe

    2014-12-01

    Current approaches to environmental risk assessment of genetically modified (GM) plants are modelled on chemical risk assessment methods, which have a strong focus on toxicity. There are additional types of harms posed by plants that have been extensively studied by weed scientists and incorporated into weed risk assessment methods. Weed risk assessment uses robust, validated methods that are widely applied to regulatory decision-making about potentially problematic plants. They are designed to encompass a broad variety of plant forms and traits in different environments, and can provide reliable conclusions even with limited data. The knowledge and experience that underpin weed risk assessment can be harnessed for environmental risk assessment of GM plants. A case study illustrates the application of the Australian post-border weed risk assessment approach to a representative GM plant. This approach is a valuable tool to identify potential risks from GM plants.

  16. Is risk analysis scientific?

    PubMed

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part).

  17. Translational benchmark risk analysis

    PubMed Central

    Piegorsch, Walter W.

    2010-01-01

    Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283
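
The benchmark paradigm discussed above inverts a fitted dose-response curve to find the dose producing a specified benchmark response (BMR) in extra risk. As an illustration under one common quantal model, the quantal-linear form P(d) = γ + (1 − γ)(1 − e^(−βd)) gives extra risk R(d) = 1 − e^(−βd) and a closed-form benchmark dose; the slope β used here is a hypothetical fitted value, not from the article.

```python
import math

def extra_risk(dose, gamma, beta):
    """Quantal-linear model P(d) = gamma + (1 - gamma)(1 - exp(-beta d));
    extra risk R(d) = (P(d) - P(0)) / (1 - P(0)) = 1 - exp(-beta d)."""
    p0 = gamma
    pd = gamma + (1 - gamma) * (1 - math.exp(-beta * dose))
    return (pd - p0) / (1 - p0)

def benchmark_dose(bmr, beta):
    """Dose at which extra risk equals the benchmark response BMR
    (closed form for the quantal-linear model)."""
    return -math.log(1 - bmr) / beta

bmd = benchmark_dose(bmr=0.10, beta=0.05)  # hypothetical fitted slope
print(round(bmd, 3))
```

The same inversion applies to non-toxicological endpoints, which is the translational point of the article: only the dose-response model and the definition of "response" change.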

  18. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.
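
Elicitation and encoding as described above typically turn an expert's low/most-likely/high estimates into a distribution (the triangular is a common choice) and then propagate them by Monte Carlo. The sketch below is a generic illustration of that pattern, not one of the handbook's twelve approaches specifically, and the cost figures are invented.

```python
import random

def program_cost_risk(wbs, n=20_000, seed=1):
    """Monte Carlo sketch: each work element's cost is an expert-elicited
    triangular(low, mode, high) distribution; totals are sampled to
    estimate a budget percentile."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in wbs)
        for _ in range(n)
    )
    return totals[int(0.8 * n)]  # 80th-percentile total cost

# Hypothetical (low, mode, high) estimates, in $M, for three elements
elements = [(10, 12, 18), (5, 6, 9), (20, 25, 40)]
p80 = program_cost_risk(elements)
print(round(p80, 1))
```

The gap between the sum of the modes and the 80th-percentile total is exactly the kind of risk margin such an analysis surfaces for a program manager.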

  19. Risk analysis and meat hygiene.

    PubMed

    Hathaway, S C

    1993-12-01

    Meat hygiene consists of three major activities: post-mortem inspection; monitoring and surveillance for chemical hazards; and maintenance of good hygienic practice throughout all stages between slaughter and consumption of meat. Risk analysis is an applied science of increasing importance to these activities in the following areas: facilitating the distribution of pre-harvest, harvest and post-harvest inspection resources, proportional to the likelihood of public health and animal health hazards; establishing internationally-harmonized standards and specifications which are consistent and science-based; and improving the safety and wholesomeness of meat and meat products in local and international trade. Risk analysis, in one form or another, is well developed with respect to establishing standards and specifications for chemical hazards; methods for risk analysis of post-mortem meat inspection programmes are beginning to emerge. However, risk analysis of microbiological hazards in meat and meat products presents particular difficulties. All areas of application currently suffer from a lack of international agreement on risk assessment and risk management methodology.

  20. Applying evolutionary genetics to developmental toxicology and risk assessment.

    PubMed

    Leung, Maxwell C K; Procter, Andrew C; Goldstone, Jared V; Foox, Jonathan; DeSalle, Robert; Mattingly, Carolyn J; Siddall, Mark E; Timme-Laragy, Alicia R

    2017-03-04

    Evolutionary thinking continues to challenge our views on health and disease. Yet, there is a communication gap between evolutionary biologists and toxicologists in recognizing the connections among developmental pathways, high-throughput screening, and birth defects in humans. To increase our capability in identifying potential developmental toxicants in humans, we propose to apply evolutionary genetics to improve the experimental design and data interpretation with various in vitro and whole-organism models. We review five molecular systems of stress response and update 18 consensual cell-cell signaling pathways that are the hallmark for early development, organogenesis, and differentiation; and revisit the principles of teratology in light of recent advances in high-throughput screening, big data techniques, and systems toxicology. Multiscale systems modeling plays an integral role in the evolutionary approach to cross-species extrapolation. Phylogenetic analysis and comparative bioinformatics are both valuable tools in identifying and validating the molecular initiating events that account for adverse developmental outcomes in humans. The discordance of susceptibility between test species and humans (ontogeny) reflects their differences in evolutionary history (phylogeny). This synthesis not only can lead to novel applications in developmental toxicity and risk assessment, but also can pave the way for applying an evo-devo perspective to the study of developmental origins of health and disease.

  1. How to ensure that the results of climate risk analysis make a difference? - Experience from applied research addressing the challenges of climate change

    NASA Astrophysics Data System (ADS)

    Schneiderbauer, Stefan; Zebisch, Marc; Becker, Daniel; Pedoth, Lydia; Renner, Kathrin; Kienberger, Stefan

    2016-04-01

    Changing climate conditions may have beneficial or adverse effects on the social-ecological systems we live in. In any case, the possible effects result from complex and interlinked physical and social processes embedded in these systems. Traditional research addresses these bio-physical and societal issues separately. Therefore, in general, studies on risks related to climate change are still mono-disciplinary in nature, with an increasing amount of work following a multi-disciplinary approach. The quality and usefulness of the results of such research for policy or decision making in practice may further be limited by study designs that do not appropriately acknowledge the significance of integrating, or at least mixing, qualitative and quantitative information and knowledge. Finally, the acceptance of study results - particularly when they contain some kind of assessment - is often endangered by insufficient and/or late involvement of stakeholders and users. The above-mentioned limitations have often been brought up in the recent past. However, although a certain consensus has emerged in recent years on the need to tackle these issues, little progress has been made in terms of implementation within the context of (research) studies. This paper elaborates in detail on the reasons that hamper the application of - interdisciplinary (i.e. natural and social science), - trans-disciplinary (i.e. co-production of knowledge) and - integrative (i.e. combining qualitative and quantitative approaches) work. It is based on the experience gained through a number of applied climate change vulnerability studies carried out within the context of various GIZ-financed development cooperation projects, a consultancy project for the German Environment Agency, as well as the workshop series INQUIMUS, which tackles particularly the issues of mixing qualitative and quantitative research approaches. Potentials and constraints of possible attempts for

  2. DWPF risk analysis summary

    SciTech Connect

    Shedrow, C.B.

    1990-10-01

    This document contains selected risk analysis data from Chapter 9 (Safety Analysis) of the Defense Waste Processing Facility Safety Analysis Report (DWPF SAR) and draft Addendum 1 to the Waste Tank Farms SAR. Although these data may be revised prior to finalization of the draft SAR and the draft addendum, they are presently the best available information and were therefore used in preparing the risk analysis portion of the DWPF Environmental Analysis (DWPF EA). This information has been extracted from those draft documents and approved under separate cover so that it can be used as reference material for the DWPF EA when it is placed in the public reading rooms. 9 refs., 4 tabs.

  3. The Andrews’ Principles of Risk, Need, and Responsivity as Applied in Drug Abuse Treatment Programs: Meta-Analysis of Crime and Drug Use Outcomes

    PubMed Central

    Prendergast, Michael L.; Pearson, Frank S.; Podus, Deborah; Hamilton, Zachary K.; Greenwell, Lisa

    2013-01-01

    Objectives The purpose of the present meta-analysis was to answer the question: Can the Andrews principles of risk, needs, and responsivity, originally developed for programs that treat offenders, be extended to programs that treat drug abusers? Methods Drawing from a dataset that included 243 independent comparisons, we conducted random-effects meta-regression and ANOVA-analog meta-analyses to test the Andrews principles by averaging crime and drug use outcomes over a diverse set of programs for drug abuse problems. Results For crime outcomes, in the meta-regressions the point estimates for each of the principles were substantial, consistent with previous studies of the Andrews principles. There was also a substantial point estimate for programs exhibiting a greater number of the principles. However, almost all of the 95% confidence intervals included the zero point. For drug use outcomes, in the meta-regressions the point estimates for each of the principles were approximately zero; however, the point estimate for programs exhibiting a greater number of the principles was somewhat positive. All of the estimates for the drug use principles had confidence intervals that included the zero point. Conclusions This study supports previous findings from primary research studies targeting the Andrews principles that those principles are effective in reducing crime outcomes, here in meta-analytic research focused on drug treatment programs. By contrast, programs that follow the principles appear to have very little effect on drug use outcomes. Primary research studies that experimentally test the Andrews principles in drug treatment programs are recommended. PMID:24058325
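
The random-effects machinery underlying such analyses can be sketched with the standard DerSimonian-Laird estimator: study effects are pooled with weights that incorporate a between-study variance τ². This is an illustration of the general method, not the authors' meta-regression, and the effect sizes and variances below are invented.

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) with a 95% CI.
    Illustrative sketch of standard random-effects meta-analysis."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)          # between-study variance
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical study effect sizes (e.g., log odds ratios) and variances
est, ci = dersimonian_laird([0.3, 0.1, 0.4, 0.0], [0.02, 0.03, 0.05, 0.04])
print(round(est, 3))
```

Whether the resulting confidence interval includes zero is exactly the criterion the abstract invokes when reporting that most intervals "included the zero point".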

  4. Assessment of cardiovascular risk in Tunisia: applying the Framingham risk score to national survey data

    PubMed Central

    Saidi, O; Malouche, D; O'Flaherty, M; Ben Mansour, N; A Skhiri, H; Ben Romdhane, H; Bezdah, L

    2016-01-01

    Objective This paper aims to assess the socioeconomic determinants of a high 10 year cardiovascular risk in Tunisia. Setting We used a national population based cross sectional survey conducted in 2005 in Tunisia comprising 7780 subjects. We applied the non-laboratory version of the Framingham equation to estimate the 10 year cardiovascular risk. Participants 8007 participants, aged 35–74 years, were included in the sample but effective exclusion of individuals with cardiovascular diseases and cancer resulted in 7780 subjects (3326 men and 4454 women) included in the analysis. Results Mean age was 48.7 years. Women accounted for 50.5% of participants. According to the Framingham equation, 18.1% (17.25–18.9%) of the study population had a high risk (≥20% within 10 years). The gender difference was striking and statistically significant: 27.2% (25.7–28.7%) of men had a high risk, threefold higher than women (9.7%; 8.8–10.5%). A higher 10 year global cardiovascular risk was associated with social disadvantage in men and women; thus illiterate and divorced individuals, and adults without a professional activity had a significantly higher risk of developing a cardiovascular event in 10 years. Illiterate men were at higher risk than those with secondary and higher education (OR=7.01; 5.49 to 9.14). The risk in illiterate women was more elevated (OR=13.57; 7.58 to 24.31). Those living in an urban area had a higher risk (OR=1.45 (1.19 to 1.76) in men and OR=1.71 (1.35 to 2.18) in women). Conclusions The 10 year global cardiovascular risk in the Tunisian population is already substantially high, affecting almost a third of men and 1 in 10 women, and concentrated in those more socially disadvantaged. PMID:27903556
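
The "threefold higher" gender contrast reported above can be expressed as an unadjusted odds ratio computed directly from the two prevalences (27.2% of men vs. 9.7% of women at high risk); the adjusted ORs quoted in the abstract come from regression models and will differ.

```python
def odds_ratio(p1, p2):
    """Odds ratio between two prevalences, as used to compare
    high-risk proportions between groups."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Proportions with 10-year CVD risk >= 20% reported above:
# 27.2% of men vs 9.7% of women
print(round(odds_ratio(0.272, 0.097), 2))  # -> 3.48
```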

  5. The basic importance of applied behavior analysis

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1986-01-01

    We argue that applied behavior analysis is relevant to basic research. Modification studies, and a broad range of investigations that focus on the precipitating and maintaining conditions of socially significant human behavior, have basic importance. Applied behavior analysis may aid basic researchers in the design of externally valid experiments and thereby enhance the theoretical significance of basic research for understanding human behavior. Applied research with humans, directed at culturally-important problems, will help to propagate the science of human behavior. Such a science will also be furthered by analogue experiments that model socially important behavior. Analytical-applied studies and analogue experiments are forms of applied behavior analysis that could suggest new environment-behavior relationships. These relationships could lead to basic research and principles that further the prediction, control, and understanding of behavior. PMID:22478650

  6. Risks, scientific uncertainty and the approach of applying precautionary principle.

    PubMed

    Lo, Chang-fa

    2009-03-01

    The paper intends to clarify the nature and aspects of risks and scientific uncertainty, and to elaborate the approach of applying the precautionary principle for the purpose of handling the risk arising from scientific uncertainty. It explains the relations between risks and the application of the precautionary principle at the international and domestic levels. Both in situations where an international treaty has admitted the precautionary principle and in situations where there is no international treaty admitting the precautionary principle or enumerating the conditions for taking measures, the precautionary principle has a role to play. The paper proposes a decision-making tool, containing questions to be asked, to help policymakers apply the principle. It also proposes a "weighing and balancing" procedure to help them decide the contents of the measure to cope with the potential risk and to avoid excessive measures.

  7. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.

  8. Applied Behavior Analysis in Flying Training Research.

    DTIC Science & Technology

    1980-01-01

    often referred to as behavior modification ) which promotes improvements in human learning through an analysis of the contingencies surrounding a...Company, in press. Bandura, A. Principles of behavior modification . New York: Holt, Rinehart & Winston, 1969. Bostow, D.E., & Bailey, J.S. Modification of...tutors for kindergarten children. Journal of Applied Behavior Analysis, 1974, 7, 223-232. Kazdin, A.E. Behavior modification in applied settings

  9. Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.

    PubMed

    Reyes Santos, Joost; Haimes, Yacov Y

    2004-06-01

    The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz a 1990 Nobel Prize in Economics. A typical approach to measuring a portfolio's expected return is based on the historical returns of the assets included in the portfolio. Portfolio risk, on the other hand, is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility is a major measure of risk owing to its simplicity and validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that of October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attacks of September 11, 2001, which led to a four-day suspension of trading on the New York Stock Exchange (NYSE), are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm. Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model.
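
The f(4) metric defined above is a conditional expectation over the lower tail of the return distribution. A minimal empirical sketch (with invented return samples, not the article's data): take the worst α fraction of sampled returns and average them.

```python
def f4(returns, alpha=0.05):
    """PMRM-style extreme-risk metric: conditional expectation of portfolio
    return over the lower tail (worst alpha fraction of outcomes)."""
    tail = sorted(returns)[:max(1, int(len(returns) * alpha))]
    return sum(tail) / len(tail)

# Hypothetical sampled portfolio returns, mostly benign with a few crashes
rets = [0.08, 0.05, 0.07, -0.02, 0.04, 0.06, -0.30, 0.03, 0.05, 0.02,
        0.06, 0.01, -0.25, 0.04, 0.07, 0.05, 0.03, 0.06, 0.02, 0.05]
print(f4(rets, alpha=0.10))
```

Unlike volatility, which averages over all fluctuations, this statistic is driven entirely by the crash outcomes, which is the article's motivation for pairing it with expected return in a multiobjective formulation.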

  10. Competing Uses of Underground Systems Related to Energy Supply: Applying Single- and Multiphase Simulations for Site Characterization and Risk-Analysis

    NASA Astrophysics Data System (ADS)

    Kissinger, A.; Walter, L.; Darcis, M.; Flemisch, B.; Class, H.

    2012-04-01

    Global climate change, shortage of resources and the resulting turn towards renewable sources of energy lead to a growing demand for the utilization of subsurface systems. Among these competing uses are Carbon Capture and Storage (CCS), geothermal energy, nuclear waste disposal, "renewable" methane or hydrogen storage, as well as the ongoing production of fossil resources like oil, gas, and coal. Besides competing among themselves, these technologies may also create conflicts with essential public interests like water supply. For example, the injection of CO2 into the underground causes an increase in pressure reaching far beyond the actual radius of influence of the CO2 plume, potentially leading to large amounts of displaced salt water. Finding suitable sites is a demanding task for several reasons. Natural systems, as opposed to technical systems, are always characterized by heterogeneity. Therefore, parameter uncertainty impedes reliable predictions of the capacity and safety of a site. State-of-the-art numerical simulations combined with stochastic approaches need to be used to obtain a more reliable assessment of the involved risks and the radii of influence of the different processes. These simulations may include the modeling of single- and multiphase non-isothermal flow, and geo-chemical and geo-mechanical processes, in order to describe all relevant physical processes adequately. Stochastic approaches aim to estimate a bandwidth of the key output parameters based on uncertain input parameters. Risks for these different underground uses can then be made comparable with each other. Along with the importance and urgency of the competing processes, this may lead to a more profound basis for a decision. Communicating risks to stakeholders and a concerned public is crucial for the success of finding a suitable site for CCS (or other subsurface utilization). We present and discuss first steps towards an approach for addressing the issue of competitive

  11. Total Risk Approach in Applying PRA to Criticality Safety

    SciTech Connect

    Huang, S T

    2005-03-24

    As the nuclear industry continues marching from expert-based support to more procedure-based support, it is important to revisit the total risk concept in criticality safety. A key objective of criticality safety is to minimize total criticality accident risk. The purpose of this paper is to assess key constituents of the total risk concept pertaining to criticality safety from an operations support perspective and to suggest a risk-informed means of utilizing criticality safety resources for minimizing total risk. A PRA methodology was used to assist this assessment. The criticality accident history was assessed to provide a framework for our evaluation. In supporting operations, the work of criticality safety engineers ranges from knowing the scope and configurations of a proposed operation, performing criticality hazards assessments to derive effective controls, and assisting in training operators, to responding to floor questions, conducting surveillance to ensure implementation of criticality controls, and responding to criticality mishaps. In a compliance environment, the resource of criticality safety engineers is increasingly being directed towards tedious documentation effort to meet some regulatory requirements, to the effect of weakening the floor support for criticality safety. By applying a fault tree model to identify the major contributors of criticality accidents, a total risk picture is obtained to address the relative merits of various actions. Overall, human failure is the key culprit in causing criticality accidents. Factors such as failure to follow procedures, lack of training, and lack of expert support at the floor level are main contributors. Other causes may include lack of effective criticality controls, such as inadequate criticality safety evaluation. Not all of the causes are equally important in contributing to criticality mishaps. Applying the limited resources to strengthen the weak links would reduce risk more than continuing emphasis on the strong links of
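    The fault tree idea described in this abstract can be sketched with minimal OR/AND gate combinators that make the relative weight of contributors explicit. Every probability and event name below is hypothetical, chosen only to mirror the contributors the abstract lists:

```python
def or_gate(probs):
    # P(at least one event occurs), assuming independent basic events
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    # P(all events occur), assuming independent basic events
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical per-operation probabilities for basic events
p_procedure_violation   = 0.05
p_training_gap          = 0.02
p_no_floor_expert       = 0.03   # error goes uncaught without floor-level support
p_inadequate_evaluation = 0.01

# Human-failure branch: an error occurs AND it goes uncaught
p_human = and_gate([or_gate([p_procedure_violation, p_training_gap]),
                    p_no_floor_expert])
# Top event: criticality mishap from either branch
p_top = or_gate([p_human, p_inadequate_evaluation])
```

    Varying one basic-event probability at a time and recomputing the top event shows which "weak link" buys the most risk reduction per unit of resource.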

  12. Caldwell University's Department of Applied Behavior Analysis.

    PubMed

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis.

  13. Quantitative Microbial Risk Assessment Tutorial: Pour Point Analysis of Land-applied Microbial Loadings and Comparison of Simulated and Gaging Station Results

    EPA Science Inventory

    This tutorial demonstrates a pour point analysis: • Initiates execution of the SDMPB. • Navigates the SDMPB. • Chooses a pour point within a watershed, delineates the sub-area that contributes to that pour point, and collects data for it. • Considers land applicat...

  14. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  15. Defining applied behavior analysis: An historical analogy

    PubMed Central

    Deitz, Samuel M.

    1982-01-01

    This article examines two criteria for a definition of applied behavior analysis. The criteria are derived from a 19th century attempt to establish medicine as a scientific field. The first criterion, experimental determinism, specifies the methodological boundaries of an experimental science. The second criterion, philosophic doubt, clarifies the tentative nature of facts and theories derived from those facts. Practices which will advance the science of behavior are commented upon within each criterion. To conclude, the problems of a 19th century form of empiricism in medicine are related to current practices in applied behavior analysis. PMID:22478557

  16. Risk Analysis Virtual ENvironment

    SciTech Connect

    2014-02-10

    RAVEN has 3 major functionalities: 1. Provides a Graphical User Interface for the pre- and post-processing of the RELAP-7 input and output. 2. Provides the capability to model nuclear power plant control logic for the RELAP-7 code and dynamic control of the accident scenario evolution. This capability is based on a software structure that realizes a direct connection between the RELAP-7 solver engine (MOOSE) and a Python environment where the variables describing the plant status are accessible in a scripting environment. RAVEN supports the generation of the probabilistic scenario control by supplying a wide range of probability and cumulative distribution functions and their inverse functions. 3. Provides a general environment to perform probabilistic risk analysis for RELAP-7, RELAP-5 and any generic MOOSE-based applications. The probabilistic analysis is performed by sampling the input space of the coupled code parameters and is enhanced by using modern artificial intelligence algorithms that accelerate the identification of the areas of major risk (in the input parameter space). This environment also provides a graphical visualization capability to analyze the outcomes. Among other approaches, the classical Monte Carlo and Latin Hypercube sampling algorithms are available. For accelerating the convergence of the sampling methodologies, Support Vector Machines, Bayesian regression, and stochastic collocation polynomial chaos are implemented. The same methodologies described here could be used to solve optimization and uncertainty propagation problems using the RAVEN framework.
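    RAVEN couples these sampling methods to MOOSE-based solvers; as a stand-alone illustration of the Latin Hypercube idea the abstract mentions, here is a minimal pure-Python sketch (not RAVEN's implementation) that places exactly one stratified sample per interval in each dimension:

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    # one stratified draw per interval [i/n, (i+1)/n) in every dimension,
    # with the strata shuffled independently per dimension
    rnd = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        strata = [(i + rnd.random()) / n_samples for i in range(n_samples)]
        rnd.shuffle(strata)
        columns.append(strata)
    return list(zip(*columns))   # rows are sample points in the unit hypercube

pts = latin_hypercube(10, 2)
# every dimension hits each of the 10 strata exactly once
for d in range(2):
    assert sorted(int(p[d] * 10) for p in pts) == list(range(10))
```

    Compared with plain Monte Carlo, this stratification guarantees coverage of the whole input range with far fewer code runs, which is why it is a standard choice for sampling expensive coupled-code parameter spaces.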

  17. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    ERIC Educational Resources Information Center

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  18. Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  19. Applied Behavior Analysis as Technological Science.

    ERIC Educational Resources Information Center

    Iwata, Brian A.

    1991-01-01

    To the extent that applied behavior analysis represents a scientific and practical approach to the study of behavior, its technological character is essential. The most serious problem evident in the field is not that the research being done is too technical but that more good research of all types is needed. (JDD)

  20. Applying RESRAD-CHEM for chemical risk assessment

    SciTech Connect

    Cheng, J.J.; Yu, C.

    1995-07-01

    RESRAD-CHEM is a multiple pathway analysis computer code to evaluate chemically contaminated sites; it was developed at Argonne National Laboratory for the US Department of Energy. The code is designed to predict human health risks from exposure to hazardous chemicals and to derive cleanup criteria for chemically contaminated soils. It consists of environmental fate and transport models and is capable of predicting chemical concentrations over time in different environmental media. The methodology used in RESRAD-CHEM for exposure assessment and risk characterization follows the US Environmental Protection Agency's guidance on Human Health Evaluation for Superfund. A user-friendly interface is incorporated for entering data, operating the code, and displaying results. RESRAD-CHEM is easy to use and is a powerful tool to assess chemical risk from environmental exposure.

  1. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  2. Security risk assessment: applying the concepts of fuzzy logic.

    PubMed

    Bajpai, Shailendra; Sachdeva, Anish; Gupta, J P

    2010-01-15

    Chemical process industries (CPI) handling hazardous chemicals in bulk can be attractive targets for deliberate adversarial actions by terrorists, criminals and disgruntled employees. It is therefore imperative to have a comprehensive security risk management programme, including effective security risk assessment techniques. In an earlier work, it has been shown that security risk assessment can be done by conducting threat and vulnerability analysis or by developing a Security Risk Factor Table (SRFT). HAZOP-type vulnerability assessment sheets can be developed that are scenario based. In the SRFT model, important security risk bearing factors such as location, ownership, visibility, inventory, etc., have been used. In this paper, the earlier developed SRFT model has been modified using the concepts of fuzzy logic. In the modified SRFT model, two linguistic fuzzy scales (three-point and four-point) are devised based on trapezoidal fuzzy numbers. Human subjectivity of the different experts associated with the previous SRFT model is tackled by mapping their scores to the newly devised fuzzy scale. Finally, the fuzzy score thus obtained is defuzzified to get the results. A test case of a refinery is used to explain the method and compared with the earlier work.
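    The paper's exact fuzzy scales are not reproduced here, but the general mechanics it describes, mapping linguistic ratings to trapezoidal fuzzy numbers, aggregating them, and defuzzifying by centroid, can be sketched as follows. The scale values and expert ratings below are hypothetical:

```python
def trapezoid_centroid(a, b, c, d):
    # centroid (defuzzified value) of the trapezoidal fuzzy number (a, b, c, d)
    # with support [a, d] and core [b, c]; standard closed form
    return (d**2 + c**2 + c * d - a**2 - b**2 - a * b) / (3.0 * (d + c - a - b))

# Hypothetical 4-point linguistic scale as trapezoidal fuzzy numbers on [0, 10]
scale = {
    "low":       (0, 0, 2, 4),
    "moderate":  (2, 4, 5, 7),
    "high":      (5, 7, 8, 9),
    "very high": (8, 9, 10, 10),
}

# Three experts rate one risk-bearing factor (e.g. inventory)
ratings = ["moderate", "high", "high"]
# aggregate by averaging the trapezoid parameters, then defuzzify
avg = [sum(scale[r][i] for r in ratings) / len(ratings) for i in range(4)]
score = trapezoid_centroid(*avg)
```

    Mapping each expert's judgment onto the shared fuzzy scale is what absorbs the human subjectivity the abstract mentions: disagreement widens the aggregated trapezoid instead of being discarded.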

  3. Applying ecological risk principles to watershed assessment and management.

    PubMed

    Serveiss, Victor B

    2002-02-01

    Considerable progress in addressing point source (end of pipe) pollution problems has been made, but it is now recognized that further substantial environmental improvements depend on controlling nonpoint source pollution. A watershed approach is being used more frequently to address these problems because traditional regulatory approaches do not focus on nonpoint sources. The watershed approach is organized around the guiding principles of partnerships, geographic focus, and management based on sound science and data. This helps to focus efforts on the highest priority problems within hydrologically-defined geographic areas. Ecological risk assessment is a process to collect, organize, analyze, and present scientific information to improve decision making. The U.S. Environmental Protection Agency (EPA) sponsored three watershed assessments and found that integrating the watershed approach with ecological risk assessment increases the use of environmental monitoring and assessment data in decision making. This paper describes the basics of the watershed approach, the ecological risk assessment process, and how these two frameworks can be integrated. The three major principles of watershed ecological risk assessment found to be most useful for increasing the use of science in decision making are (1) using assessment endpoints and conceptual models, (2) holding regular interactions between scientists and managers, and (3) developing a focus for multiple stressor analysis. Examples are provided illustrating how these principles were implemented in these assessments.

  4. Multidimensional Risk Analysis: MRISK

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond; Brown, Douglas; O'Shea, Sarah Beth; Reith, William; Rabulan, Jennifer; Melrose, Graeme

    2015-01-01

    Multidimensional Risk (MRISK) calculates the combined multidimensional score using Mahalanobis distance. MRISK accounts for covariance between consequence dimensions, which de-conflicts the interdependencies of consequence dimensions, providing a clearer depiction of risks. Additionally, in the event the dimensions are not correlated, Mahalanobis distance reduces to Euclidean distance normalized by the variance and, therefore, represents the most flexible and optimal method to combine dimensions. MRISK is currently being used in NASA's Environmentally Responsible Aviation (ERA) project to assess risk and prioritize scarce resources.
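    A minimal sketch of the distance underlying MRISK (the dimensions and covariance values below are illustrative, not NASA's). With a diagonal covariance, the score reduces to a variance-normalized Euclidean distance, exactly as the abstract states:

```python
import math

def mahalanobis(x, mean, cov_inv):
    # d = sqrt((x - m)^T S^-1 (x - m)), written out for small dimension counts
    diff = [xi - mi for xi, mi in zip(x, mean)]
    d2 = sum(diff[i] * cov_inv[i][j] * diff[j]
             for i in range(len(diff)) for j in range(len(diff)))
    return math.sqrt(d2)

# Two consequence dimensions (say cost and schedule), uncorrelated case:
# diagonal covariance with variances 4 and 9 (illustrative values)
cov_inv = [[1 / 4.0, 0.0], [0.0, 1 / 9.0]]
d = mahalanobis([4.0, 6.0], [0.0, 0.0], cov_inv)
# reduces to Euclidean distance normalized by the standard deviations
assert abs(d - math.hypot(4.0 / 2.0, 6.0 / 3.0)) < 1e-12
```

    With off-diagonal covariance terms, the same formula discounts consequence dimensions that move together, which is the "de-conflicting" behaviour the abstract describes.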

  5. Positive Behavior Support and Applied Behavior Analysis

    PubMed Central

    Johnston, J.M; Foxx, Richard M; Jacobson, John W; Green, Gina; Mulick, James A

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We also consider the features of PBS that have facilitated its broad dissemination and how ABA might benefit from emulating certain practices of the PBS movement. PMID:22478452

  6. Wavelet analysis applied to the IRAS cirrus

    NASA Technical Reports Server (NTRS)

    Langer, William D.; Wilson, Robert W.; Anderson, Charles H.

    1994-01-01

    The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.
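    The IRAS analysis applies Laplacian pyramids to 2-D maps; a 1-D toy version conveys the decomposition (blur, decimate, keep the band-pass residual at each scale) and its exact reconstruction property. This sketch is illustrative and is not the authors' implementation:

```python
def smooth(x):
    # 1-D binomial [1, 2, 1] / 4 filter with edge replication
    n = len(x)
    return [(x[max(i - 1, 0)] + 2 * x[i] + x[min(i + 1, n - 1)]) / 4.0
            for i in range(n)]

def upsample(x, n):
    # nearest-neighbour expansion back to length n
    return [x[min(i // 2, len(x) - 1)] for i in range(n)]

def laplacian_pyramid(signal, levels):
    pyramid, current = [], [float(v) for v in signal]
    for _ in range(levels):
        low = smooth(current)[::2]                    # blur, then decimate
        recon = smooth(upsample(low, len(current)))
        pyramid.append([a - b for a, b in zip(current, recon)])  # band-pass residual
        current = low
    pyramid.append(current)                           # coarsest approximation
    return pyramid

def reconstruct(pyramid):
    current = pyramid[-1]
    for band in reversed(pyramid[:-1]):
        recon = smooth(upsample(current, len(band)))
        current = [a + b for a, b in zip(band, recon)]
    return current

sig = [0, 0, 0, 8, 8, 8, 0, 0]
pyr = laplacian_pyramid(sig, 2)
assert [len(b) for b in pyr] == [8, 4, 2]
assert all(abs(a - b) < 1e-9 for a, b in zip(reconstruct(pyr), sig))
```

    Each band isolates structure at one spatial scale, which is what lets filaments, fragments and clumps be identified and measured scale by scale in the cirrus maps.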

  7. Compendium on Risk Analysis Techniques

    DTIC Science & Technology

    The evolution of risk analysis in the materiel acquisition process is traced from the Secretary Packard memorandum to current AMC guidance. Risk analysis is defined, and many of the existing techniques are described in light of this definition and their specific role in program management and

  8. Sneak analysis applied to process systems

    NASA Astrophysics Data System (ADS)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  9. Applied spectrophotometry: analysis of a biochemical mixture.

    PubMed

    Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) as opposed to determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience and also to fill a hole that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three week laboratory experience designed so that students learn to: connect laboratory practice with theory, apply the Beer-Lambert-Bouguer Law to biochemical analyses, demonstrate the utility and limitations of example quantitative colorimetric assays, demonstrate the utility and limitations of UV analyses for biomolecules, develop strategies for analysis of a solution of unknown biomolecular composition, use digital micropipettors to make accurate and precise measurements, and apply graphing software.
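    For the mixture-analysis strategy the abstract asks students to develop, the core arithmetic is a small linear system: absorbances are additive, so measuring at two wavelengths yields two Beer-Lambert equations in the two unknown concentrations. The extinction coefficients and absorbances below are hypothetical teaching values:

```python
# Beer-Lambert for a two-component mixture: A(wl) = e1(wl)*l*c1 + e2(wl)*l*c2.
# All extinction coefficients and absorbances below are hypothetical.

def solve_2x2(a11, a12, a21, a22, b1, b2):
    # Cramer's rule for a 2x2 linear system
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

path_cm = 1.0                  # cuvette path length, cm
e1_260, e2_260 = 20.0, 5.0     # extinction coeffs (mM^-1 cm^-1) at 260 nm
e1_280, e2_280 = 10.0, 15.0    # extinction coeffs at 280 nm
A260, A280 = 0.90, 0.80        # measured absorbances of the mixture

c1, c2 = solve_2x2(e1_260 * path_cm, e2_260 * path_cm,
                   e1_280 * path_cm, e2_280 * path_cm, A260, A280)
# sanity check: recovered concentrations reproduce both measurements
assert abs(e1_260 * path_cm * c1 + e2_260 * path_cm * c2 - A260) < 1e-12
assert abs(e1_280 * path_cm * c1 + e2_280 * path_cm * c2 - A280) < 1e-12
```

    Choosing wavelengths where the two species' extinction coefficients differ strongly keeps the system well-conditioned; near-identical spectra make the determinant small and the recovered concentrations unreliable.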

  10. Object Oriented Risk Analysis Workshop

    NASA Astrophysics Data System (ADS)

    Pons, M. Güell I.; Jaboyedoff, M.

    2009-04-01

    In the framework of the RISET Project (Interfaculty Network of Support to Education and Technology) an educational tool for introducing risk analysis has been developed. This workshop carries a group of students (as a role-play game) through a step-by-step process of risk identification and quantification. The aim is to assess risk in a characteristic alpine village regarding natural hazards (rockfall, snow avalanche, flooding…) and is oriented to affected objects such as buildings, infrastructures... The workshop contains the following steps: 1.- Planning of the study and definition of stakeholders 2.- Hazard identification 3.- Risk analysis 4.- Risk assessment 5.- Proposition of mitigation measures 6.- Risk management and cost-benefit analysis. During the process, information related to past events and useful concepts are provided in order to bring up discussion and decision making. The Risk Matrix and other graphical tools allow having a visual representation of the risk level and help to prioritize counter measures. At the end of the workshop, there is the possibility to compare the results between different groups and print out a summarizing report. This approach provides a rapid and comprehensible risk evaluation. The workshop is accessible from the internet and will be used for educational purposes at bachelor and master level as well as for external persons dealing with risk analysis.

  11. The Components of Microbiological Risk Analysis.

    PubMed

    Liuzzo, Gaetano; Bentley, Stefano; Giacometti, Federica; Serraino, Andrea

    2015-02-03

    The paper describes the process of risk analysis from a food safety perspective. The steps of risk analysis defined as a process consisting of three interconnected components (risk assessment, risk management, and risk communication) are analysed. The different components of the risk assessment, risk management and risk communication are further described.

  12. The Components of Microbiological Risk Analysis

    PubMed Central

    Bentley, Stefano; Giacometti, Federica; Serraino, Andrea

    2015-01-01

    The paper describes the process of risk analysis from a food safety perspective. The steps of risk analysis defined as a process consisting of three interconnected components (risk assessment, risk management, and risk communication) are analysed. The different components of the risk assessment, risk management and risk communication are further described. PMID:27800384

  13. Risk analysis and management

    NASA Technical Reports Server (NTRS)

    Smith, H. E.

    1990-01-01

    Present software development accomplishments are indicative of the emerging interest in and increasing efforts to provide risk assessment backbone tools in the manned spacecraft engineering community. There are indications that similar efforts are underway in the chemical processes industry and are probably being planned for other high-risk ground-based environments. It appears that complex flight systems intended for extended manned planetary exploration will drive this technology.

  14. Reliability analysis applied to structural tests

    NASA Technical Reports Server (NTRS)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.

  15. Tissue Microarray Analysis Applied to Bone Diagenesis

    PubMed Central

    Mello, Rafael Barrios; Silva, Maria Regina Regis; Alves, Maria Teresa Seixas; Evison, Martin Paul; Guimarães, Marco Aurelio; Francisco, Rafaella Arrabaca; Astolphi, Rafael Dias; Iwamura, Edna Sadayo Miazato

    2017-01-01

    Taphonomic processes affecting bone post mortem are important in forensic, archaeological and palaeontological investigations. In this study, the application of tissue microarray (TMA) analysis to a sample of femoral bone specimens from 20 exhumed individuals of known period of burial and age at death is described. TMA allows multiplexing of subsamples, permitting standardized comparative analysis of adjacent sections in 3-D and of representative cross-sections of a large number of specimens. Standard hematoxylin and eosin, periodic acid-Schiff and silver methenamine, and picrosirius red staining, and CD31 and CD34 immunohistochemistry were applied to TMA sections. Osteocyte and osteocyte lacuna counts, percent bone matrix loss, and fungal spheroid element counts could be measured and collagen fibre bundles observed in all specimens. Decalcification with 7% nitric acid proceeded more rapidly than with 0.5 M EDTA and may offer better preservation of histological and cellular structure. No endothelial cells could be detected using CD31 and CD34 immunohistochemistry. Correlation between osteocytes per lacuna and age at death may reflect reported age-related responses to microdamage. Methodological limitations and caveats, and results of the TMA analysis of post mortem diagenesis in bone are discussed, and implications for DNA survival and recovery considered. PMID:28051148

  16. On differentiation in applied behavior analysis

    PubMed Central

    Fawcett, Stephen B.

    1985-01-01

    Distinct types of activity in the field of applied behavior analysis are noted and discussed. Four metaphorical types of activity are considered: prospecting, farming, building, and guiding. Prospecting consists of time-limited exploration of a variety of behaviors, populations, or settings. Farming consists of producing new behaviors in the same setting using independent variables provided by the researchers or normally available in the setting. Building consists of combining procedural elements to create new programs or systems or to rehabilitate aspects of existing programs. Guiding involves pointing out connections between the principles of human behavior and the problems, populations, settings, and procedures with which researchers are (or could be) working. Advantages of each sphere are noted, and benefits of this division of labor to the field as a whole are discussed. PMID:22478631

  17. Reachability Analysis Applied to Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Holzinger, M.; Scheeres, D.

    Several existing and emerging applications of Space Situational Awareness (SSA) relate directly to spacecraft Rendezvous, Proximity Operations, and Docking (RPOD) and Formation / Cluster Flight (FCF). When multiple Resident Space Objects (RSOs) are in vicinity of one another with appreciable periods between observations, correlating new RSO tracks to previously known objects becomes a non-trivial problem. A particularly difficult sub-problem is seen when long breaks in observations are coupled with continuous, low-thrust maneuvers. Reachability theory, directly related to optimal control theory, can compute contiguous reachability sets for known or estimated control authority and can support such RSO search and correlation efforts in both ground and on-board settings. Reachability analysis can also directly estimate the minimum control authority of a given RSO. For RPOD and FCF applications, emerging mission concepts such as fractionation drastically increase the system complexity of on-board autonomous fault management systems. Reachability theory, as applied to SSA in RPOD and FCF applications, can involve correlation of nearby RSO observations, control authority estimation, and sensor track re-acquisition. Additional uses of reachability analysis are formation reconfiguration, worst-case passive safety, and propulsion failure modes such as a "stuck" thruster. Existing reachability theory is applied to RPOD and FCF regimes. An optimal control policy is developed to maximize the reachability set and optimal control law discontinuities (switching) are examined. The Clohessy-Wiltshire linearized equations of motion are normalized to accentuate relative control authority for spacecraft propulsion systems at both Low Earth Orbit (LEO) and Geostationary Earth Orbit (GEO). Several examples with traditional and low thrust propulsion systems in LEO and GEO are explored to illustrate the effects of relative control authority on the time-varying reachability set surface. Both

  18. Cluster analysis applied to multiparameter geophysical dataset

    NASA Astrophysics Data System (ADS)

    Di Giuseppe, M. G.; Troiano, A.; Troise, C.; De Natale, G.

    2012-04-01

    Multi-parameter acquisition is common geophysical field practice nowadays. Seismic velocity and attenuation, gravity, and electromagnetic datasets are regularly acquired in a given area to obtain a complete characterization of some feature of the subsoil under investigation. Such a richness of information is often underestimated, although an integration of the analyses could provide a notable improvement in the imaging of the investigated structures, mostly because the handling of distinct parameters and their joint inversion still present several severe problems. Post-inversion statistical techniques represent a promising approach to these questions, providing a quick, simple and elegant way to obtain this advantageous but complex integration. We present an approach based on the partition of the analyzed multi-parameter dataset into a number of different classes, identified as localized regions of high correlation. These classes, or 'clusters', are structured in such a way that the observations pertaining to a certain group are more similar to each other than to the observations belonging to a different one, according to an optimal logical criterion. Regions of the subsoil sharing the same physical characteristics are so identified, without a priori or empirical relationships linking the distinct measured parameters. The retrieved imaging is highly reliable in a statistical sense, specifically because it avoids the external hypotheses that are indispensable in a full joint inversion, where such hypotheses act, as a matter of fact, as real constraints on the inversion process, and are not seldom of questionable consistency. We apply our procedure to a number of experimental datasets related to several structures, at very different scales, present in the Campanian district (southern Italy). These structures range from the shallow evidence of the active fault zone that originated the M 6.9 Irpinia earthquake to the main features characterizing the Campi Flegrei Caldera and the Mt
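    The abstract does not name its clustering algorithm; as a generic illustration of partitioning co-located multi-parameter observations into classes, here is a minimal k-means (Lloyd's algorithm) sketch on invented synthetic data for two "rock classes":

```python
import random

def kmeans(points, centers, iters=25):
    # naive Lloyd's algorithm; each point is a tuple of co-located measurements
    groups = [[] for _ in centers]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:                       # assign to nearest center
            d2 = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[d2.index(min(d2))].append(p)
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]   # recompute centroids
    return centers, groups

rnd = random.Random(7)
# two synthetic rock classes in (P-wave velocity km/s, log10 resistivity) space
pts = ([(rnd.gauss(2.5, 0.2), rnd.gauss(1.5, 0.1)) for _ in range(30)]
       + [(rnd.gauss(5.5, 0.2), rnd.gauss(2.8, 0.1)) for _ in range(30)])
# crude far-apart initialization: slowest and fastest points
centers, groups = kmeans(pts, [min(pts), max(pts)])
assert sorted(len(g) for g in groups) == [30, 30]
```

    Each cluster groups subsoil locations with similar joint geophysical signatures, without any a priori relationship linking the measured parameters, which is the core of the post-inversion integration the abstract advocates.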

  19. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  20. Natural hazard management high education: laboratory of hydrologic and hydraulic risk management and applied geomorphology

    NASA Astrophysics Data System (ADS)

    Giosa, L.; Margiotta, M. R.; Sdao, F.; Sole, A.; Albano, R.; Cappa, G.; Giammatteo, C.; Pagliuca, R.; Piccolo, G.; Statuto, D.

    2009-04-01

    The Environmental Engineering Faculty of the University of Basilicata offers a higher-level course for students in the field of natural hazards. The curriculum provides expertise in the prediction, prevention and management of earthquake risk, hydrologic-hydraulic risk, and geomorphological risk. These skills contribute to the training of specialists who, as well as having a thorough knowledge of the genesis and phenomenology of natural risks, know how to interpret, evaluate and monitor the dynamics of the environment and of the territory. In addition to basic training in mathematics and physics, the course of study provides specific lessons on seismic and structural dynamics, land, environmental and computational hydraulics, and hydrology and applied hydrogeology. In particular, the course organizes two connected examination subjects: Laboratory of hydrologic and hydraulic risk management, and Applied geomorphology. These courses involve the formulation and resolution of natural hazard problems through the study of a real natural disaster. In the last year, the work project concerned the collapse, on 19 July 1985, of two decantation basins of fluorspar extracted from mines in the Stava Valley, northern Italy. During the course, data and event information were collected, a guided tour of the sites of the disaster was organized, and finally mathematical models were applied to simulate the disaster and the results were analyzed. The student work was presented in a public workshop.

  1. Applying Complexity Theory to Risk in Child Protection Practice

    ERIC Educational Resources Information Center

    Stevens, Irene; Hassett, Peter

    2007-01-01

    This article looks at the application of complexity theory to risk assessment in child protection practice, and how it may help to give a better understanding of risk in relation to protecting vulnerable children. Within the last 20 years increasing use has been made of the term complexity within the natural sciences. In recent times, some of the…

  2. Analysis of the interaction between experimental and applied behavior analysis.

    PubMed

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J

    2014-01-01

    To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis.

  3. Image analysis applied to luminescence microscopy

    NASA Astrophysics Data System (ADS)

    Maire, Eric; Lelievre-Berna, Eddy; Fafeur, Veronique; Vandenbunder, Bernard

    1998-04-01

    We have developed a novel approach to studying luminescent light emission during the migration of living cells by low-light imaging techniques. The equipment consists of an anti-vibration table with a hole for a direct output under the frame of an inverted microscope. The image is captured directly by an ultra-low-light-level photon-counting camera equipped with an image intensifier coupled by an optical fiber to a CCD sensor. This installation is dedicated to measuring, in a dynamic manner, the effect of SF/HGF (Scatter Factor/Hepatocyte Growth Factor) both on activation of gene promoter elements and on cell motility. Epithelial cells were stably transfected with promoter elements containing Ets transcription factor-binding sites driving a luciferase reporter gene. Luminescent light emitted by individual cells was measured by image analysis. Images of luminescent spots were acquired with a high-aperture objective and exposure times of 10-30 min in photon-counting mode. The sensitivity of the camera was adjusted to a high value, which required a segmentation algorithm designed to eliminate the background noise. Hence, image segmentation and treatments by mathematical morphology were particularly indicated under these experimental conditions. In order to estimate the orientation of cells during their migration, we used a dedicated skeleton algorithm applied to the oblong spots of variable intensity emitted by the cells. Kinetic changes of the luminescent sources, and the distance and speed of migration, were recorded and then correlated with the cellular morphological changes for each spot. Our results highlight the usefulness of mathematical morphology for quantifying kinetic changes in luminescence microscopy.
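As a rough illustration of the segmentation step, the sketch below thresholds a photon-count image and keeps only connected spots above a minimum size. It is a pure-Python stand-in for the mathematical-morphology pipeline described above, and the image values are invented:

```python
# Threshold the photon-count image, then keep only 4-connected components
# with at least `min_size` pixels, discarding isolated noise hits.
def segment(img, thresh, min_size):
    h, w = len(img), len(img[0])
    seen, spots = set(), []
    for y in range(h):
        for x in range(w):
            if img[y][x] >= thresh and (y, x) not in seen:
                stack, comp = [(y, x)], []
                seen.add((y, x))
                while stack:                      # flood-fill one component
                    cy, cx = stack.pop()
                    comp.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and (ny, nx) not in seen
                                and img[ny][nx] >= thresh):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                if len(comp) >= min_size:         # reject single-pixel noise
                    spots.append(comp)
    return spots

img = [[0, 0, 0, 9, 0],
       [0, 8, 9, 9, 0],
       [0, 9, 9, 0, 0],
       [0, 0, 0, 0, 7]]  # one oblong luminescent spot plus one noise pixel
spots = segment(img, thresh=7, min_size=3)  # one spot of six pixels survives
```

The surviving component's pixel coordinates are what a skeleton algorithm would then reduce to an orientation estimate.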

  4. Multiple factors explain injury risk in adolescent elite athletes: applying a biopsychosocial perspective.

    PubMed

    von Rosen, Philip; Frohm, Anna; Kottorp, Anders; Fridén, Cecilia; Heijne, Annette

    2017-02-16

    Many risk factors for injury are presented in the literature; few of them, however, are consistent, and the majority are associated with adult rather than adolescent elite athletes. The aim was to identify risk factors for injury in adolescent elite athletes by applying a biopsychosocial approach. A total of 496 adolescent elite athletes (age range 15-19), participating in 16 different sports, were monitored repeatedly over 52 weeks using a validated questionnaire about injuries, training exposure, sleep, stress, nutrition and competence-based self-esteem. Univariate and multiple Cox regression analyses were used to calculate hazard ratios (HR) for risk factors for the first reported injury. The main finding was that increasing training volume and training intensity while at the same time decreasing sleep volume resulted in a higher risk for injury compared to no change in these variables (HR 2.25, 95% CI 1.46-3.45, p<0.01); this was the strongest risk factor identified. In addition, an increase of one point in competence-based self-esteem score increased the hazard for injury by a factor of 1.02 (95% CI 1.00-1.04, p=0.01). Based on the multiple Cox regression analysis, an athlete with the identified risk factors (Risk Index, competence-based self-esteem) and an average competence-based self-esteem score had a more than threefold increased risk for injury (HR 3.35), compared to an athlete with a low competence-based self-esteem and no change in sleep or training volume. Our findings confirm injury occurrence as a result of multiple risk factors interacting in complex ways.
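Under the proportional-hazards assumption, the reported hazard ratios combine multiplicatively, which is how the combined HR of 3.35 decomposes into the Risk Index HR of 2.25 times powers of the per-point self-esteem HR of 1.02. The sketch below backs the model coefficients out of the reported ratios; the 20-point self-esteem offset is our inference from 3.35/2.25 ≈ 1.02^20, not a value stated in the abstract:

```python
import math

# The abstract reports HR 2.25 for the combined training/sleep Risk Index
# and HR 1.02 per point of competence-based self-esteem. Under a Cox
# proportional-hazards model, HR = exp(beta), so the betas can be
# recovered from the reported ratios.
beta_risk_index = math.log(2.25)
beta_self_esteem = math.log(1.02)

def hazard_ratio(risk_index, self_esteem_points):
    """Hazard relative to a baseline athlete (risk_index = 0, 0 points)."""
    return math.exp(beta_risk_index * risk_index
                    + beta_self_esteem * self_esteem_points)

# 3.35 / 2.25 = 1.489 = 1.02**20, so the "average" self-esteem score sits
# roughly 20 points above the low-score baseline (our inference, not a
# number stated in the abstract).
combined = hazard_ratio(risk_index=1, self_esteem_points=20)  # ~3.35
```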

  5. Risk and value analysis of SETI.

    PubMed

    Billingham, J

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  6. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.
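The simple Utility Theory comparison described above can be sketched as an expected-utility calculation. All probabilities, values and costs below are hypothetical placeholders, not figures from the paper:

```python
# Simple expected-utility comparison in the spirit of the paper's argument.
def expected_utility(p_success, value_of_success, cost):
    """Expected utility of a search: payoff weighted by its probability,
    minus the cost of running the search."""
    return p_success * value_of_success - cost

# Hypothetical: the same (highly uncertain) success probability and cost,
# but detecting intelligence is assumed at least as valuable as detecting
# microbial life.
viking_biology = expected_utility(p_success=0.01, value_of_success=1000, cost=5)
seti = expected_utility(p_success=0.01, value_of_success=2000, cost=5)
# Under these assumptions SETI is at least as worthwhile as Viking biology.
```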

  7. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  8. Applying the lessons of high risk industries to health care

    PubMed Central

    Hudson, P

    2003-01-01

    High risk industries such as commercial aviation and the oil and gas industry have achieved exemplary safety performance. This paper reviews how they have managed to do so. The primary reasons are positive attitudes towards safety and the operation of effective formal safety management systems. The safety culture provides an important explanation of why such organisations perform well. An evolutionary model of safety culture is presented in which there is a range of cultures, from the pathological through the reactive to the calculative. Later, the proactive culture can evolve towards the generative organisation, an alternative description of the high reliability organisation. The current status of health care is reviewed, arguing that it has a much higher level of accidents and a reactive culture, lagging behind both of the high risk industries studied in both its attitudes and its systematic management of patient risks. PMID:14645741

  9. A Course of Instruction in Risk Analysis.

    DTIC Science & Technology

    Contents: Risk analysis course schedule; Problems and perspectives - an introduction to a course of instruction in risk analysis; Analytical techniques; Overview of the process of risk analysis; Network analysis; RISCA: USALMC's network analyzer program; Case studies in risk analysis; Armored vehicle launched bridge (AVLB); Micom-air defense missile warhead/fuze subsystem performance; Helicopter performance risk analysis; High performance fuze

  10. Experiences of Uav Surveys Applied to Environmental Risk Management

    NASA Astrophysics Data System (ADS)

    Caprioli, M.; Trizzino, R.; Mazzone, F.; Scarano, M.

    2016-06-01

    In this paper the results of some surveys carried out in an area of the Apulian territory affected by serious environmental hazard are presented. Unmanned Aerial Vehicles (UAVs) are emerging as a key engineering tool for future environmental survey tasks. UAVs are increasingly seen as an attractive low-cost alternative or supplement to aerial and terrestrial photogrammetry due to their low cost, flexibility, availability and readiness for duty. In addition, UAVs can be operated in hazardous or temporarily inaccessible locations, which makes them very suitable for the assessment and management of environmental risk conditions. In order to verify the reliability of these technologies, a UAV survey and a LIDAR survey have been carried out along about 1 km of coast in the Salento peninsula, near the towns of San Foca, Torre dell'Orso and Sant'Andrea (Lecce, Southern Italy). This area is affected by serious environmental risks due to the presence of dangerous rocky cliffs named falesie. The UAV platform was equipped with a photogrammetric measurement system that allowed us to obtain a mobile mapping of the fractured fronts of the dangerous rocky cliffs. The UAV image data have been processed using dedicated software (Agisoft PhotoScan). The point clouds obtained from both the UAV and LIDAR surveys have been processed using the CloudCompare software, with the aim of testing the UAV results against the LIDAR ones. The total error obtained was of centimeter order, which is a very satisfactory result. The environmental information has been arranged in an ArcGIS platform in order to assess the risk levels. The possibility of repeating the survey at time intervals more or less close together, depending on the measured levels of risk, and of comparing the outputs allows the trend of the dangerous phenomena to be followed. 
In conclusion, for inaccessible locations of dangerous rocky bodies the UAV survey coupled with GIS methodology proved to be a key engineering tool for the management of environmental
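The UAV-versus-LIDAR check can be sketched as a cloud-to-cloud nearest-neighbor distance, the same idea behind CloudCompare's C2C comparison. The coordinates below are invented, and the brute-force search is for illustration only:

```python
import math

def cloud_to_cloud(source, reference):
    """Mean nearest-neighbor distance from each source point to the
    reference cloud (brute force; real tools use spatial indexing)."""
    return sum(min(math.dist(p, q) for q in reference)
               for p in source) / len(source)

# Invented coordinates (metres): three UAV points and three LIDAR points
# along the same cliff profile, differing by a few centimetres in height.
uav   = [(0.0, 0.0, 10.00), (1.0, 0.0, 10.20), (2.0, 0.0, 10.10)]
lidar = [(0.0, 0.0, 10.02), (1.0, 0.0, 10.18), (2.0, 0.0, 10.13)]
error = cloud_to_cloud(uav, lidar)  # centimetre-order agreement
```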

  11. The dissection of risk: a conceptual analysis.

    PubMed

    O'Byrne, Patrick

    2008-03-01

    Recently, patient safety has gained popularity in the nursing literature. While this topic is used extensively and has been analyzed thoroughly, some of the concepts upon which it relies, such as risk, have remained undertheorized. In fact, despite its considerable use, the term 'risk' has been largely assumed to be inherently neutral, meaning that its definition and discovery are seen as objective and impartial, and that risk avoidance is natural and logical. Such an oversight in evaluation requires that the concept of risk be thoroughly analyzed as it relates to nursing practices, particularly in relation to those practices surrounding bio-political nursing care, such as public health, as well as other more trendy nursing topics, such as patient safety. Thus, this paper applies the Evolutionary Model of concept analysis to explore 'risk' and expose it as one mechanism of maintaining prescribed/proscribed social practices. Thereby, an analysis of risk results in the definitions and roles of the discipline and profession of nursing expanding from being dedicated solely to patient care to include, in addition, its functions as a governmental body that unwittingly maintains hegemonic infrastructures.

  12. Applying Mechanics to Swimming Performance Analysis.

    ERIC Educational Resources Information Center

    Barthels, Katharine

    1989-01-01

    Swimming teachers and coaches can improve their feedback to swimmers, when correcting or refining swim movements, by applying some basic biomechanical concepts relevant to swimming. This article focuses on the biomechanical considerations used in analyzing swimming performance. Techniques for spotting and correcting problems that impede…

  13. Introduction: Conversation Analysis in Applied Linguistics

    ERIC Educational Resources Information Center

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  14. Cancer Risk Assessment: Should New Science be Applied? Workgroup summary

    SciTech Connect

    Richard J. Bull; Antone L. Brooks

    2002-12-15

    OAK-B135 A symposium discussing the implications of certain phenomena observed in radiation biology for cancer risk assessment in general. In July of 2002 a workshop was convened that explored some of the intercellular phenomena that appear to condition responses to carcinogen exposure. Effects that result from communication between cells that appear to either increase the sphere of damage or to modify the sensitivity of cells to further damage were of particular interest. Much of the discussion focused on the effects of ionizing radiation that were transmitted from cells directly hit to cells not receiving direct exposure to radiation (bystander cells). In cell culture, increased rates of mutation, chromosomal aberration, apoptosis, genomic instability, and decreased clonogenic survival have all been observed in cells that have experienced no direct radiation. In addition, there is evidence that low doses of radiation or certain chemicals give rise to adaptive responses in which the treated cells develop resistance to the effects of high doses given in subsequent exposures. Data were presented at the workshop indicating that low dose exposure of animals to radiation and some chemicals frequently reduces the spontaneous rate of mutation in vitro and tumor responses in vivo. Finally, it was concluded that considerable improvement in understanding of how genetic variation may modify the impact of these phenomena is necessary before the risk implications can be fully appreciated. The workshop participants discussed the substantive challenge that these data present with respect to simple linear methodologies that are currently used in cancer risk assessment and attempted to identify broad strategies by which these phenomena may start to be used to refine cancer risk assessment methods in the future.

  15. Applying Personal Genetic Data to Injury Risk Assessment in Athletes

    PubMed Central

    Goodlin, Gabrielle T.; Roos, Andrew K.; Roos, Thomas R.; Hawkins, Claire; Beache, Sydney; Baur, Stephen; Kim, Stuart K.

    2015-01-01

    Recent studies have identified genetic markers associated with risk for certain sports-related injuries and performance-related conditions, with the hope that these markers could be used by individual athletes to personalize their training and diet regimens. We found that we could greatly expand the knowledge base of sports genetic information by using published data originally found in health and disease studies. For example, the results from large genome-wide association studies for low bone mineral density in elderly women can be re-purposed for low bone mineral density in young endurance athletes. In total, we found 124 single-nucleotide polymorphisms associated with: anterior cruciate ligament tear, Achilles tendon injury, low bone mineral density and stress fracture, osteoarthritis, vitamin/mineral deficiencies, and sickle cell trait. Of these single nucleotide polymorphisms, 91% have not previously been used in sports genetics. We conducted a pilot program on fourteen triathletes using this expanded knowledge base of genetic variants associated with sports injury. These athletes were genotyped and educated about how their individual genetic make-up affected their personal risk profile during an hour-long personal consultation. Overall, participants were favorable of the program, found it informative, and most acted upon their genetic results. This pilot program shows that recent genetic research provides valuable information to help reduce sports injuries and to optimize nutrition. There are many genetic studies for health and disease that can be mined to provide useful information to athletes about their individual risk for relevant injuries. PMID:25919592

  16. Reachability Analysis Applied to Space Situational Awareness

    DTIC Science & Technology

    2009-09-01

    …applying them to the nonlinear relative orbit equations of motion, which are appropriate both for general SSA and spacecraft proximity operations… Nonlinear System (RNS). This assumption does not restrict the scope of these results in the context of SSA, as in orbital scenarios control and

  17. Intelligent adversary risk analysis: a bioterrorism risk management model.

    PubMed

    Parnell, Gregory S; Smith, Christopher M; Moxley, Frederick I

    2010-01-01

    The tragic events of 9/11 and the concerns about the potential for a terrorist or hostile state attack with weapons of mass destruction have led to an increased emphasis on risk analysis for homeland security. Uncertain hazards (natural and engineering) have been successfully analyzed using probabilistic risk analysis (PRA). Unlike uncertain hazards, terrorists and hostile states are intelligent adversaries who can observe our vulnerabilities and dynamically adapt their plans and actions to achieve their objectives. This article compares uncertain hazard risk analysis with intelligent adversary risk analysis, describes the intelligent adversary risk analysis challenges, and presents a probabilistic defender-attacker-defender model to evaluate the baseline risk and the potential risk reduction provided by defender investments. The model includes defender decisions prior to an attack; attacker decisions during the attack; defender actions after an attack; and the uncertainties of attack implementation, detection, and consequences. The risk management model is demonstrated with an illustrative bioterrorism problem with notional data.
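The defender-attacker-defender structure can be sketched as a nested min-max-min over discrete options. The defense, attack and response names and all consequence values below are invented, not data from the article:

```python
# Nested defender-attacker-defender sketch: the defender invests before an
# attack, the attacker observes and picks the worst-case agent, and the
# defender then responds after the attack. All values are notional.
consequence = {  # (defense, attack, response) -> expected consequence
    ("none",   "agent_a", "stockpile"): 80, ("none",   "agent_a", "nothing"): 100,
    ("none",   "agent_b", "stockpile"): 60, ("none",   "agent_b", "nothing"):  90,
    ("detect", "agent_a", "stockpile"): 30, ("detect", "agent_a", "nothing"):  70,
    ("detect", "agent_b", "stockpile"): 25, ("detect", "agent_b", "nothing"):  50,
}
defenses  = ["none", "detect"]
attacks   = ["agent_a", "agent_b"]
responses = ["stockpile", "nothing"]

def baseline_risk(defense):
    # The attacker maximizes the consequence, anticipating the defender's
    # best after-attack response (which minimizes it).
    return max(min(consequence[(defense, a, r)] for r in responses)
               for a in attacks)

best_defense = min(defenses, key=baseline_risk)
# "detect" cuts the worst-case consequence from 80 to 30: the model
# quantifies the risk reduction bought by the defender's investment.
```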

  18. Applied surface analysis in magnetic storage technology

    NASA Astrophysics Data System (ADS)

    Windeln, Johannes; Bram, Christian; Eckes, Heinz-Ludwig; Hammel, Dirk; Huth, Johanna; Marien, Jan; Röhl, Holger; Schug, Christoph; Wahl, Michael; Wienss, Andreas

    2001-07-01

    This paper gives a synopsis of today's challenges and requirements for a surface analysis and materials science laboratory with a special focus on magnetic recording technology. The critical magnetic recording components, i.e. the protective carbon overcoat (COC), the disk layer structure, the read/write head including the giant-magnetoresistive (GMR) sensor, are described and options for their characterization with specific surface and structure analysis techniques are given. For COC investigations, applications of Raman spectroscopy to the structural analysis and determination of thickness, hydrogen and nitrogen content are discussed. Hardness measurements by atomic force microscopy (AFM) scratching techniques are presented. Surface adsorption phenomena on disk substrates or finished disks are characterized by contact angle analysis or so-called piezo-electric mass adsorption systems (PEMAS), also known as quartz crystal microbalance (QCM). A quickly growing field of applications is listed for various X-ray analysis techniques, such as disk magnetic layer texture analysis for X-ray diffraction, compositional characterization via X-ray fluorescence, compositional analysis with high lateral resolution via electron microprobe analysis. X-ray reflectometry (XRR) has become a standard method for the absolute measurement of individual layer thicknesses contained in multi-layer stacks and thus, is the successor of ellipsometry for this application. Due to the ongoing reduction of critical feature sizes, the analytical challenges in terms of lateral resolution, sensitivity limits and dedicated nano-preparation have been consistently growing and can only be met by state-of-the-art Auger electron spectrometers (AES), transmission electron microscopy (TEM) analysis, time-of-flight-secondary ion mass spectroscopy (ToF-SIMS) characterization, focused ion beam (FIB) sectioning and TEM lamella preparation via FIB. 
The depth profiling of GMR sensor full stacks was significantly

  19. Putting problem formulation at the forefront of GMO risk analysis.

    PubMed

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups.

  20. Meta-analysis in applied ecology.

    PubMed

    Stewart, Gavin

    2010-02-23

    This overview examines research synthesis in applied ecology and conservation. Vote counting and pooling unweighted averages are widespread despite the superiority of syntheses based on weighted combination of effects. Such analyses allow exploration of methodological uncertainty in addition to consistency of effects across species, space and time, but exploring heterogeneity remains controversial. Meta-analyses are required to generalize in ecology, and to inform evidence-based decision-making, but the more sophisticated statistical techniques and registers of research used in other disciplines must be employed in ecology to fully realize their benefits.
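The weighted combination of effects that the overview recommends is, in its simplest fixed-effect form, an inverse-variance weighted mean. The effect sizes and variances below are made-up illustrative numbers:

```python
# Fixed-effect, inverse-variance weighted pooling: each study's effect is
# weighted by the reciprocal of its sampling variance, so precise studies
# count for more.
def pooled_effect(effects, variances):
    weights = [1.0 / v for v in variances]
    estimate = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_variance = 1.0 / sum(weights)  # smaller than any single study's
    return estimate, pooled_variance

effects   = [0.30, 0.10, 0.20]  # hypothetical per-study effect sizes
variances = [0.01, 0.04, 0.09]  # hypothetical per-study variances
estimate, pooled_variance = pooled_effect(effects, variances)
# The pooled estimate leans toward the precise first study, unlike the
# unweighted average (0.20) or a vote count.
```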

  1. Evaluating the Risks of Clinical Research: Direct Comparative Analysis

    PubMed Central

    Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David

    2014-01-01

    Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed a conceptual and normative analysis, and use of an illustrative example. Results: Different risks are composed of the same basic elements: Type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about

  2. Nano risk analysis: advancing the science for nanomaterials risk management.

    PubMed

    Shatkin, Jo Anne; Abbott, Linda Carolyn; Bradley, Ann E; Canady, Richard Alan; Guidotti, Tee; Kulinowski, Kristen M; Löfstedt, Ragnar E; Louis, Garrick; MacDonell, Margaret; Maynard, Andrew D; Paoli, Greg; Sheremeta, Lorraine; Walker, Nigel; White, Ronald; Williams, Richard

    2010-11-01

    Scientists, activists, industry, and governments have raised concerns about health and environmental risks of nanoscale materials. The Society for Risk Analysis convened experts in September 2008 in Washington, DC to deliberate on issues relating to the unique attributes of nanoscale materials that raise novel concerns about health risks. This article reports on the overall themes and findings of the workshop, uncovering the underlying issues for each of these topics that become recurring themes. The attributes of nanoscale particles and other nanomaterials that present novel issues for risk analysis are evaluated in a risk analysis framework, identifying challenges and opportunities for risk analysts and others seeking to assess and manage the risks from emerging nanoscale materials and nanotechnologies. Workshop deliberations and recommendations for advancing the risk analysis and management of nanotechnologies are presented.

  3. Applied Spectrophotometry: Analysis of a Biochemical Mixture

    ERIC Educational Resources Information Center

    Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the…

  4. Thermal analysis applied to irradiated propolis

    NASA Astrophysics Data System (ADS)

    Matsuda, Andrea Harumi; Machado, Luci Brocardo; del Mastro, Nélida Lucia

    2002-03-01

    Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behavior. Thermal analysis gives information about changes on heating that is of great importance for technological applications. Ground propolis samples were 60Co gamma-irradiated with 0 and 10 kGy. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600°C. Similarly, differential scanning calorimetry showed coinciding melting points for irradiated and unirradiated samples. The results suggest that the irradiation process does not interfere with the thermal properties of propolis when irradiated up to 10 kGy.

  5. Biomechanics and motion analysis applied to sports.

    PubMed

    Zheng, N; Barrentine, S W

    2000-05-01

    The development of motion analysis and the application of biomechanical analysis techniques to sports has paralleled the exponential growth of computational and videographic technology. Technological developments have provided for advances in the investigation of the human body and the action of the human body during sports believed to be unobtainable a few years ago. Technological advancements have brought biomechanical applications into a wide range of fields from orthopedics to entertainment. An area that has made tremendous gains using biomechanics is sports science. Coaches, therapists, and physicians are using biomechanics to improve performance, rehabilitation, and the prevention of sports related injuries. Functional analyses of athletic movements that were impossible a few years ago are available and used today. With new advancements, the possibilities for investigating the way a human interacts and reacts to environmental conditions are ever expanding.

  6. Applying thiouracil tagging to mouse transcriptome analysis.

    PubMed

    Gay, Leslie; Karfilis, Kate V; Miller, Michael R; Doe, Chris Q; Stankunas, Kryn

    2014-02-01

    Transcriptional profiling is a powerful approach for studying mouse development, physiology and disease models. Here we describe a protocol for mouse thiouracil tagging (TU tagging), a transcriptome analysis technology that includes in vivo covalent labeling, purification and analysis of cell type-specific RNA. TU tagging enables the isolation of RNA from a given cell population of a complex tissue, avoiding transcriptional changes induced by cell isolation trauma, as well as the identification of actively transcribed RNAs and not preexisting transcripts. Therefore, in contrast to other cell-specific transcriptional profiling methods based on the purification of tagged ribosomes or nuclei, TU tagging provides a direct examination of transcriptional regulation. We describe how to (i) deliver 4-thiouracil to transgenic mice to thio-label cell lineage-specific transcripts, (ii) purify TU-tagged RNA and prepare libraries for Illumina sequencing and (iii) follow a straightforward bioinformatics workflow to identify cell type-enriched or differentially expressed genes. Tissue containing TU-tagged RNA can be obtained in 1 d, RNA-seq libraries can be generated within 2 d and, after sequencing, an initial bioinformatics analysis can be completed in 1 additional day.

  7. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables of the command and control process and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  8. Subpixels analysis model applied to floodplain monitoring

    NASA Astrophysics Data System (ADS)

    Giraldo Osorio, J. D.; García Galiano, S. G.

    2009-04-01

    Traditional techniques for gauging hydrological events often fail for extreme events; the spatial detection of floods is a particular case. In this work, remote sensing techniques and Geographic Information Systems (GIS) have been merged to develop a key tool for flood monitoring. The low density of gauge station networks in developing countries makes remote sensing the most suitable and economical way to delimit the flooded area and compute the cost of damages. Common classification techniques for satellite images use "hard methods," in the sense that a pixel is assigned to a unique land cover class. At coarse resolution, pixels are inevitably mixed, so "soft methods" can be used to assign several land cover classes according to the surface fraction covered by each one. The main objective of this work is the dynamic monitoring of floods over large areas, based on satellite images, with moderate spatial resolution but high temporal resolution, and a Digital Elevation Model (DEM). Classified maps with finer spatial resolution can be built through the subpixel analysis methodology developed here. The procedure rests on both the Linear Mixture Model (LMM) and the Spatial Coherence Analysis (SCA) hypothesis. The LMM builds land cover fraction maps through an optimization procedure using Lagrange multipliers, while the SCA locates the most likely place for the land cover fractions within the coarse pixel using linear programming. A subsequent procedure improves identification of the flooded area using the drainage direction and flow accumulation raster maps derived from the DEM of the study zone. The subpixel analysis technique was validated using historical flood data obtained from satellite images. The procedure improves the spatial resolution of maps classified from coarse-resolution satellite images, whereas "hard methods" keep the spatial resolution of the input coarse image.
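    The sum-to-one constrained least-squares step of the LMM described above can be sketched as follows. This is a minimal sketch using a closed-form Lagrange-multiplier solution without the non-negativity constraint; the endmember spectra and pixel values are illustrative placeholders, not data from the study:

```python
import numpy as np

# Hypothetical endmember spectra: 3 bands x 2 classes (water, vegetation).
# Values are illustrative placeholders, not data from the study.
E = np.array([[0.05, 0.30],
              [0.10, 0.50],
              [0.02, 0.40]])

def unmix_sum_to_one(E, pixel):
    """Sum-to-one constrained least squares via a Lagrange multiplier."""
    A_inv = np.linalg.inv(E.T @ E)
    f_u = A_inv @ E.T @ pixel            # unconstrained least-squares fractions
    ones = np.ones(E.shape[1])
    # Closed-form correction that enforces sum(f) == 1
    return f_u + A_inv @ ones * (1.0 - ones @ f_u) / (ones @ A_inv @ ones)

# A coarse pixel that is an exact 30% water / 70% vegetation mixture
pixel = E @ np.array([0.3, 0.7])
fractions = unmix_sum_to_one(E, pixel)
```

With an exactly mixed pixel the recovered fractions match the true 30/70 split; real coarse pixels would also need a non-negativity constraint, and the SCA step would then place the fractions spatially within the pixel.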

  9. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
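    A knowledge-base approach of the kind described can be illustrated with a minimal rule set that suppresses consequence alarms once their root-cause alarm is recognized; the alarm names and rules below are hypothetical, not drawn from any actual plant or space station system:

```python
# Hypothetical alarm-suppression rules: when a root-cause alarm is active,
# the alarms it is known to cause are filtered from the operator display.
RULES = {
    "COOLANT_PUMP_TRIP": {"COOLANT_FLOW_LOW", "CORE_TEMP_HIGH"},
}

def filter_alarms(active):
    """Return the active alarms with known consequence alarms removed."""
    suppressed = set()
    for root, consequences in RULES.items():
        if root in active:
            suppressed |= consequences & set(active)
    return [a for a in active if a not in suppressed]

alarms = ["COOLANT_PUMP_TRIP", "COOLANT_FLOW_LOW", "CORE_TEMP_HIGH", "DOOR_OPEN"]
filtered = filter_alarms(alarms)
```

The operator sees only the root cause and the unrelated alarm, which is the kind of reduced, state-oriented display the abstract describes.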

  10. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that being a visco-elastic substrate, skin effectively reduces the resolution of measurement of dental detail. Conclusions indicate caution in individualization statements.

  11. Risk factors analysis of consecutive exotropia

    PubMed Central

    Gong, Qianwen; Wei, Hong; Zhou, Xu; Li, Ziyuan; Liu, Longqian

    2016-01-01

    To evaluate clinical factors associated with the onset of consecutive exotropia (XT) following esotropia surgery, we used a retrospective nested case-control design to review the medical records of 193 patients who had undergone initial esotropia surgery between 2008 and 2015 and had follow-up longer than 6 months. Probable risk factors were compared between group 1 (consecutive XT) and group 2 (non-consecutive exotropia). The Pearson chi-square test and the Mann–Whitney U test were used for univariate analysis, and a conditional logistic regression model was applied to explore potential risk factors for consecutive XT. Consecutive exotropia occurred in 23 (11.9%) of the 193 patients. Patients who had undergone large bilateral medial rectus recession (BMR) had a higher risk of developing consecutive XT (P = 0.017). Oblique muscle dysfunction (P = 0.001) and adduction limitation (P < 0.001) were associated with a higher risk of consecutive XT, which was confirmed in the conditional logistic regression analysis. In addition, a large amount of BMR (6 mm or more) was associated with a higher incidence of adduction limitation (P = 0.045). Surgical methods and preoperative factors did not appear to influence the risk of developing consecutive XT (P > 0.05). The amount of surgery could be optimized to reduce the risk of consecutive XT. The presence of oblique overaction and postoperative adduction limitation may be associated with a higher risk of consecutive XT, which may require closer supervision and/or earlier surgical intervention. PMID:27977611
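    The univariate step of such an analysis can be sketched with a hand-computed Pearson chi-square statistic and odds ratio on a 2x2 table; the counts below are synthetic illustrations, not the study's data:

```python
import numpy as np

# Illustrative 2x2 table (synthetic counts, not the paper's data):
# rows: large BMR (>= 6 mm) yes / no; cols: consecutive XT yes / no
table = np.array([[15,  45],
                  [ 8, 125]])

def chi2_2x2(t):
    """Pearson chi-square for a 2x2 table: n(ad - bc)^2 / row and column totals."""
    a, b = t[0]
    c, d = t[1]
    n = t.sum()
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def odds_ratio(t):
    """Cross-product ratio ad / bc."""
    a, b = t[0]
    c, d = t[1]
    return (a * d) / (b * c)

chi2 = chi2_2x2(table)    # compare against 3.84, the 0.05 critical value (1 df)
or_ = odds_ratio(table)
```

With these synthetic counts the statistic exceeds the 0.05 critical value, the pattern the paper reports for large BMR; the conditional logistic regression step would then adjust such associations for the matched design.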

  12. Multivariate analysis applied to tomato hybrid production.

    PubMed

    Balasch, S; Nuez, F; Palomares, G; Cuartero, J

    1984-11-01

    Twenty characters were measured on 60 tomato varieties cultivated in the open air and in a polyethylene plastic house. Data were analyzed by means of principal components, factorial discriminant methods, Mahalanobis D(2) distances, and principal coordinate techniques. The factorial discriminant and Mahalanobis D(2) distance methods, both of which require collecting data plant by plant, led to conclusions similar to those of the principal components method, which only requires taking data by plots. The characters that make up the principal components are the same in both environments studied, although the relative importance of each one varies within the principal components. By combining the information supplied by multivariate analysis with the inheritance mode of the characters, crossings among cultivars can be designed that will produce heterotic hybrids showing characters within previously established limits.
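    The two families of methods compared in the study can be sketched as follows; the 60 x 20 data matrix is random stand-in data, not the tomato measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for the 60 varieties x 20 characters matrix
X = rng.normal(size=(60, 20))

# Principal components (works with plot-level data):
# eigendecomposition of the covariance matrix of centered data
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
scores = Xc @ eigvecs[:, order[:2]]      # projection onto the first two PCs

# Mahalanobis D^2 between two group means (requires plant-by-plant data
# to estimate the within-group covariance; here the overall cov is reused)
def mahalanobis_d2(mu1, mu2, pooled_cov):
    diff = mu1 - mu2
    return diff @ np.linalg.inv(pooled_cov) @ diff

d2 = mahalanobis_d2(X[:30].mean(axis=0), X[30:].mean(axis=0), cov)
```

The point of the comparison in the paper is that the cheap plot-level principal components led to conclusions similar to the plant-level distance methods.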

  13. Toward applied behavior analysis of life aloft.

    PubMed

    Brady, J V

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes, or even the currently fashionable cognitions. Without an experimentally derived database founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  14. Toward applied behavior analysis of life aloft

    NASA Technical Reports Server (NTRS)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes, or even the currently fashionable cognitions. Without an experimentally derived database founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  15. Tsunamis: Global Exposure and Local Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.

    2014-12-01

    The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and of potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic (PTHA) methods are used, and the resulting hazard levels from the two methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, as expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed population lives in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damage and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented; the GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is discussed.

  16. Rotary spectra analysis applied to static stabilometry.

    PubMed

    Chiaramello, E; Knaflitz, M; Agostini, V

    2011-01-01

    Static stabilometry is a technique aimed at quantifying postural sway during quiet standing in the upright position. Many different models and techniques have been proposed to analyze the trajectories of the Centre of Pressure (CoP). Most of the parameters calculated according to these different approaches are affected by relevant intra- and inter-subject variability or do not have a clear physiological interpretation. In this study we hypothesize that CoP trajectories have rotational characteristics, and therefore we decompose them into clockwise and counter-clockwise components using rotary spectral analysis. Rotary spectra obtained by studying a population of healthy subjects are described through the group average of spectral parameters, i.e., 95% spectral bandwidth, mean frequency, median frequency, and skewness. Results are reported for the clockwise and counter-clockwise components and refer to the upright position maintained with eyes open or closed. This study demonstrates that the approach is feasible and that some of the spectral parameters are statistically different between the open- and closed-eyes conditions. More research is needed to demonstrate the clinical applicability of this approach, but the results obtained so far are promising.
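    The rotary decomposition itself is a short computation: treat the CoP trajectory as a complex signal x + iy and split its Fourier spectrum into positive (counter-clockwise) and negative (clockwise) frequencies. The trajectory below is a synthetic circular sway, not stabilometric data, and the sign convention shown is one common choice:

```python
import numpy as np

fs, n = 100.0, 1000                      # sampling rate (Hz) and sample count
t = np.arange(n) / fs
# Synthetic CoP trajectory: a 2 Hz counter-clockwise circular sway
x = np.cos(2 * np.pi * 2 * t)
y = np.sin(2 * np.pi * 2 * t)

z = x + 1j * y                           # complex trajectory
Z = np.fft.fft(z) / n                    # normalized spectrum
freqs = np.fft.fftfreq(n, d=1 / fs)

# Rotary decomposition: positive frequencies -> counter-clockwise component,
# negative frequencies -> clockwise component (for this sign convention)
ccw_power = np.sum(np.abs(Z[freqs > 0]) ** 2)
cw_power = np.sum(np.abs(Z[freqs < 0]) ** 2)
```

For the purely counter-clockwise test circle, essentially all power lands in the counter-clockwise component; the spectral parameters the study reports (bandwidth, mean and median frequency, skewness) are then computed from each component's spectrum.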

  17. Digital photoelastic analysis applied to implant dentistry

    NASA Astrophysics Data System (ADS)

    Ramesh, K.; Hariprasad, M. P.; Bhuvanewari, S.

    2016-12-01

    The development of improved implant system designs in dentistry has necessitated the study of stress fields in the implant regions of the mandible/maxilla for a better understanding of the biomechanics involved. Photoelasticity has been used for various studies related to dental implants because it provides whole-field visualization of the maximum shear stress in the form of isochromatic contours. The potential of digital photoelasticity has not been fully exploited in the field of implant dentistry. In this paper, the fringe field in the vicinity of connected implants (the All-On-Four® concept) is analyzed using recent advances in digital photoelasticity. Initially, a novel 3-D photoelastic model-making procedure that closely mimics all the anatomical features of the human mandible is proposed. By choosing an appropriate orientation of the model with respect to the light path, the essential regions of interest could be analyzed while keeping the model under live loading conditions. The need for a sophisticated software module to carefully identify the model domain is brought out. For data extraction, the five-step method is used, and isochromatics are evaluated by twelve-fringe photoelasticity. In addition to the isochromatic fringe field, whole-field isoclinic data are also obtained for the first time in implant dentistry, which could provide important information for improving the structural stability of implant systems. The analysis is carried out for implants in the molar as well as the incisor region, and the interaction effects of a loaded molar implant on the incisor area are also studied.

  18. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    NASA Technical Reports Server (NTRS)

    Motley, Albert E., III

    2000-01-01

    One of the key elements of successful project management is the establishment of the "right set of requirements": requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive, and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  19. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
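    The kind of computation such a program performs can be sketched as a Cornell-type hazard integral: the annual rate at which a ground motion level is exceeded equals the event rate times the exceedance probability averaged over the magnitude distribution. All coefficients below are toy values for a single point source, not the program's published attenuation functions:

```python
import math

nu = 0.2                  # annual rate of events with M >= m_min
beta = 2.0                # magnitude distribution decay (related to the b-value)
m_min, m_max = 4.0, 8.0   # truncated Gutenberg-Richter magnitude range
R = 30.0                  # source-to-site distance, km
# Toy attenuation: ln(a) = c0 + c1*m - c2*ln(R), lognormal scatter sigma
c0, c1, c2, sigma = -1.0, 1.0, 1.2, 0.5

def p_exceed_given_m(a, m):
    """P[A > a | magnitude m]: lognormal scatter about the attenuation median."""
    mean_ln = c0 + c1 * m - c2 * math.log(R)
    zscore = (math.log(a) - mean_ln) / sigma
    return 0.5 * math.erfc(zscore / math.sqrt(2))

def annual_exceedance_rate(a, steps=400):
    """Numerically integrate over the truncated exponential magnitude density."""
    dm = (m_max - m_min) / steps
    norm = 1.0 - math.exp(-beta * (m_max - m_min))
    total = 0.0
    for i in range(steps):
        m = m_min + (i + 0.5) * dm
        f_m = beta * math.exp(-beta * (m - m_min)) / norm
        total += p_exceed_given_m(a, m) * f_m * dm
    return nu * total

rate_low = annual_exceedance_rate(0.05)   # weaker shaking, exceeded more often
rate_high = annual_exceedance_rate(0.5)   # stronger shaking, exceeded rarely
```

Evaluating the rate over a grid of ground motion levels yields the hazard curve; the small storage footprint the abstract mentions comes from the fact that only such closed-form densities and attenuation functions need to be held in memory.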

  20. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a high-risk item of equipment for the power plant. Corrosion damage can cause the HRSG, and hence the power plant, to stop operating; furthermore, it can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. Using this methodology, the risk caused by unexpected failure, as a function of the probability and consequence of failure, can be estimated. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each item of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The semi-quantitative risk assessment following standard API 581 places the existing equipment at medium risk, and in fact there is no critical problem in the equipment components. The damage mechanism prominent throughout the equipment is thinning. The risk approach was evaluated with the aim of reducing risk by optimizing the risk assessment activities.
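    A semi-quantitative ranking in the spirit of API 581 combines a probability category (1-5) with a consequence category (A-E) into labels such as "4C". The category breakpoints below are invented for illustration and are not the standard's actual tables:

```python
# Hedged sketch of a semi-quantitative probability/consequence ranking.
# The breakpoint values are assumptions, not API 581's published tables.

def probability_category(failures_per_year):
    """Map an estimated failure frequency to category 1 (lowest) .. 5."""
    bounds = [1e-5, 1e-4, 1e-3, 1e-2]        # assumed frequency breakpoints
    return 1 + sum(failures_per_year > b for b in bounds)

def consequence_category(affected_area_m2):
    """Map an estimated consequence area to category A (lowest) .. E."""
    bounds = [10, 100, 1000, 10000]          # assumed area breakpoints, m^2
    return "ABCDE"[sum(affected_area_m2 > b for b in bounds)]

def risk_rank(failures_per_year, affected_area_m2):
    """Combine the two categories into a matrix cell such as '4C'."""
    return f"{probability_category(failures_per_year)}{consequence_category(affected_area_m2)}"

rank = risk_rank(5e-3, 500)
```

Under these assumed breakpoints the example lands in cell "4C", the medium-high cell the abstract reports for the HP superheater and evaporator; in practice the categories come from the standard's damage-mechanism and consequence models, not from two scalar inputs.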

  1. Treatment Integrity in Applied Behavior Analysis with Children.

    ERIC Educational Resources Information Center

    Gresham, Frank M.; And Others

    1993-01-01

    A review of 158 applied behavior analysis studies with children as subjects, published in the "Journal of Applied Behavior Analysis" between 1980 and 1990, found that (1) 16% measured the accuracy of independent variable implementation, and (2) two-thirds did not operationally define components of the independent variable. Specific recommendations…

  2. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational...

  3. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational...

  4. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational...

  5. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  6. Robust regression applied to fractal/multifractal analysis.

    NASA Astrophysics Data System (ADS)

    Portilla, F.; Valencia, J. L.; Tarquis, A. M.; Saa-Requejo, A.

    2012-04-01

    Fractal and multifractal are concepts that have grown increasingly popular in recent years in soil analysis, along with the development of fractal models. One of the common steps is to calculate the slope of a linear fit, usually by the least squares method. This should not be a problem; however, in many situations with experimental data the researcher has to select the range of scales at which to work, neglecting the remaining points, to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods. With this method we do not have to assume that an outlier point is simply an extreme observation drawn from the tail of a normal distribution that does not compromise the validity of the regression results. In this work we have evaluated the capacity of robust regression to select the points used from the experimental data, trying to avoid subjective choices. Based on this analysis we have developed a new working methodology that comprises two basic steps: • evaluation of the improvement of the linear fit when consecutive points are eliminated, based on the R p-value, thereby considering the implications of reducing the number of points; • evaluation of the significance of the slope difference between the fit including the two extreme points and the fit using only the remaining points. We compare the results of applying this methodology and the commonly used least squares one. The data selected for these comparisons come from experimental soil roughness transects and from simulations based on the midpoint displacement method with added trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology. Acknowledgements Funding provided by CEIGRAM (Research Centre for the Management of Agricultural and Environmental Risks) and by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no
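    The contrast between least squares and a robust slope estimate can be illustrated with a Theil-Sen estimator (the median of pairwise slopes), one common robust alternative; the log-log data below are synthetic, with one corrupted point standing in for a bad scale range:

```python
import numpy as np
from itertools import combinations

# Synthetic log-log scaling data with true slope 2 (a stand-in for a
# fractal-dimension fit); the last point is corrupted, mimicking a scale
# range where linearity breaks down.
log_scale = np.arange(1.0, 9.0)
log_count = 2.0 * log_scale + 0.5
log_count[-1] -= 3.0                       # the outlier point

def ols_slope(x, y):
    """Ordinary least-squares slope."""
    return np.polyfit(x, y, 1)[0]

def theil_sen_slope(x, y):
    """Median of all pairwise slopes: resistant to a few outliers."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2)]
    return float(np.median(slopes))

slope_ols = ols_slope(log_scale, log_count)        # dragged down by the outlier
slope_robust = theil_sen_slope(log_scale, log_count)
```

Here the least-squares slope drops to 1.75 while the median-of-slopes estimate recovers the true slope of 2; the paper's methodology instead selects which points to keep, but the motivation, removing the influence of points outside the linear range, is the same.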

  7. Applying risk assessment models in non-surgical patients: effective risk stratification.

    PubMed

    Eldor, A

    1999-08-01

    Pulmonary embolism and deep vein thrombosis are serious complications of non-surgical patients, but scarcity of data documenting prophylaxis means antithrombotic therapy is rarely used. Prediction of risk is complicated by the variation in the medical conditions associated with venous thromboembolism (VTE), and lack of data defining risk in different groups. Accurate risk assessment is further confounded by inherited or acquired factors for VTE, additional risk due to medical interventions, and interactions between risk factors. Acquired and inherited risk factors may underlie thromboembolic complications in a range of conditions, including pregnancy, ischaemic stroke, myocardial infarction and cancer. Risk stratification may be feasible in non-surgical patients by considering individual risk factors and their cumulative effects. Current risk assessment models require expansion and modification to reflect emerging evidence in the non-surgical field. A large on-going study of prophylaxis with low-molecular-weight heparin in non-surgical patients will clarify our understanding of the components of risk, and assist in developing therapy recommendations.

  8. General Risk Analysis Methodological Implications to Explosives Risk Management Systems,

    DTIC Science & Technology

    An investigation sponsored by the National Science Foundation has produced as one of its results a survey and evaluation of risk analysis methodologies... This paper presents some implications of the survey to risk analysis and decision making for explosives hazards such as may ultimately be

  9. RAMS (Risk Analysis - Modular System) methodology

    SciTech Connect

    Stenner, R.D.; Strenge, D.L.; Buck, J.W.

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad-scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternatives, and "what if" questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  10. Negative reinforcement in applied behavior analysis: an emerging technology.

    PubMed

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas suggests the emergence of an applied technology on negative reinforcement.

  11. Negative Reinforcement in Applied Behavior Analysis: An Emerging Technology.

    ERIC Educational Resources Information Center

    Iwata, Brian A.

    1987-01-01

    The article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. Current research suggests the emergence of an applied technology on negative reinforcement.…

  12. Animal Research in the "Journal of Applied Behavior Analysis"

    ERIC Educational Resources Information Center

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  13. The possibilities of applying a risk-oriented approach to the NPP reliability and safety enhancement problem

    NASA Astrophysics Data System (ADS)

    Komarov, Yu. A.

    2014-10-01

    An analysis and some generalizations of approaches to risk assessments are presented. Interconnection between different interpretations of the "risk" notion is shown, and the possibility of applying the fuzzy set theory to risk assessments is demonstrated. A generalized formulation of the risk assessment notion is proposed in applying risk-oriented approaches to the problem of enhancing reliability and safety in nuclear power engineering. The solution of problems using the developed risk-oriented approaches aimed at achieving more reliable and safe operation of NPPs is described. The results of studies aimed at determining the need (advisability) to modernize/replace NPP elements and systems are presented together with the results obtained from elaborating the methodical principles of introducing the repair concept based on the equipment technical state. The possibility of reducing the scope of tests and altering the NPP systems maintenance strategy is substantiated using the risk-oriented approach. A probabilistic model for estimating the validity of boric acid concentration measurements is developed.

  14. B. F. Skinner's contributions to applied behavior analysis

    PubMed Central

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew from his science for application, his descriptions of possible applications, and his own applications to nonhuman and human behavior. Second, we found that he explicitly or implicitly addressed all seven dimensions of applied behavior analysis. These contributions and the dimensions notwithstanding, he neither incorporated the field's scientific (e.g., analytic) and social dimensions (e.g., applied) into any program of published research such that he was its originator, nor did he systematically integrate, advance, and promote the dimensions so as to have been its founder. As the founder of behavior analysis, however, he was the father of applied behavior analysis. PMID:22478444

  15. Starlink corn: a risk analysis.

    PubMed Central

    Bucchini, Luca; Goldman, Lynn R

    2002-01-01

    Modern biotechnology has dramatically increased our ability to alter the agronomic traits of plants. Among the novel traits that biotechnology has made available, an important group includes Bacillus thuringiensis-derived insect resistance. This technology has been applied to potatoes, cotton, and corn. Benefits of Bt crops, and biotechnology generally, can be realized only if risks are assessed and managed properly. The case of Starlink corn, a plant modified with a gene that encodes the Bt protein Cry9c, was a severe test of U.S. regulatory agencies. The U.S. Environmental Protection Agency had restricted its use to animal feed due to concern about the potential for allergenicity. However, Starlink corn was later found throughout the human food supply, resulting in food recalls by the Food and Drug Administration and significant disruption of the food supply. Here we examine the regulatory history of Starlink, the assessment framework employed by the U.S. government, assumptions and information gaps, and the key elements of government efforts to manage the product. We explore the impacts on regulations, science, and society and conclude that only significant advances in our understanding of food allergies and improvements in monitoring and enforcement will avoid similar events in the future. Specifically, we need to develop a stronger fundamental basis for predicting allergic sensitization and reactions if novel proteins are to be introduced in this fashion. Mechanisms are needed to assure that worker and community aeroallergen risks are considered. Requirements are needed for the development of valid assays so that enforcement and post market surveillance activities can be conducted. PMID:11781159

  16. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2003-01-01

    TD64, the Applied Fluid Dynamics Analysis Group, is one of several groups with high-fidelity fluids design and analysis expertise in the Space Transportation Directorate at Marshall Space Flight Center (MSFC). TD64 assists personnel working on other programs. The group participates in projects in the following areas: turbomachinery activities, nozzle activities, combustion devices, and the Columbia accident investigation.

  17. The Significance of Regional Analysis in Applied Geography.

    ERIC Educational Resources Information Center

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  18. Applying Systems Analysis to Program Failure in Organizations.

    ERIC Educational Resources Information Center

    Holt, Margaret E.; And Others

    1986-01-01

    Certain systems analysis techniques can be applied to examinations of program failure in continuing education to locate weaknesses in planning and implementing stages. Questions to guide an analysis and various procedures are recommended. Twelve issues that contribute to failures or discontinuations are identified. (Author/MLW)

  19. Socioeconomic Considerations in Dam Safety Risk Analysis.

    DTIC Science & Technology

    1987-06-01

    The analytical review and summary critique of literature related to risk analysis was conducted for the purpose of highlighting those ideas, concepts...alternative solutions. The critique of the philosophical and analytical bases of risk analysis was further directed toward the specific problem of dam...safety risk analysis. Dam safety is unique in that it represents an extreme situation characteristic of low-probability/high-consequence events

  20. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.

  1. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  2. Concepts in critical thinking applied to caries risk assessment in dental education.

    PubMed

    Guzman-Armstrong, Sandra; Warren, John J; Cunningham-Ford, Marsha A; von Bergmann, HsingChi; Johnsen, David C

    2014-06-01

    Much progress has been made in the science of caries risk assessment and ways to analyze caries risk, yet dental education has seen little movement toward the development of frameworks to guide learning and assess critical thinking in caries risk assessment. In the absence of a previously implemented learning framework that takes the knowledge of caries risk and applies it critically to the patient with the succinctness demanded in the clinical setting, the purpose of this study was to develop a model learning framework that combines the science of caries risk assessment with principles of critical thinking from the education literature. This article also describes the implementation of that model at one dental school and presents some preliminary assessment data.

  3. Analytic concepts for assessing risk as applied to human space flight

    SciTech Connect

    Garrick, B.J.

    1997-04-30

    Quantitative risk assessment (QRA) principles provide an effective framework for quantifying individual elements of risk, including the risk to astronauts and spacecraft of the radiation environment of space flight. The concept of QRA is based on a structured set of scenarios that could lead to different damage states initiated by either hardware failure, human error, or external events. In the context of a spacecraft risk assessment, radiation may be considered as an external event and analyzed in the same basic way as any other contributor to risk. It is possible to turn up the microscope on any particular contributor to risk and ask more detailed questions than might be necessary to simply assess safety. The methods of QRA allow for as much fine structure in the analysis as is desired. For the purpose of developing a basis for comprehensive risk management and considering the tendency to "fear anything nuclear," radiation risk is a prime candidate for examination beyond that necessary to answer the basic question of risk. Thus, rather than considering only the customary damage states of fatalities or loss of a spacecraft, it is suggested that the full range of damage be analyzed to quantify radiation risk. Radiation dose levels in the form of a risk curve accomplish such a result. If the risk curve is the complementary cumulative distribution function, then it answers the extended question of what is the likelihood of receiving a specific dose of radiation or greater. Such results can be converted to specific health effects as desired. Knowing the full range of the radiation risk of a space mission and the contributors to that risk provides the information necessary to take risk management actions [operational, design, scheduling of missions around solar particle events (SPE), etc.] that clearly control radiation exposure.
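    The risk-curve idea in the abstract is easy to sketch: when the curve is the complementary cumulative distribution function (CCDF), its value at dose d is the likelihood of receiving dose d or greater. A minimal empirical illustration with made-up dose samples (not from the paper):

```python
# Sketch of an empirical CCDF risk curve over sampled doses.
def ccdf(samples, d):
    """Fraction of sampled doses that are >= d."""
    return sum(1 for s in samples if s >= d) / len(samples)

doses = [0.1, 0.3, 0.3, 0.7, 1.2, 2.5]  # illustrative mission dose samples (Sv)
print(ccdf(doses, 0.5))  # likelihood of receiving 0.5 Sv or greater: 0.5
```

    Evaluating the function over a grid of doses traces out the full risk curve, which can then be mapped to health effects as the abstract suggests.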

  4. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Chueng, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. The paper demonstrates a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  5. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and...

  6. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  7. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and...

  8. Two-, Three-, and Four-Factor PCL-R Models in Applied Sex Offender Risk Assessments

    ERIC Educational Resources Information Center

    Weaver, Christopher M.; Meyer, Robert G.; Van Nort, James J.; Tristan, Luciano

    2006-01-01

    The authors compared 2-, 3-, 4-factor, and 2-factor/4-facet Psychopathy Checklist-Revised (PCL-R) models in a previously unpublished sample of 1,566 adult male sex offenders assessed under applied clinical conditions as part of a comprehensive state-mandated community notification risk assessment procedure. "Testlets" significantly…

  9. Carbon Fiber Risk Analysis. [conference

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The scope and status of the effort to assess the risks associated with the accidental release of carbon/graphite fibers from civil aircraft is presented. Vulnerability of electrical and electronic equipment to carbon fibers, dispersal of carbon fibers, effectiveness of filtering systems, impact of fiber induced failures, and risk methodology are among the topics covered.

  10. Topographic Avalanche Risk: DEM Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Nazarkulova, Ainura; Strobl, Josef

    2015-04-01

    GIS-based models are frequently used to assess the risk and trigger probabilities of (snow) avalanche releases, based on parameters and geomorphometric derivatives like elevation, exposure, slope, proximity to ridges and local relief energy. Numerous models, and model-based specific applications and project results have been published based on a variety of approaches and parametrizations as well as calibrations. Digital Elevation Models (DEM) come with many different resolution (scale) and quality (accuracy) properties, some of these resulting from sensor characteristics and DEM generation algorithms, others from different DEM processing workflows and analysis strategies. This paper explores the impact of using different types and characteristics of DEMs for avalanche risk modeling approaches, and aims at establishing a framework for assessing the uncertainty of results. The research question is derived from simply demonstrating the differences in release risk areas and intensities by applying identical models to DEMs with different properties, and then extending this into a broader sensitivity analysis. For the quantification and calibration of uncertainty parameters different metrics are established, based on simple value ranges, probabilities, as well as fuzzy expressions and fractal metrics. As a specific approach the work on DEM resolution-dependent 'slope spectra' is being considered and linked with the specific application of geomorphometry-based risk assessment. For the purpose of this study focusing on DEM characteristics, factors like land cover, meteorological recordings and snowpack structure and transformation are kept constant, i.e. not considered explicitly. Key aims of the research presented here are the development of a multi-resolution and multi-scale framework supporting the consistent combination of large area basic risk assessment with local mitigation-oriented studies, and the transferability of the latter into areas without availability of

  11. Alcohol Consumption and Gastric Cancer Risk: A Meta-Analysis

    PubMed Central

    Ma, Ke; Baloch, Zulqarnain; He, Ting-Ting; Xia, Xueshan

    2017-01-01

    Background We sought to determine by meta-analysis the relationship between drinking alcohol and the risk of gastric cancer. Material/Methods A systematic Medline search was performed to identify all published reports of drinking alcohol and the associated risk of gastric cancer. Initially we retrieved 2,494 studies, but after applying inclusion and exclusion criteria, only ten studies were found to be eligible for our meta-analysis. Results Our meta-analysis showed that alcohol consumption elevated the risk of gastric cancer with an odds ratio (OR) of 1.39 (95% CI 1.20–1.61). Additionally, subgroup analysis showed that only a nested case-control report from Sweden did not support this observation. Subgroup analysis of moderate drinking and heavy drinking also confirmed that drinking alcohol increased the risk of gastric cancer. Publication bias analysis (Begg’s and Egger’s tests) showed p values were more than 0.05, suggesting that the 10 articles included in our analysis did not have a publication bias. Conclusions The results from this meta-analysis support the hypothesis that alcohol consumption can increase the risk of gastric cancer; suggesting that effective moderation of alcohol drinking may reduce the risk of gastric cancer. PMID:28087989
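    The pooling step behind a meta-analysis like this one can be sketched as inverse-variance weighting of log odds ratios, a standard fixed-effect approach. The ORs and standard errors below are illustrative, not the ten studies analyzed in the paper.

```python
import math

# Fixed-effect pooling of study odds ratios by inverse-variance weighting
# on the log scale (illustrative inputs, not the actual studies).
def pooled_or(ors, ses):
    logs = [math.log(o) for o in ors]
    ws = [1.0 / se**2 for se in ses]           # inverse-variance weights
    pooled_log = sum(w * l for w, l in zip(ws, logs)) / sum(ws)
    se_pooled = 1.0 / math.sqrt(sum(ws))
    ci = (math.exp(pooled_log - 1.96 * se_pooled),
          math.exp(pooled_log + 1.96 * se_pooled))
    return math.exp(pooled_log), ci

or_hat, ci = pooled_or([1.5, 1.2, 1.4], [0.2, 0.15, 0.25])
print(round(or_hat, 2), [round(x, 2) for x in ci])
```

    Random-effects pooling, which the subgroup heterogeneity here might warrant, would additionally inflate each study's variance by a between-study component.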

  12. Applied behavior analysis: New directions from the laboratory

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1983-01-01

    Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior which occurs as a by-product of contingencies of reinforcement is discussed. Possible difficulties in treatment and management of induced behaviors are considered. Next, the correlation-based law of effect and the implications of relative reinforcement are explored in terms of applied examples. Relative rate of reinforcement is then extended to the literature dealing with concurrent operants. Concurrent operant models may describe human behavior of applied importance, and several techniques for modification of problem behavior are suggested. As a final concern, the paper discusses several new paradigms. While the practical importance of these models is not clear at the moment, it may be that new practical advantages will soon arise. Thus, it is argued that basic research continues to be of theoretical and practical importance to applied behavior analysis. PMID:22478574

  13. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. In an example, the results of traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively, for each individual failure mode, for a set of failure modes, and for the full analytical procedure.
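    The contrast the abstract describes can be sketched as follows; the scores and frequencies are illustrative, not taken from the NIR case study.

```python
# Traditional FMEA: RPN as the product of three categorical 1-10 scores.
def rpn(occurrence, detection, severity):
    return occurrence * detection * severity

print(rpn(4, 6, 8))  # 192

# Probabilistic modification: occurrence and detection become estimated
# relative frequencies, severity stays categorical. The product is then a
# severity-weighted frequency of failures that occur AND go undetected.
p_occurrence, p_undetected, severity = 0.02, 0.10, 8
print(p_occurrence * p_undetected * severity)  # 0.016
```

    Unlike the ordinal RPN, the probabilistic product has a direct frequency interpretation, so values can be summed meaningfully over a set of failure modes.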

  14. Predicting pathogen transport and risk of infection from land-applied biosolids

    NASA Astrophysics Data System (ADS)

    Olson, M. S.; Teng, J.; Kumar, A.; Gurian, P.

    2011-12-01

    Biosolids have been recycled as fertilizer to sustainably improve and maintain productive soils and to stimulate plant growth for over forty years, but may contain low levels of microbial pathogens. The Spreadsheet Microbial Assessment of Risk: Tool for Biosolids ("SMART Biosolids") is an environmental transport, exposure and risk model that compiles knowledge on the occurrence, environmental dispersion and attenuation of biosolids-associated pathogens to estimate microbial risk from biosolids land application. The SMART Biosolids model calculates environmental pathogen concentrations and assesses risk associated with exposure to pathogens from land-applied biosolids through five pathways: 1) inhalation of aerosols from land application sites, 2) consumption of groundwater contaminated by land-applied biosolids, 3) direct ingestion of biosolids-amended soils, 4) ingestion of plants contaminated by land-applied biosolids, and 5) consumption of surface water contaminated by runoff from a land application site. The SMART Biosolids model can be applied under a variety of scenarios, thereby providing insight into effective management practices. This study presents example results of the SMART Biosolids model, focusing on the groundwater and surface water pathways, following biosolids application to a typical site in Michigan. Volumes of infiltration and surface water runoff are calculated following a 100-year storm event. Pathogen transport and attenuation through the subsurface and via surface runoff are modeled, and pathogen concentrations in a downstream well and an adjacent pond are calculated. Risks are calculated for residents of nearby properties. 
For a 100-year storm event occurring immediately after biosolids application, the surface water pathway produces risks that may be of some concern, but best estimates do not exceed the bounds of what has been considered acceptable risk for recreational water use (Table 1); groundwater risks are very uncertain and at the
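    The final risk-calculation step in a pathway model like this is often the standard exponential dose-response model from quantitative microbial risk assessment, risk = 1 - exp(-r * dose). This generic sketch is not the SMART Biosolids implementation, and the parameter r and the dose are illustrative.

```python
import math

# Exponential dose-response model from QMRA; r is pathogen-specific.
def infection_risk(dose, r):
    """Probability of infection from ingesting `dose` organisms."""
    return 1.0 - math.exp(-r * dose)

# Illustrative: 10 organisms ingested from contaminated water, r = 0.005.
print(infection_risk(10.0, 0.005))  # ~0.049
```

    Chaining the transport model's downstream concentration with an ingestion volume yields the dose, which this function converts to a per-event infection probability.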

  15. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2002-01-01

    This viewgraph report presents an overview of activities and accomplishments of NASA's Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group. Expertise in this group focuses on high-fidelity fluids design and analysis with application to space shuttle propulsion and next generation launch technologies. Topics covered include: computational fluid dynamics research and goals, turbomachinery research and activities, nozzle research and activities, combustion devices, engine systems, MDA development and CFD process improvements.

  16. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Wang, Tee-See; Griffin, Lisa; Turner, James E. (Technical Monitor)

    2001-01-01

    This document is a presentation graphic which reviews the activities of the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center (i.e., Code TD64). The work of this group focused on supporting the space transportation programs. The work of the group is in Computational Fluid Dynamic tool development. This development is driven by hardware design needs. The major applications for the design and analysis tools are: turbines, pumps, propulsion-to-airframe integration, and combustion devices.

  17. Animal research in the Journal of Applied Behavior Analysis.

    PubMed

    Edwards, Timothy L; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say-do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by a chimpanzee, and unsafe trailer entry by horses) in ways that benefited the animals and the people in charge of them, and 1 described the use of trained rats that performed a service to humans (land-mine detection). We suggest that each of these general research areas merits further attention and that the Journal of Applied Behavior Analysis is an appropriate outlet for some of these publications.

  18. Resource allocation using risk analysis

    SciTech Connect

    Bott, T. F.; Eisenhawer, S. W.

    2003-01-01

    Allocating limited resources among competing priorities is an important problem in management. In this paper we describe an approach to resource allocation using risk as a metric. We call this approach the Logic-Evolved Decision (LED) approach because we use logic-models to generate an exhaustive set of competing options and to describe the often highly complex model used for evaluating the risk reduction achieved by different resource allocations among these options. The risk evaluation then proceeds using probabilistic or linguistic input data.

  19. 76 FR 30705 - Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-26

    ... AGENCY Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids... the availability of a final report titled, ``Problem Formulation for Human Health Risk Assessments of..., contractors, or other parties interested in conducting microbial risk assessments on land- applied...

  20. B. F. Skinner's Contributions to Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categorizes: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew…

  1. Positive Behavior Support and Applied Behavior Analysis: A Familial Alliance

    ERIC Educational Resources Information Center

    Dunlap, Glen; Carr, Edward G.; Horner, Robert H.; Zarcone, Jennifer R.; Schwartz, Ilene

    2008-01-01

    Positive behavior support (PBS) emerged in the mid-1980s as an approach for understanding and addressing problem behaviors. PBS was derived primarily from applied behavior analysis (ABA). Over time, however, PBS research and practice has incorporated evaluative methods, assessment and intervention procedures, and conceptual perspectives associated…

  2. Progressive-Ratio Schedules and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  3. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2004-01-01

    This paper presents viewgraphs on NASA Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group Activities. The topics include: 1) Status of programs at MSFC; 2) Fluid Mechanics at MSFC; 3) Relevant Fluid Dynamics Activities at MSFC; and 4) Shuttle Return to Flight.

  4. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  5. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
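    The central calculation of such an analysis, combining independent standard uncertainties in root-sum-square fashion and reporting an interval about the measured value, can be sketched as follows. The component values are illustrative, not taken from the presentation.

```python
import math

# Combine independent standard uncertainty components (root-sum-square).
def combined_uncertainty(components):
    return math.sqrt(sum(u**2 for u in components))

# Illustrative irradiance components: calibration, spectral, temporal (W/m^2).
u_c = combined_uncertainty([0.8, 0.5, 0.3])
measured = 1000.0  # W/m^2
k = 2              # coverage factor for roughly 95% confidence
print(f"{measured} +/- {k * u_c:.1f} W/m^2")
```

    The reported interval, measured value plus or minus k times the combined standard uncertainty, is exactly the "interval within which we believe the true value will lie" that the abstract describes.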

  6. Carbon Fiber Risk Analysis: Conclusions

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    It was concluded that preliminary estimates indicate that the public risk due to accidental release of carbon fiber from air transport aircraft is small. It was also concluded that further work is required to increase confidence in these estimates.

  7. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  8. Bridging the two cultures of risk analysis

    SciTech Connect

    Jasanoff, S.

    1993-04-01

    During the past 15 years, risk analysis has come of age as an interdisciplinary field of remarkable breadth, nurturing connections among fields as diverse as mathematics, biostatistics, toxicology, and engineering on one hand, and law, psychology, sociology, and economics on the other. In this editorial, the author addresses the question: What has the presence of social scientists in the network meant for the substantive development of the field of risk analysis? The answers offered here discuss the substantial progress in bridging the two cultures of risk analysis and emphasize the continuing need to monitor risk analysis. Topics include: the micro-worlds of risk assessment; constraining assumptions; and exchange programs. 14 refs.

  9. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  10. Initial Risk Analysis and Decision Making Framework

    SciTech Connect

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
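
    The propagation step described above can be sketched with a toy Monte Carlo model; every distribution and cost figure below is hypothetical, not a CCSI value:

```python
import random

def npv(cash_flows, rate):
    """Discount yearly cash flows (year 0 first) to present value."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate_net_returns(n_draws=10000, seed=42):
    """Draw uncertain inputs and propagate them to the NPV of a retrofit."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_draws):
        capital = rng.uniform(450e6, 650e6)    # retrofit cost, USD (hypothetical)
        price = rng.gauss(55.0, 8.0)           # electricity price, USD/MWh (hypothetical)
        annual_mwh = 650 * 8760 * 0.85         # 650 MW plant at an assumed 85% capacity factor
        net_cash = price * annual_mwh - 180e6  # yearly revenue minus fixed costs (hypothetical)
        results.append(npv([-capital] + [net_cash] * 20, rate=0.08))
    return results

returns = simulate_net_returns()
mean = sum(returns) / len(returns)
std = (sum((r - mean) ** 2 for r in returns) / len(returns)) ** 0.5  # risk as spread of net returns
```

    The standard deviation of the simulated NPVs plays the role of "variability in expected net returns" in the abstract.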

  11. Recent reinforcement-schedule research and applied behavior analysis

    PubMed Central

    Lattal, Kennon A.; Neef, Nancy A.

    1996-01-01

    Reinforcement schedules are considered in relation to applied behavior analysis by examining several recent laboratory experiments with humans and other animals. The experiments are drawn from three areas of contemporary schedule research: behavioral history effects on schedule performance, the role of instructions in schedule performance of humans, and dynamic schedules of reinforcement. All of the experiments are discussed in relation to the role of behavioral history in current schedule performance. The paper concludes by extracting from the experiments some more general issues concerning reinforcement schedules in applied research and practice. PMID:16795888

  12. Dealing with Uncertainty in Chemical Risk Analysis

    DTIC Science & Technology

    1988-12-01

    Dealing with Uncertainty in Chemical Risk Analysis. Thesis, David S. Clement, Captain, USAF, AFIT/GOR/MA/88D-2. Approved for public release; distribution unlimited. Presented to the Faculty...

  13. Analysis of labour risks in the Spanish industrial aerospace sector.

    PubMed

    Laguardia, Juan; Rubio, Emilio; Garcia, Ana; Garcia-Foncillas, Rafael

    2016-01-01

    Labour risk prevention is an activity integrated within Safety and Hygiene at Work in Spain. In 2003, the Electronic Declaration for Accidents at Work, Delt@ (DELTA), was introduced. The industrial aerospace sector is subject to various risks. Our objective is to analyse the Spanish Industrial Aerospace Sector (SIAS) using the ACSOM methodology to assess its labour risks and to prioritise preventive actions. The SIAS and the Services Subsector (SS) were defined and the relevant accident rate data were obtained. The ACSOM method was applied through double contrast (deviation and translocation) of the SIAS or SS risk polygon with the considered pattern: accidents from all sectors (ACSOM G) or the SIAS. A list of risks was obtained, ordered by action phases. In the SIAS vs. ACSOM G analysis, radiation risks ranked worst, followed by overstrains. Accidents caused by living beings were also significant in the SS vs. SIAS analysis; these results can be used to improve risk prevention. Radiation is the most significant risk in the SIAS and the SS, and the corresponding preventive actions will be primary and secondary. ACSOM has shown itself to be a valid tool for the analysis of labour risks.

  14. RiskSOAP: Introducing and applying a methodology of risk self-awareness in road tunnel safety.

    PubMed

    Chatzimichailidou, Maria Mikela; Dokas, Ioannis M

    2016-05-01

    Complex socio-technical systems, such as road tunnels, can be designed and developed with more or fewer elements that either positively or negatively affect the capability of their agents to recognise imminent threats or vulnerabilities that could lead to accidents. This capability is called risk Situation Awareness (SA) provision. Motivated by the need for better tools for designing and developing systems that are aware of their own vulnerabilities and react to prevent accidents and losses, this paper introduces the Risk Situation Awareness Provision (RiskSOAP) methodology to the field of road tunnel safety as a means of measuring this capability in such systems. The main objective is to test the soundness and applicability of RiskSOAP to infrastructure that is advanced in terms of technology, human integration, and the minimum safety requirements imposed by international bodies. RiskSOAP is applied to a specific road tunnel in Greece and the accompanying indicator is calculated twice: once for the tunnel design as defined by updated European safety standards, and once for the 'as-is' tunnel composition, which complies with the necessary safety requirements but calls for enhanced safety in line with what the EU and PIARC further suggest. The derived values indicate the extent to which each tunnel version is capable of comprehending its threats and vulnerabilities based on its elements. The former tunnel version appears to be superior in terms of both its risk awareness capability and safety. Another interesting finding is that, despite the advanced tunnel safety specifications, there is still room for enriching the safe design and maintenance of the road tunnel.

  15. CUMULATIVE RISK ANALYSIS FOR ORGANOPHOSPHORUS PESTICIDES

    EPA Science Inventory

    Cumulative Risk Analysis for Organophosphorus Pesticides
    R. Woodrow Setzer, Jr. NHEERL MD-74, USEPA, RTP, NC 27711

    The US EPA has recently completed a risk assessment of the effects of exposure to 33 organophosphorous pesticides (OPs) through the diet, water, and resi...

  16. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    PubMed

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015. The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists.

  17. Developing an interdisciplinary master's program in applied behavior analysis

    PubMed Central

    Lowenkron, Barry; Mitchell, Lynda

    1995-01-01

    At many universities, faculty interested in behavior analysis are spread across disciplines. This makes difficult the development of behavior-analytically oriented programs, and impedes regular contact among colleagues who share common interests. However, this separation by disciplines can be a source of strength if it is used to develop interdisciplinary programs. In this article we describe how a bottom-up strategy was used to develop two complementary interdisciplinary MS programs in applied behavior analysis, and conclude with a description of the benefits—some obvious, some surprising—that can emerge from the development of such programs. PMID:22478230

  18. Risk analysis of dust explosion scenarios using Bayesian networks.

    PubMed

    Yuan, Zhi; Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-02-01

    In this study, a methodology has been proposed for risk analysis of dust explosion scenarios based on Bayesian network. Our methodology also benefits from a bow-tie diagram to better represent the logical relationships existing among contributing factors and consequences of dust explosions. In this study, the risks of dust explosion scenarios are evaluated, taking into account common cause failures and dependencies among root events and possible consequences. Using a diagnostic analysis, dust particle properties, oxygen concentration, and safety training of staff are identified as the most critical root events leading to dust explosions. The probability adaptation concept is also used for sequential updating and thus learning from past dust explosion accidents, which is of great importance in dynamic risk assessment and management. We also apply the proposed methodology to a case study to model dust explosion scenarios, to estimate the envisaged risks, and to identify the vulnerable parts of the system that need additional safety measures.
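
    The diagnostic (posterior) analysis mentioned above can be illustrated with a toy three-node network; the probabilities and the noisy-AND gate below are hypothetical, not the paper's model:

```python
from itertools import product

# Prior probabilities of root events (hypothetical values)
P_dust = 0.05        # combustible dust cloud present
P_ignition = 0.10    # ignition source present
P_o2 = 0.90          # oxygen concentration sufficient

def p_explosion(dust, ignition, o2):
    """CPT for the top event: explosion requires all three factors (hypothetical gate)."""
    return 0.8 if (dust and ignition and o2) else 0.0

def joint(dust, ignition, o2, explosion):
    """Joint probability of one full assignment of the network's variables."""
    p = (P_dust if dust else 1 - P_dust)
    p *= (P_ignition if ignition else 1 - P_ignition)
    p *= (P_o2 if o2 else 1 - P_o2)
    pe = p_explosion(dust, ignition, o2)
    return p * (pe if explosion else 1 - pe)

# Marginal probability of an explosion, by enumeration over the root events
p_e = sum(joint(d, i, o, True) for d, i, o in product([True, False], repeat=3))

# Diagnostic analysis: posterior of a root event given that an explosion occurred
p_dust_given_e = sum(joint(True, i, o, True) for i, o in product([True, False], repeat=2)) / p_e
```

    Comparing posteriors such as `p_dust_given_e` across root events is how the most critical contributors are ranked.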

  19. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. We define valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
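
    As a worked illustration of combining uncertainties, here is a first-order, GUM-style root-sum-square propagation for uncorrelated inputs; the module numbers are invented:

```python
import math

def combined_uncertainty(sensitivities, uncertainties):
    """Root-sum-square combination of sensitivity-weighted standard uncertainties
    (first-order propagation for uncorrelated inputs)."""
    return math.sqrt(sum((c * u) ** 2 for c, u in zip(sensitivities, uncertainties)))

# Example: PV power P = V * I (illustrative numbers, not a real module)
V, I = 30.0, 8.0       # measured volts and amps
u_V, u_I = 0.15, 0.04  # standard uncertainties of the measurements
# Sensitivity coefficients: dP/dV = I and dP/dI = V
u_P = combined_uncertainty([I, V], [u_V, u_I])
```

    Here `u_P` defines the interval about the measured power within which the true value is believed to lie (at one standard uncertainty).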

  20. Cassini nuclear risk analysis with SPARRC

    NASA Astrophysics Data System (ADS)

    Ha, Chuong T.; Deane, Nelson A.

    1998-01-01

    The nuclear risk analysis of the Cassini mission is one of the most comprehensive risk analyses ever conducted for a space nuclear mission. The complexity of postulated accident scenarios and source term definitions, from launch to Earth swingby, has necessitated an extensive series of analyses in order to provide best-estimates of potential consequence results and bounding uncertainty intervals. The Space Accident Radiological Release and Consequence (SPARRC) family of codes, developed by Lockheed Martin to analyze polydispersed source terms and a combination of different atmospheric transport patterns, have been used for the Cassini Final Safety Analysis Report (FSAR). By identifying dominant contributors, the nuclear risk of each mission segment is understood with a high level of confidence. This paper provides the overall analysis process and insights developed from the risk analysis.

  1. Risk analysis approach. [of carbon fiber release

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    The assessment of the carbon fiber hazard is outlined. Program objectives, requirements of the risk analysis, and elements associated with the physical phenomena of the accidental release are described.

  2. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection

    PubMed Central

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658
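
    The core QMRA calculation can be sketched as dose estimation followed by a dose-response model; the exponential model is a standard QMRA choice, but the concentration, breathing rate, and `r` parameter below are hypothetical, not the study's values:

```python
import math

def inhaled_dose(conc_per_m3, inhalation_m3_per_h, hours):
    """Dose = airborne concentration x breathing rate x exposure time."""
    return conc_per_m3 * inhalation_m3_per_h * hours

def p_infection_exponential(dose, r):
    """Exponential dose-response model: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

# Illustrative occupational scenario (all numbers hypothetical)
dose = inhaled_dose(conc_per_m3=0.5, inhalation_m3_per_h=1.5, hours=8)
risk = p_infection_exponential(dose, r=0.4)  # r is a hypothetical pathogen-specific parameter
```

    Running the same calculation with setting-specific concentrations and exposure times is what allows the toilets, wastewater plants, and landfills to be ranked.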

  3. Pathogen risk assessment of land applied wastewater and biosolids: A fuzzy set approach

    SciTech Connect

    Dahab, M.F.; Fuerhacker, M.; Zibuschka, F.

    1998-07-01

    There are major concerns associated with land application of wastewater and biosolids, including the potential risk to public health from water-borne pollutants that may enter the food chain and from pathogens that may be present in the wastewater. These risks are of particular concern when wastewater is applied to land where crops are grown as part of the human food chain or when direct human contact with the wastewater may occur. In many communities, toxic chemicals may not be present in the biosolids, or their concentrations may be reduced through source control measures. However, pathogens that enter wastewater from infected individuals cannot be controlled at the source and are often found in wastewater or biosolids applied to land. Public health officials have emphasized that microbial pathogens (or pathogen indicators) should not occur in areas where exposure to humans is likely. Under this criterion, the concept of risk assessment, which requires the characterization of the occurrence of pathogens, almost seems contradictory to basic public health goals. As the understanding of pathogen and pathogen indicator occurrence becomes better refined, the arguments for finding practical application of risk assessment for pathogenic organisms become more compelling.

  4. Probabilistic risk assessment of veterinary medicines applied to four major aquaculture species produced in Asia.

    PubMed

    Rico, Andreu; Van den Brink, Paul J

    2014-01-15

    Aquaculture production constitutes one of the main sources of pollution with veterinary medicines into the environment. About 90% of the global aquaculture production is produced in Asia and the potential environmental risks associated with the use of veterinary medicines in Asian aquaculture have not yet been properly evaluated. In this study we performed a probabilistic risk assessment for eight different aquaculture production scenarios in Asia by combining up-to-date information on the use of veterinary medicines and aquaculture production characteristics. The ERA-AQUA model was used to perform mass balances of veterinary medicinal treatments applied to aquaculture ponds and to characterize risks for primary producers, invertebrates, and fish potentially exposed to chemical residues through aquaculture effluents. The mass balance calculations showed that, on average, about 25% of the applied drug mass to aquaculture ponds is released into the environment, although this percentage varies with the chemical's properties, the mode of application, the cultured species density, and the water exchange rates in the aquaculture pond scenario. In general, the highest potential environmental risks were calculated for parasitic treatments, followed by disinfection and antibiotic treatments. Pangasius catfish production in Vietnam, followed by shrimp production in China, constitute possible hot-spots for environmental pollution due to the intensity of the aquaculture production and considerable discharge of toxic chemical residues into surrounding aquatic ecosystems. A risk-based ranking of compounds is provided for each of the evaluated scenarios, which offers crucial information for conducting further chemical and biological field and laboratory monitoring research. In addition, we discuss general knowledge gaps and research priorities for performing refined risk assessments of aquaculture medicines in the near future.
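
    The mass-balance and risk-characterization logic can be sketched as follows; the release fractions and the PEC/PNEC figures are hypothetical illustrations, not ERA-AQUA outputs:

```python
def released_mass(applied_mass, absorbed_frac, degraded_frac):
    """Simple pond mass balance: drug mass neither taken up by the cultured
    species nor degraded leaves with the effluent (a deliberate simplification)."""
    return applied_mass * (1.0 - absorbed_frac - degraded_frac)

def risk_quotient(pec, pnec):
    """RQ = predicted environmental concentration / predicted no-effect
    concentration; RQ >= 1 flags a potential risk."""
    return pec / pnec

# An antibiotic treatment of 100 g applied to a pond (hypothetical fractions)
released = released_mass(applied_mass=100.0, absorbed_frac=0.45, degraded_frac=0.30)

# Risk characterization for primary producers (hypothetical concentrations, ug/L)
rq_algae = risk_quotient(pec=12.0, pnec=3.0)
```

    Repeating the RQ calculation per compound and per taxonomic group is what yields the risk-based ranking of compounds for each scenario.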

  5. Adversarial risk analysis for counterterrorism modeling.

    PubMed

    Rios, Jesus; Rios Insua, David

    2012-05-01

    Recent large-scale terrorist attacks have raised interest in models for resource allocation against terrorist threats. The unifying theme in this area is the need to develop methods for the analysis of allocation decisions when risks stem from the intentional actions of intelligent adversaries. Most approaches to these problems have a game-theoretic flavor although there are also several interesting decision-analytic-based proposals. One of them is the recently introduced framework for adversarial risk analysis, which deals with decision-making problems that involve intelligent opponents and uncertain outcomes. We explore how adversarial risk analysis addresses some standard counterterrorism models: simultaneous defend-attack models, sequential defend-attack-defend models, and sequential defend-attack models with private information. For each model, we first assess critically what would be a typical game-theoretic approach and then provide the corresponding solution proposed by the adversarial risk analysis framework, emphasizing how to coherently assess a predictive probability model of the adversary's actions, in a context in which we aim at supporting decisions of a defender versus an attacker. This illustrates the application of adversarial risk analysis to basic counterterrorism models that may be used as basic building blocks for more complex risk analysis of counterterrorism problems.
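
    A minimal sequential defend-attack calculation in the spirit described above: the defender maximizes expected utility against a predictive probability model of the attacker's action. All probabilities and utilities below are hypothetical:

```python
defenses = ["low", "high"]

# Defender's predictive model of the attacker: P(attack | defense chosen)
p_attack_given_d = {"low": 0.7, "high": 0.3}

# Defender's utility for each (defense, outcome) pair; "high" defense is costly
utility = {
    ("low", "attack"): -100.0, ("low", "refrain"): 0.0,
    ("high", "attack"): -40.0, ("high", "refrain"): -10.0,
}

def expected_utility(d):
    """Average the defender's utility over the predicted attacker response."""
    p = p_attack_given_d[d]
    return p * utility[(d, "attack")] + (1 - p) * utility[(d, "refrain")]

best_defense = max(defenses, key=expected_utility)
```

    The hard part in practice, which the framework addresses, is assessing `p_attack_given_d` coherently from a model of the adversary's own decision problem.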

  6. Fuzzy risk analysis for nuclear safeguards

    SciTech Connect

    Zardecki, A.

    1993-01-01

    Analysis of a safeguards system based on the notion of fuzzy sets and linguistic variables addresses concerns such as complexity and inherent imprecision in estimating the possibility of loss or compromise. The automated risk analysis allows the risk to be determined for an entire system based on estimates for the lowest-level components and the component proportion. In addition, for each component (asset), the most effective combination of protection mechanisms against a given set of threats is determined. A distinction between bare and featured risk is made.
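
    One common way to aggregate linguistic risk estimates is with triangular fuzzy numbers; the sketch below is a generic illustration of that idea (labels, values, and weights are hypothetical, not the paper's calibration):

```python
# Linguistic risk labels mapped to triangular fuzzy numbers (low, mode, high)
LABELS = {
    "low":    (0.0, 0.1, 0.3),
    "medium": (0.2, 0.5, 0.7),
    "high":   (0.6, 0.8, 1.0),
}

def weighted_fuzzy_risk(components):
    """Aggregate (label, proportion) pairs for lowest-level components into one
    system-level fuzzy risk via proportion-weighted sums of the parameters."""
    total = sum(w for _, w in components)
    agg = [0.0, 0.0, 0.0]
    for label, w in components:
        for i, v in enumerate(LABELS[label]):
            agg[i] += v * w / total
    return tuple(agg)

def defuzzify(tri):
    """Centroid of a triangular fuzzy number: a crisp risk estimate."""
    return sum(tri) / 3.0

system = weighted_fuzzy_risk([("high", 0.2), ("medium", 0.5), ("low", 0.3)])
crisp = defuzzify(system)
```
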

  7. RISK ANALYSIS, ANALYSIS OF VARIANCE: GETTING MORE FROM OUR DATA

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Analysis of variance (ANOVA) and regression are common statistical techniques used to analyze agronomic experimental data and determine significant differences among yields due to treatments or other experimental factors. Risk analysis provides an alternate and complementary examination of the same...

  8. Relative risk regression analysis of epidemiologic data.

    PubMed

    Prentice, R L

    1985-11-01

    Relative risk regression methods are described. These methods provide a unified approach to a range of data analysis problems in environmental risk assessment and in the study of disease risk factors more generally. Relative risk regression methods are most readily viewed as an outgrowth of Cox's regression and life model. They can also be viewed as a regression generalization of more classical epidemiologic procedures, such as that due to Mantel and Haenszel. In the context of an epidemiologic cohort study, relative risk regression methods extend conventional survival data methods and binary response (e.g., logistic) regression models by taking explicit account of the time to disease occurrence while allowing arbitrary baseline disease rates, general censorship, and time-varying risk factors. This latter feature is particularly relevant to many environmental risk assessment problems wherein one wishes to relate disease rates at a particular point in time to aspects of a preceding risk factor history. Relative risk regression methods also adapt readily to time-matched case-control studies and to certain less standard designs. The uses of relative risk regression methods are illustrated and the state of development of these procedures is discussed. It is argued that asymptotic partial likelihood estimation techniques are now well developed in the important special case in which the disease rates of interest have interpretations as counting process intensity functions. Estimation of relative risk processes corresponding to disease rates falling outside this class has, however, received limited attention. The general area of relative risk regression model criticism has, as yet, not been thoroughly studied, though a number of statistical groups are studying such features as tests of fit, residuals, diagnostics and graphical procedures. Most such studies have been restricted to exponential form relative risks, as have simulation studies of relative risk estimation.
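
    The classical Mantel-Haenszel procedure that relative risk regression generalizes can be sketched directly; the stratum counts below are hypothetical:

```python
def mantel_haenszel_rr(strata):
    """Pooled relative risk across strata; each stratum is a tuple of
    (exposed_cases, exposed_total, unexposed_cases, unexposed_total)."""
    num = den = 0.0
    for a, n1, c, n0 in strata:
        n = n1 + n0
        num += a * n0 / n
        den += c * n1 / n
    return num / den

# Two hypothetical strata (e.g. two age groups) from a cohort study
strata = [
    (10, 100, 5, 100),
    (20, 200, 8, 200),
]
rr = mantel_haenszel_rr(strata)
```

    Relative risk regression replaces this stratum-by-stratum pooling with a regression model on covariates, while retaining the relative-risk interpretation.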

  9. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft s power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn s internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  10. A risk assessment tool applied to the study of shale gas resources.

    PubMed

    Veiguela, Miguel; Hurtado, Antonio; Eguilior, Sonsoles; Recreo, Fernando; Roqueñi, Nieves; Loredo, Jorge

    2016-11-15

    The implementation of a risk assessment tool capable of evaluating the risks to health, safety and the environment (HSE) from the extraction of non-conventional fossil fuel resources by the hydraulic fracturing (fracking) technique can be useful for boosting the development and progress of the technology and for winning public trust and acceptance of it. At the early project stages, the lack of data related to the selection of non-conventional gas deposits makes it difficult to use existing approaches to the risk assessment of fluids injected into geologic formations. The qualitative risk assessment tool developed in this work is based on the premise that shale gas exploitation risk depends on both the geologic site and the technological aspects. It follows Oldenburg's 'Screening and Ranking Framework (SRF)', developed to evaluate potential geologic carbon dioxide (CO2) storage sites. Two global characteristics, (1) those centred on the natural aspects of the site and (2) those centred on the technological aspects of the project, are evaluated through user input of Property values, which define Attributes, which in turn define the Characteristics. To allow an individual evaluation of each characteristic and element of the model, the tool has been implemented in a spreadsheet. The proposed model has been applied to a site with potential for the exploitation of shale gas in Asturias (northwestern Spain) with three different technological options to test the approach.
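
    The Properties-to-Attributes-to-Characteristics roll-up described above can be sketched as weighted averaging; all scores and weights below are hypothetical, not the published SRF calibration:

```python
def attribute_score(properties):
    """Weighted mean of property scores; each property is (score, weight),
    with scores on a 0 (favourable) to 1 (unfavourable) scale."""
    total_w = sum(w for _, w in properties)
    return sum(s * w for s, w in properties) / total_w

def characteristic_score(attribute_scores):
    """A characteristic is summarized as the mean of its attribute scores."""
    return sum(attribute_scores) / len(attribute_scores)

# Site-centred characteristic built from two hypothetical geological attributes
seismicity = attribute_score([(0.2, 2.0), (0.4, 1.0)])
fault_density = attribute_score([(0.6, 1.0)])
site_score = characteristic_score([seismicity, fault_density])
```

    Comparing `site_score`-style roll-ups across candidate sites (and across technological options) is what supports screening and ranking.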

  11. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    NASA Astrophysics Data System (ADS)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.
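
    The PoF x CoF risk and inspection-interval logic can be sketched with a deliberately crude thinning model; the rates, thicknesses, and cost figures are hypothetical stand-ins for the DNV/API degradation models:

```python
def probability_of_failure(corrosion_rate_mm_y, thickness_mm, min_thickness_mm, years):
    """Crude PoF proxy: fraction of the corrosion allowance consumed after `years`."""
    consumed = corrosion_rate_mm_y * years
    allowance = thickness_mm - min_thickness_mm
    return min(1.0, max(0.0, consumed / allowance))

def risk(pof, cof_usd):
    """Risk = probability of failure x consequence of failure (financial loss)."""
    return pof * cof_usd

def next_inspection_year(corrosion_rate, thickness, min_thickness, pof_target):
    """Latest whole year at which the PoF proxy still stays below the target."""
    year = 0
    while probability_of_failure(corrosion_rate, thickness, min_thickness, year + 1) < pof_target:
        year += 1
    return year

# Tank floor, soil-side corrosion: 0.2 mm/y on a 12 mm plate with a 6 mm minimum
floor_risk = risk(probability_of_failure(0.2, 12.0, 6.0, 10), cof_usd=2.5e6)
```
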

  12. Risk analysis for plant-made vaccines.

    PubMed

    Kirk, Dwayne D; McIntosh, Kim; Walmsley, Amanda M; Peterson, Robert K D

    2005-08-01

    The production of vaccines in transgenic plants was first proposed in 1990; however, no product has yet reached commercialization. There are several risks during the production and delivery stages of this technology, with potential impact on the environment and on human health. Risks to the environment include gene transfer and exposure to antigens or selectable marker proteins. Risks to human health include oral tolerance, allergenicity, inconsistent dosage, worker exposure, and unintended exposure to antigens or selectable marker proteins in the food chain. These risks are controllable through appropriate regulatory measures at all stages of production and distribution of a potential plant-made vaccine. Successful use of this technology is highly dependent on stewardship and active risk management by its developers, and on quality standards for production, which will be set by regulatory agencies. Regulatory agencies can also negatively affect the future viability of this technology by requiring that all risks be controlled, or by applying conventional regulations that are overly cumbersome for a plant production and oral delivery system. The value of new or replacement vaccines produced in plant cells and delivered orally must be considered alongside the probability and severity of potential risks in their production and use, and the cost of not deploying this technology--the risk of continuing with the status quo alternative.

  13. Differential item functioning analysis by applying multiple comparison procedures.

    PubMed

    Eusebi, Paolo; Kreiner, Svend

    2015-01-01

    Analysis within a Rasch measurement framework aims at the development of valid and objective test scores. One requirement of both validity and objectivity is that items show no evidence of differential item functioning (DIF). A number of procedures exist for the assessment of DIF, including those based on analysis of contingency tables by Mantel-Haenszel tests and partial gamma coefficients. The aim of this paper is to illustrate multiple comparison procedures (MCP) for analysis of DIF relative to a variable defining a very large number of groups, with an unclear ordering with respect to the DIF effect. We propose a single-step procedure controlling the false discovery rate for DIF detection. The procedure applies to both dichotomous and polytomous items. In addition to providing evidence against a hypothesis of no DIF, the procedure also provides information on subsets of groups that are homogeneous with respect to the DIF effect. A stepwise MCP procedure for this purpose is also introduced.
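    The false-discovery-rate control mentioned above is commonly implemented with the Benjamini-Hochberg step-up procedure. The sketch below is a minimal, generic illustration of that procedure; the p-values are hypothetical per-group DIF test results, not data from the study.

```python
# Benjamini-Hochberg step-up procedure: a minimal sketch of false discovery
# rate (FDR) control across many simultaneous DIF comparisons.
# The p-values below are hypothetical partial-gamma test results.

def benjamini_hochberg(p_values, alpha=0.05):
    """Return sorted indices of hypotheses rejected at FDR level alpha."""
    m = len(p_values)
    # Sort p-values ascending, remembering original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Find the largest rank k (1-based) with p_(k) <= (k/m) * alpha.
    threshold_k = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * alpha:
            threshold_k = rank
    # Reject all hypotheses up to and including that rank.
    return sorted(order[:threshold_k])

p = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205, 0.212, 0.216]
print(benjamini_hochberg(p))  # → [0, 1]
```

    Note the step-up logic: even though several p-values are below 0.05, only the comparisons surviving the rank-scaled threshold are flagged, which is what keeps the expected proportion of false DIF findings at or below alpha.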

  14. Risk Analysis Training within the Army: Current Status, Future Trends,

    DTIC Science & Technology

    risk analysis. Since risk analysis training in the Army is ... become involved in risk analysis training. He reviews all risk analysis-related training done in any course at the Center. Also provided is information ... expected to use the training. Then the future trend in risk analysis training is presented. New courses, course changes and hardware/software changes that will make risk analysis more palatable are ...

  15. Activity anorexia: An interplay between basic and applied behavior analysis.

    PubMed

    Pierce, W D; Epling, W F; Dews, P B; Estes, W K; Morse, W H; Van Orman, W; Herrnstein, R J

    1994-01-01

    The relationship between basic research with nonhumans and applied behavior analysis is illustrated by our work on activity anorexia. When rats are fed one meal a day and allowed to run on an activity wheel, they run excessively, stop eating, and die of starvation. Convergent evidence, from several different research areas, indicates that the behavior of these animals and humans who self-starve is functionally similar. A biobehavioral theory of activity anorexia is presented that details the cultural contingencies, behavioral processes, and physiology of anorexia. Diagnostic criteria and a three-stage treatment program for activity-based anorexia are outlined. The animal model permits basic research on anorexia that for practical and ethical reasons cannot be conducted with humans. Thus, basic research can have applied importance.

  16. Applying risk and resilience models to predicting the effects of media violence on development.

    PubMed

    Prot, Sara; Gentile, Douglas A

    2014-01-01

    Although the effects of media violence on children and adolescents have been studied for over 50 years, they remain controversial. Much of this controversy is driven by a misunderstanding of causality that seeks the cause of atrocities such as school shootings. Luckily, several recent developments in risk and resilience theories offer a way out of this controversy. Four risk and resilience models are described, including the cascade model, dose-response gradients, pathway models, and turning-point models. Each is described and applied to the existing media effects literature. Recommendations for future research are discussed with regard to each model. In addition, we examine current developments in theorizing that stressors have sensitizing versus steeling effects and recent interest in biological and gene by environment interactions. We also discuss several of the cultural aspects that have supported the polarization and misunderstanding of the literature, and argue that applying risk and resilience models to the theories and data offers a more balanced way to understand the subtle effects of media violence on aggression within a multicausal perspective.

  17. Synchronisation and coupling analysis: applied cardiovascular physics in sleep medicine.

    PubMed

    Wessel, Niels; Riedl, Maik; Kramer, Jan; Muller, Andreas; Penzel, Thomas; Kurths, Jurgen

    2013-01-01

    Sleep is a physiological process with an internal program of a number of well-defined sleep stages and intermediate wakefulness periods. The sleep stages modulate the autonomous nervous system, and thereby the sleep stages are accompanied by different regulation regimes for the cardiovascular and respiratory systems. The differences in regulation can be distinguished by new techniques of cardiovascular physics. The number of patients suffering from sleep disorders increases disproportionately with the growth and aging of the human population, leading to very high expenses in the public health system. Therefore, the challenge of cardiovascular physics is to develop highly sophisticated methods which are able, on the one hand, to supplement and replace expensive medical devices and, on the other hand, to improve medical diagnostics while decreasing the patient's risk. Methods of cardiovascular physics are used to analyze heart rate, blood pressure and respiration to detect changes of the autonomous nervous system in different diseases. Data-driven modeling analysis, synchronization and coupling analysis, and their applications to biosignals in healthy subjects and patients with different sleep disorders are presented. Newly derived methods of cardiovascular physics can help to find indicators for these health risks.

  18. Cladistic analysis applied to the classification of volcanoes

    NASA Astrophysics Data System (ADS)

    Hone, D. W. E.; Mahony, S. H.; Sparks, R. S. J.; Martin, K. T.

    2007-11-01

    Cladistics is a systematic method of classification that groups entities on the basis of shared characteristics in the most parsimonious manner. Here cladistics is applied to the classification of volcanoes using a dataset of 59 Quaternary volcanoes and 129 volcanic edifices of the Tohoku region, Northeast Japan. Volcano and edifice characteristics recorded in the database include attributes of volcano size, chemical composition, dominant eruptive products, volcano morphology, dominant landforms, volcano age and eruptive history. Without characteristics related to time, the volcanic edifices divide into two groups, with characters related to volcano size, dominant composition and edifice morphology being the most diagnostic. Analysis including time-based characteristics yields four groups, with a good correlation between these groups and the two groups from the analysis without time for 108 of the 129 volcanic edifices. Thus, when characters are slightly changed, the volcanoes still form similar groupings. Analysis of the volcanoes both with and without time yields three groups based on compositional, eruptive-product and morphological characters. Spatial clusters of volcanic centres have been recognised in the Tohoku region by Tamura et al. (Earth Planet Sci Lett 197:105-106, 2002). The groups identified by cladistic analysis are distributed unevenly between the clusters, indicating a tendency for individual clusters to form similar kinds of volcanoes with distinctive but coherent styles of volcanism. The uneven distribution of volcano types between clusters can be explained by variations in dominant magma compositions through time, which are reflected in eruption products and volcanic landforms. Our exploratory study indicates that cladistics has promise as a method for classifying volcanoes and potentially elucidating dynamic igneous processes, and the approach could be applied to other regions and globally.
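    The character-matrix idea underlying the analysis above can be sketched very simply: entities are coded by discrete characters and compared by how many characters they share. The volcano names and codings below are hypothetical, and a real cladistic analysis searches for the most parsimonious tree rather than just pairwise similarity.

```python
import itertools

# A minimal character-matrix sketch: "volcanoes" coded by binary characters
# (e.g. size class, dominant composition, morphology), compared by Hamming
# distance. All names and codings are illustrative placeholders.

volcanoes = {
    "A": (1, 1, 0, 1, 0),
    "B": (1, 1, 0, 0, 0),
    "C": (0, 0, 1, 1, 1),
    "D": (0, 0, 1, 1, 0),
}

def hamming(u, v):
    """Number of characters in which two codings differ."""
    return sum(a != b for a, b in zip(u, v))

# Rank all pairs by character difference; the most similar pair groups first.
pairs = sorted(
    (hamming(u, v), n1, n2)
    for (n1, u), (n2, v) in itertools.combinations(volcanoes.items(), 2)
)
print(pairs[0][1:])  # → ('A', 'B')
```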

  19. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at the management level: qualitative assessment of risk criticality, in conjunction with risk consequence, likelihood, and severity, enables development of an "investment policy" for managing a portfolio of risks. Impact at the research level: quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results, and the quantitative assessment approach provides useful risk mitigation information.
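    The likelihood-consequence combination described above is often operationalized as a qualitative risk matrix. The sketch below is a generic illustration, not the RAIT procedure itself; the 1-5 scales, band thresholds, and example risks are all assumptions.

```python
# A minimal qualitative risk-matrix sketch: likelihood and consequence scores
# (1-5 each) are multiplied into a criticality score and binned into rating
# bands. Thresholds and example entries are illustrative assumptions.

def risk_rating(likelihood, consequence):
    """Map a (likelihood, consequence) pair to a qualitative rating."""
    score = likelihood * consequence  # ranges 1..25
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

portfolio = {"thruster seal leak": (4, 5), "telemetry dropout": (2, 2)}
for name, (lik, con) in sorted(portfolio.items()):
    print(f"{name}: {risk_rating(lik, con)}")
```

    Ranking a whole portfolio this way is what supports the "investment policy" view: mitigation effort flows first to the entries rated high.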

  20. A method for determining weights for excess relative risk and excess absolute risk when applied in the calculation of lifetime risk of cancer from radiation exposure.

    PubMed

    Walsh, Linda; Schneider, Uwe

    2013-03-01

    Radiation-related risks of cancer can be transported from one population to another population at risk, for the purpose of calculating lifetime risks from radiation exposure. Transfer via excess relative risks (ERR) or excess absolute risks (EAR) or a mixture of both (i.e., from the life span study (LSS) of Japanese atomic bomb survivors) has been done in the past based on qualitative weighting. Consequently, the values of the weights applied and the method of application of the weights (i.e., as additive or geometric weighted means) have varied both between reports produced at different times by the same regulatory body and also between reports produced at similar times by different regulatory bodies. Since the gender and age patterns are often markedly different between EAR and ERR models, it is useful to have an evidence-based method for determining the relative goodness of fit of such models to the data. This paper identifies a method, using Akaike model weights, which could aid expert judgment and be applied to help to achieve consistency of approach and quantitative evidence-based results in future health risk assessments. The results of applying this method to recent LSS cancer incidence models are that the relative EAR weighting by solid cancer site, on a scale of 0-1, is zero for breast and colon, 0.02 for all solid, 0.03 for lung, 0.08 for liver, 0.15 for thyroid, 0.18 for bladder and 0.93 for stomach. The EAR weighting for female breast cancer increases from 0 to 0.3 if a generally observed change in the trend between female age-specific breast cancer incidence rates and attained age, associated with menopause, is accounted for in the EAR model. Application of this method to preferred models from a study of multi-model inference from many models fitted to the LSS leukemia mortality data results in an EAR weighting of 0. From these results it can be seen that lifetime risk transfer is most highly weighted by EAR only for stomach cancer.
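    Akaike model weights, as used above, follow a standard formula: each model's weight is exp(-delta_i/2) normalised over all candidate models, where delta_i is that model's AIC minus the minimum AIC. A minimal sketch, using hypothetical AIC values rather than the LSS fits:

```python
import math

# Akaike model weights: given AIC values for competing models (e.g. an ERR
# and an EAR model fitted to the same data), compute the relative weight of
# each. The AIC values below are hypothetical illustrations.

def akaike_weights(aics):
    """Return normalised Akaike weights for a list of AIC values."""
    best = min(aics)
    rel_likelihoods = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(rel_likelihoods)
    return [r / total for r in rel_likelihoods]

w_err, w_ear = akaike_weights([1520.3, 1527.1])  # hypothetical ERR vs EAR
print(f"ERR weight = {w_err:.3f}, EAR weight = {w_ear:.3f}")
```

    The weights sum to one by construction, so they can be used directly as the mixing proportions when transferring risk between populations.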

  1. The ABC’s of Suicide Risk Assessment: Applying a Tripartite Approach to Individual Evaluations

    PubMed Central

    Harris, Keith M.; Syu, Jia-Jia; Lello, Owen D.; Chew, Y. L. Eileen; Willcox, Christopher H.; Ho, Roger H. M.

    2015-01-01

    There is considerable need for accurate suicide risk assessment for clinical, screening, and research purposes. This study applied the tripartite affect-behavior-cognition theory, the suicidal barometer model, classical test theory, and item response theory (IRT) to develop a brief self-report measure of suicide risk that is theoretically grounded, reliable, and valid. An initial survey (n = 359) employed an iterative process on an item pool, resulting in the six-item Suicidal Affect-Behavior-Cognition Scale (SABCS). Three additional studies tested the SABCS and a highly endorsed comparison measure. Studies included two online surveys (Ns = 1007 and 713) and one prospective clinical survey (n = 72; Time 2, n = 54). Factor analyses demonstrated SABCS construct validity through unidimensionality. Internal reliability was high (α = .86-.93, split-half = .90-.94). The scale was predictive of future suicidal behaviors and suicidality (r = .68, .73, respectively), showed convergent validity, and the SABCS-4 demonstrated clinically relevant sensitivity to change. IRT analyses revealed the SABCS captured more information than the comparison measure and better defined participants at low, moderate, and high risk. The SABCS is the first suicide risk measure to demonstrate no differential item functioning by sex, age, or ethnicity. In all comparisons, the SABCS showed incremental improvements over a highly endorsed scale through stronger predictive ability, reliability, and other properties. The SABCS is in the public domain, with this publication, and is suitable for clinical evaluations, public screening, and research. PMID:26030590
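    The internal-reliability coefficient reported above (Cronbach's alpha) has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch on simulated six-item responses (not SABCS data):

```python
import random

# Cronbach's alpha: a minimal sketch of the internal-consistency statistic.
# The six "items" below are simulated responses driven by a common latent
# score, purely for illustration.

def cronbach_alpha(items):
    """items: list of k lists, each holding one item's scores across respondents."""
    k, n = len(items), len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    total_scores = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(total_scores))

random.seed(42)
latent = [random.uniform(0, 4) for _ in range(60)]  # respondents' latent severity
items = [[t + random.gauss(0, 0.5) for t in latent] for _ in range(6)]
print(f"alpha = {cronbach_alpha(items):.2f}")  # high, since items share the latent score
```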

  2. Quantitative Microbial Risk Assessment Tutorial: Land-applied Microbial Loadings within a 12-Digit HUC

    EPA Science Inventory

    This tutorial reviews screens, icons, and basic functions of the SDMProjectBuilder (SDMPB). It demonstrates how one chooses a 12-digit HUC for analysis, performs an assessment of land-applied microbes by simulating microbial fate and transport using HSPF, and analyzes and visuali...

  3. Filling Terrorism Gaps: VEOs, Evaluating Databases, and Applying Risk Terrain Modeling to Terrorism

    SciTech Connect

    Hagan, Ross F.

    2016-08-29

    This paper aims to address three issues: the lack of literature differentiating terrorism and violent extremist organizations (VEOs), terrorism incident databases, and the applicability of Risk Terrain Modeling (RTM) to terrorism. Current open source literature and publicly available government sources do not differentiate between terrorism and VEOs; furthermore, they fail to define them. Addressing the lack of a comprehensive comparison of existing terrorism data sources, a matrix comparing a dozen terrorism databases is constructed, providing insight toward the array of data available. RTM, a method for spatial risk analysis at a micro level, has some applicability to terrorism research, particularly for studies looking at risk indicators of terrorism. Leveraging attack data from multiple databases, combined with RTM, offers one avenue for closing existing research gaps in terrorism literature.

  4. Confronting deep uncertainties in risk analysis.

    PubMed

    Cox, Louis Anthony

    2012-10-01

    How can risk analysts help to improve policy and decision making when the correct probabilistic relation between alternative acts and their probable consequences is unknown? This practical challenge of risk management with model uncertainty arises in problems from preparing for climate change to managing emerging diseases to operating complex and hazardous facilities safely. We review constructive methods for robust and adaptive risk analysis under deep uncertainty. These methods are not yet as familiar to many risk analysts as older statistical and model-based methods, such as the paradigm of identifying a single "best-fitting" model and performing sensitivity analyses for its conclusions. They provide genuine breakthroughs for improving predictions and decisions when the correct model is highly uncertain. We demonstrate their potential by summarizing a variety of practical risk management applications.

  5. Compatibility of person-centered planning and applied behavior analysis

    PubMed Central

    Holburn, Steve

    2001-01-01

    In response to Osborne (1999), the aims and practices of person-centered planning (PCP) are compared to the basic principles of applied behavior analysis set forth by Baer, Wolf, and Risley (1968, 1987). The principal goal of PCP is social integration of people with disabilities; it qualifies as a socially important behavior, and its problems have been displayed sufficiently. However, social integration is a complex social problem whose solution requires access to system contingencies that influence lifestyles. Nearly all of the component goals of PCP proposed by O'Brien (1987b) have been reliably quantified, although concurrent measurement of outcomes such as friendship, autonomy, and respect presents a formidable challenge. Behavioral principles such as contingency and contextual control are operative within PCP, but problems in achieving reliable implementation appear to impede an experimental analysis. PMID:22478371

  6. Sensitivity analysis applied to stalled airfoil wake and steady control

    NASA Astrophysics Data System (ADS)

    Patino, Gustavo; Gioria, Rafael; Meneghini, Julio

    2014-11-01

    The sensitivity of an eigenvalue to base-flow modifications induced by an external force is applied to the global unstable modes associated with the onset of vortex shedding in the wake of a stalled airfoil. In this work, the flow regime is close to the first instability of the system, and its associated eigenvalue/eigenmode is determined. The sensitivity analysis with respect to a general pointwise external force establishes the regions where control devices must be placed in order to stabilize the global modes. Different types of steady control devices, passive and active, are placed in the regions predicted by the sensitivity analysis to verify suppression of vortex shedding, i.e., that the primary instability bifurcation is delayed. The new eigenvalue, modified by the action of the device, is also calculated. Finally, the spectral finite element method is employed to determine flow characteristics before and after the bifurcation in order to cross-check the results.

  7. Classical mechanics approach applied to analysis of genetic oscillators.

    PubMed

    Vasylchenkova, Anastasiia; Mraz, Miha; Zimic, Nikolaj; Moskon, Miha

    2016-04-05

    Biological oscillators present a fundamental part of several regulatory mechanisms that control the response of various biological systems. Several analytical approaches for their analysis have been reported recently. They are, however, limited to only specific oscillator topologies and/or to giving only qualitative answers, i.e., whether the dynamics of an oscillator in a given parameter space is oscillatory or not. Here we present a general analytical approach that can be applied to the analysis of biological oscillators. It relies on the projection of biological systems onto classical mechanics systems. The approach is able to provide relatively accurate results about the type of behaviour a system exhibits (i.e., oscillatory or not) and the periods of potential oscillations, without the need to conduct expensive numerical simulations. We demonstrate and verify the proposed approach on three different implementations of the amplified negative feedback oscillator.

  8. Risk assessment of land-applied biosolids-borne triclocarban (TCC).

    PubMed

    Snyder, Elizabeth Hodges; O'Connor, George A

    2013-01-01

    Triclocarban (TCC) is monitored under the USEPA High Production Volume (HPV) chemical program and is predominantly used as the active ingredient in select antibacterial bar soaps and other personal care products. The compound commonly occurs at parts-per-million concentrations in processed wastewater treatment residuals (i.e. biosolids), which are frequently land-applied as fertilizers and soil conditioners. Human and ecological risk assessment parameters measured by the authors in previous studies were integrated with existing data to perform a two-tiered human health and ecological risk assessment of land-applied biosolids-borne TCC. The 14 exposure pathways identified in the Part 503 Biosolids Rule were expanded, and conservative screening-level hazard quotients (HQ values) were first calculated to estimate risk to humans and a variety of terrestrial and aquatic organisms (Tier 1). The majority of biosolids-borne TCC exposure pathways resulted in no screening-level HQ values indicative of significant risks to exposed organisms (including humans), even under worst-case land application scenarios. The two pathways for which the conservative screening-level HQ values exceeded one (i.e. Pathway 10: biosolids➔soil➔soil organism➔predator, and Pathway 16: biosolids➔soil➔surface water➔aquatic organism) were then reexamined using modified parameters and scenarios (Tier 2). Adjusted HQ values remained greater than one for Exposure Pathway 10, with the exception of the final adjusted HQ values under a one-time 5 Mg ha(-1) (agronomic) biosolids loading rate scenario for the American woodcock (Scolopax minor) and short-tailed shrew (Blarina brevicauda). Results were used to prioritize recommendations for future biosolids-borne TCC research, which include additional measurements of toxicological effects and TCC concentrations in environmental matrices at the field level.
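    The screening step described above rests on the hazard quotient: HQ = estimated exposure dose / reference dose, with HQ > 1 flagging a pathway for refined (Tier 2) assessment. The sketch below illustrates that logic; the pathway names echo the abstract, but the dose values are hypothetical placeholders, not measured TCC data.

```python
# Screening-level hazard quotient (HQ) sketch. An HQ above 1 does not prove
# harm; it means conservative assumptions could not rule it out, so the
# pathway is re-examined with refined parameters. Doses are hypothetical.

def hazard_quotient(exposure_dose, reference_dose):
    """HQ = exposure dose / reference dose (same units for both)."""
    return exposure_dose / reference_dose

pathways = {
    "biosolids -> soil -> soil organism -> predator": (0.8, 0.25),
    "biosolids -> soil -> surface water -> aquatic organism": (0.02, 0.05),
}
for name, (dose, ref) in pathways.items():
    hq = hazard_quotient(dose, ref)
    flag = "refine in Tier 2" if hq > 1 else "screened out"
    print(f"{name}: HQ = {hq:.2f} ({flag})")
```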

  9. Two-, three-, and four-factor PCL-R models in applied sex offender risk assessments.

    PubMed

    Weaver, Christopher M; Meyer, Robert G; Van Nort, James J; Tristan, Luciano

    2006-06-01

    The authors compared 2-, 3-, 4-factor, and 2-factor/4-facet Psychopathy Checklist-Revised (PCL-R) models in a previously unpublished sample of 1,566 adult male sex offenders assessed under applied clinical conditions as part of a comprehensive state-mandated community notification risk assessment procedure. "Testlets" significantly improved the performance of all models. The 3-factor model provided the best fit to the current data, followed by the 2-factor/4-facet model. The 2-factor model was not supported.

  10. Common Methods for Security Risk Analysis

    DTIC Science & Technology

    2007-11-02

    ... Workshops was particularly influential among Canadian tool-designers in the late 1980s. These models generally favour a software tool solution simply ... tools that have too small a market to justify extensive software development. Also, most of the risk management standards that came out at this ... companies developing specialized risk analysis tools, such as the Vulcanizer project of DOMUS Software Inc. The latter incorporated fuzzy logic to ...

  11. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and therefore it is necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical and probabilistic bases. The basic element is the attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, (40)K contaminant measurements are adapted for a risk assessment application after derivation of the necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the form of maps and three-dimensional surfaces.
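    The core step of attaching a pdf to a measurement sequence and reading off a risk can be sketched with a lognormal assumption: fit the distribution to the data and compute the probability of exceeding a threshold. The activity values, the lognormal choice, and the threshold below are illustrative assumptions, not data or methods from the study.

```python
import math

# Attach a probability distribution to a contaminant measurement sequence and
# read off an exceedance risk. Measurements are assumed lognormal; the (40)K
# activity values and the threshold are hypothetical.

def lognormal_exceedance(measurements, threshold):
    """Fit a lognormal by moments of log-data; return P(X > threshold)."""
    logs = [math.log(x) for x in measurements]
    mu = sum(logs) / len(logs)
    var = sum((v - mu) ** 2 for v in logs) / (len(logs) - 1)
    sigma = math.sqrt(var)
    z = (math.log(threshold) - mu) / sigma
    # Survival function of the standard normal via the complementary error function.
    return 0.5 * math.erfc(z / math.sqrt(2))

activity = [310.0, 420.0, 365.0, 510.0, 280.0, 390.0]  # hypothetical, e.g. Bq/kg
print(f"P(activity > 500) = {lognormal_exceedance(activity, 500.0):.3f}")
```

    Mapping this exceedance probability over a sampling grid is one way to arrive at the kind of spatial risk surfaces the abstract describes.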

  12. Shape analysis applied in heavy ion reactions near Fermi energy

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Huang, M.; Wada, R.; Liu, X.; Lin, W.; Wang, J.

    2017-03-01

    A new method is proposed to perform shape analyses and to evaluate their validity in heavy ion collisions near the Fermi energy. In order to avoid erroneous values of shape parameters in the calculation, a test particle method is utilized in which each nucleon is represented by n test particles, similar to that used in the Boltzmann–Uehling–Uhlenbeck (BUU) calculations. The method is applied to the events simulated by an antisymmetrized molecular dynamics model. The geometrical shape of fragments is reasonably extracted when n = 100 is used. A significant deformation is observed for all fragments created in the multifragmentation process. The method is also applied to the shape of the momentum distribution for event classification. In the momentum case, the errors in the eigenvalue calculation become much smaller than those of the geometrical shape analysis and the results become similar between those with and without the test particle method, indicating that in intermediate heavy ion collisions the shape analysis of momentum distribution can be used for the event classification without the test particle method.
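    The test-particle idea can be illustrated in two dimensions: each nucleon is replaced by n smeared test particles, and the eigenvalues of the gyration tensor of the resulting cloud quantify the deformation. Everything below (2D instead of 3D, the positions, the smearing width, n = 100) is an illustrative assumption, not the AMD/BUU machinery of the study.

```python
import math
import random

# 2D test-particle shape sketch: smear each "nucleon" into 100 test particles
# and diagonalise the gyration tensor. For a symmetric 2x2 tensor the
# eigenvalues have a closed form, so no linear-algebra library is needed.

def gyration_eigenvalues(points):
    """Eigenvalues (largest first) of the 2D gyration tensor of a point cloud."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((p[0] - cx) ** 2 for p in points) / n
    syy = sum((p[1] - cy) ** 2 for p in points) / n
    sxy = sum((p[0] - cx) * (p[1] - cy) for p in points) / n
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    d = math.sqrt(max(tr * tr / 4 - det, 0.0))
    return tr / 2 + d, tr / 2 - d

random.seed(0)
# An elongated "fragment": 12 nucleons on a 2:1 ellipse.
nucleons = [(2.0 * math.cos(t), 1.0 * math.sin(t))
            for t in [k * 2 * math.pi / 12 for k in range(12)]]
# Each nucleon becomes 100 Gaussian-smeared test particles.
test_particles = [(x + random.gauss(0, 0.1), y + random.gauss(0, 0.1))
                  for (x, y) in nucleons for _ in range(100)]
lam1, lam2 = gyration_eigenvalues(test_particles)
print(f"aspect ratio ~ {math.sqrt(lam1 / lam2):.2f}")  # ~2 for a 2:1 ellipse
```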

  13. Risk Interfaces to Support Integrated Systems Analysis and Development

    NASA Technical Reports Server (NTRS)

    Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark; Anton, Wilma; Havenhill, Maria

    2016-01-01

    Objectives for the systems analysis capability: develop an integrated understanding of how a complex human physiological-socio-technical mission system behaves in spaceflight. Why? To support development of integrated solutions that prevent unwanted outcomes (implementable approaches to minimize mission resources such as mass, power, and crew time), and to support development of tools for autonomy, needed for exploration (assess and maintain resilience of individuals, teams, and the integrated system). Outputs of this exercise: a representation of interfaces based on Human System Risk Board (HSRB) Risk Summary information and simple status based on the Human Research Roadmap; consolidated HSRB information applied to support communication; a point of departure for HRP Element planning; and the ability to track and communicate the status of collaborations.

  14. Climate change, land slide risks and sustainable development, risk analysis and decision support process tool

    NASA Astrophysics Data System (ADS)

    Andersson-sköld, Y. B.; Tremblay, M.

    2011-12-01

    Climate change is expected, in most parts of Sweden, to result in increased precipitation and increased sea water levels, causing flooding, erosion, slope instability and related secondary consequences. Landslide risks are expected to increase with climate change in large parts of Sweden due to increased annual precipitation, more intense precipitation and increased flows combined with drier summers. In response to these potential climate-related risks, and on the commission of the Ministry of Environment, the Swedish Geotechnical Institute (SGI) is at present performing a risk analysis project for the most prominent landslide risk area in Sweden: the Göta river valley. As part of this, a methodology for landslide ex-ante consequence analysis, today and in a future climate, has been developed and applied in the Göta river valley. Human life, settlements, industry, contaminated sites and infrastructure of national importance are inventoried and assessed as important elements at risk. The goal of the consequence analysis is to produce a map of geographically distributed expected losses, which can be combined with a corresponding map displaying landslide probability to describe the risk (the combination of the probability and consequence of a negative event). The risk analysis is GIS-aided, presenting and visualising the risk and using existing databases to quantify the consequences, represented by ex-ante estimated monetary losses. The results will be used at national and regional levels, and as an indication of risk at the local level, to assess the need for measures to mitigate the risk. The costs and the environmental and social impacts of mitigating the risk are expected to be very high, but the costs and impacts of a severe landslide are expected to be even higher. Therefore, civil servants have expressed a need for tools to assess both the vulnerability and a more holistic picture of the impacts of climate change adaptation measures. At SGI a tool for the inclusion of sustainability

  15. Automated SEM Modal Analysis Applied to the Diogenites

    NASA Technical Reports Server (NTRS)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  16. Causal modelling applied to the risk assessment of a wastewater discharge.

    PubMed

    Paul, Warren L; Rokahr, Pat A; Webb, Jeff M; Rees, Gavin N; Clune, Tim S

    2016-03-01

    Bayesian networks (BNs), or causal Bayesian networks, have become quite popular in ecological risk assessment and natural resource management because of their utility as a communication and decision-support tool. Since their development in the field of artificial intelligence in the 1980s, however, Bayesian networks have evolved and merged with structural equation modelling (SEM). Unlike BNs, which are constrained to encode causal knowledge in conditional probability tables, SEMs encode this knowledge in structural equations, which is thought to be a more natural language for expressing causal information. This merger has clarified the causal content of SEMs and generalised the method such that it can now be performed using standard statistical techniques. As it was with BNs, the utility of this new generation of SEM in ecological risk assessment will need to be demonstrated with examples to foster an understanding and acceptance of the method. Here, we applied SEM to the risk assessment of a wastewater discharge to a stream, with a particular focus on the process of translating a causal diagram (conceptual model) into a statistical model which might then be used in the decision-making and evaluation stages of the risk assessment. The process of building and testing a spatial causal model is demonstrated using data from a spatial sampling design, and the implications of the resulting model are discussed in terms of the risk assessment. It is argued that a spatiotemporal causal model would have greater external validity than the spatial model, enabling broader generalisations to be made regarding the impact of a discharge, and greater value as a tool for evaluating the effects of potential treatment plant upgrades. Suggestions are made on how the causal model could be augmented to include temporal as well as spatial information, including suggestions for appropriate statistical models and analyses.
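    The idea of encoding causal knowledge in structural equations, rather than conditional probability tables, can be sketched with a simple causal chain: discharge -> nutrient concentration -> algal biomass, each link a linear structural equation fitted by least squares. The variable names, simulated data, and coefficients below are hypothetical illustrations, not the stream model from the study.

```python
import random

# A minimal structural-equation sketch of a causal chain. Each structural
# equation is linear and fitted by simple least squares; the product of the
# two path coefficients estimates the indirect (mediated) effect.

def ols_slope(x, y):
    """Least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

random.seed(1)
discharge = [random.uniform(0, 10) for _ in range(200)]               # exogenous
nutrient = [0.8 * d + random.gauss(0, 0.5) for d in discharge]        # eq. 1
algae = [1.5 * nu + random.gauss(0, 0.5) for nu in nutrient]          # eq. 2

a = ols_slope(discharge, nutrient)  # discharge -> nutrient path
b = ols_slope(nutrient, algae)      # nutrient -> algae path
print(f"indirect effect of discharge on algae ~ {a * b:.2f}")  # ~ 0.8 * 1.5
```

    In a risk-assessment setting the same decomposition is what makes the model useful for evaluation: an upgrade that halves the first path coefficient halves the predicted indirect effect on the endpoint.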

  17. Risk of boron and heavy metal pollution from agro-industrial wastes applied for plant nutrition.

    PubMed

    Seçer, Müzeyyen; Ceylan, Safak; Elmaci, Omer Lütfü; Akdemir, Hüseyin

    2010-09-01

    In this study, the effects of various agro-industrial wastes, applied to soil alone or in combination with chemical fertilizers, were investigated with regard to the risks of boron and heavy metal pollution of soils and plants. Nine combinations of production residues from various agro-industries, urban wastes, and mineral fertilizers were applied to potatoes in a field experiment. The content of available boron in the soil differed significantly (p < 0.05) among the applications. Generally, B values were found to be slightly higher when soapstock, prina, and blood were used alone or in combination. Although the total Co, Cd, and Pb contents of soils showed no significant differences between the applications, Cr content differed significantly (p < 0.05). No pollution risk was observed in soil with respect to total Co, Cd, Pb, and Cr contents. The amounts of boron and heavy metals in leaves showed no significant differences among the applications. Cobalt, Cd, and Pb in leaves were at normal levels, whereas Cr was slightly above normal but well under the critical level. Boron was low in tubers and varied significantly between applications, as did Co and Cd. The Co content of tubers was high, Cd and Cr contents were below average, and Pb content was between the given values. Some significant correlations were found between soil characteristics and the boron and heavy metal content of soil, leaves, and tubers.

  18. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A... § 417.107(b) for debris. A debris risk analysis must account for risk to populations on land,...

  19. From pathways to people: applying the adverse outcome pathway (AOP) for skin sensitization to risk assessment.

    PubMed

    MacKay, Cameron; Davies, Michael; Summerfield, Vicki; Maxwell, Gavin

    2013-01-01

    Consumer safety risk assessment of skin sensitization requires information on both consumer exposure to the ingredient through product use and the hazardous properties of the ingredient. Significant progress has been made in determining the hazard potential of ingredients without animal testing. However, hazard identification is insufficient for risk assessment, and an understanding of the dose-response is needed. Obtaining such knowledge without animal testing is challenging and requires applying available mechanistic knowledge to both assay development and the integration of these data. The recent OECD report "The Adverse Outcome Pathway for Skin Sensitization Initiated by Covalent Binding to Proteins" presents the available mechanistic knowledge of the sensitization response within an adverse outcome pathway (AOP). We propose to use this AOP as the mechanistic basis for physiologically- and mechanistically-based toxicokinetic-toxicodynamic models of the sensitization response. The approach would be informed by non-animal data, provide predictions of the dose-response required for risk assessment, and would be evaluated against human clinical data.

  20. Use, fate and ecological risks of antibiotics applied in tilapia cage farming in Thailand.

    PubMed

    Rico, Andreu; Oliveira, Rhaul; McDonough, Sakchai; Matser, Arrienne; Khatikarn, Jidapa; Satapornvanit, Kriengkrai; Nogueira, António J A; Soares, Amadeu M V M; Domingues, Inês; Van den Brink, Paul J

    2014-08-01

    The use, environmental fate and ecological risks of antibiotics applied in tilapia cage farming were investigated in the Tha Chin and Mun rivers in Thailand. Information on antibiotic use was collected through interviewing 29 farmers, and the concentrations of the most commonly used antibiotics, oxytetracycline (OTC) and enrofloxacin (ENR), were monitored in river water and sediment samples. Moreover, we assessed the toxicity of OTC and ENR on tropical freshwater invertebrates and performed a risk assessment for aquatic ecosystems. All interviewed tilapia farmers reported routinely using antibiotics. Peak water concentrations for OTC and ENR were 49 and 1.6 μg/L, respectively. Antibiotics were most frequently detected in sediments with concentrations up to 6908 μg/kg d.w. for OTC, and 2339 μg/kg d.w. for ENR. The results of this study indicate insignificant short-term risks for primary producers and invertebrates, but suggest that the studied aquaculture farms constitute an important source of antibiotic pollution.

  1. An integrated risk analysis methodology in a multidisciplinary design environment

    NASA Astrophysics Data System (ADS)

    Hampton, Katrina Renee

    Design of complex, one-of-a-kind systems, such as space transportation systems, is characterized by high uncertainty and, consequently, high risk. It is necessary to account for these uncertainties in the design process to produce systems that are more reliable. Systems designed by including and managing uncertainties are more robust and less prone to poor operations as a result of parameter variability. The quantification, analysis and mitigation of uncertainties are challenging tasks as many systems lack historical data. In such an environment, risk or uncertainty quantification becomes subjective because input data is based on professional judgment. Additionally, there are uncertainties associated with the analysis tools and models. Both the input data and the model uncertainties must be considered for a multidisciplinary, systems-level risk analysis. This research synthesizes an integrated approach for developing a method for risk analysis. Expert judgment methodology is employed to quantify external risk. This methodology is then combined with a Latin Hypercube Sampling - Monte Carlo simulation to propagate uncertainties across a multidisciplinary environment for the overall system. Finally, a robust design strategy is employed to mitigate risk during the optimization process. This type of approach to risk analysis is conducive to the examination of quantitative risk factors. The core of this research methodology is the theoretical framework for uncertainty propagation. The research is divided into three stages or modules. The first two modules include the identification/quantification and propagation of uncertainties. The third module involves the management of uncertainties or response optimization. This final module also incorporates the integration of risk into program decision-making. The risk analysis methodology is applied to a launch vehicle conceptual design study at NASA Langley Research Center. The launch vehicle multidisciplinary
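
    The Latin Hypercube Sampling - Monte Carlo propagation step can be sketched in a few lines of numpy. The uncertain inputs below (a specific impulse and a propellant mass fraction, fed through the ideal rocket equation) are invented for illustration and are not the study's actual design variables.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)

def latin_hypercube(n_samples, n_dims, rng):
    """One stratified uniform draw per equal-probability bin in each dimension."""
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        rng.shuffle(u[:, j])                 # decouple the dimensions
    return np.clip(u, 1e-12, 1.0 - 1e-12)

# Hypothetical uncertain inputs: Isp ~ N(450 s, 5 s), mass fraction ~ N(0.85, 0.01).
n = 5000
u = latin_hypercube(n, 2, rng)
isp = np.array([NormalDist(450.0, 5.0).inv_cdf(p) for p in u[:, 0]])
pmf = np.array([NormalDist(0.85, 0.01).inv_cdf(p) for p in u[:, 1]])

# Propagate through a simple response model: ideal rocket-equation delta-v (m/s).
g0 = 9.80665
dv = isp * g0 * -np.log(1.0 - pmf)

print(dv.mean(), dv.std())
```

    Compared with plain Monte Carlo, the stratification guarantees each input's distribution is covered evenly, so the output statistics converge with far fewer samples.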

  2. Applied Behavior Analysis is a Science and, Therefore, Progressive.

    PubMed

    Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane

    2016-02-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD.

  3. Evaluating the effectiveness of teacher training in Applied Behaviour Analysis.

    PubMed

    Grey, Ian M; Honan, Rita; McClean, Brian; Daly, Michael

    2005-09-01

    Interventions for children with autism based upon Applied Behaviour Analysis (ABA) have been repeatedly shown to be related both to educational gains and to reductions in challenging behaviours. However, to date, comprehensive training in ABA for teachers and others has been limited. Over 7 months, 11 teachers undertook 90 hours of classroom instruction and supervision in ABA. Each teacher conducted a comprehensive functional assessment and designed a behaviour support plan targeting one behaviour for one child with an autistic disorder. Target behaviours included aggression, non-compliance and specific educational skills. Teachers recorded observational data for the target behaviour for both baseline and intervention sessions. Support plans produced an average 80 percent change in frequency of occurrence of target behaviours. Questionnaires completed by parents and teachers at the end of the course indicated a beneficial effect for the children and the educational environment. The potential benefits of teacher implemented behavioural intervention are discussed.

  4. Statistical model applied to motor evoked potentials analysis.

    PubMed

    Ma, Ying; Thakor, Nitish V; Jia, Xiaofeng

    2011-01-01

    Motor evoked potentials (MEPs) convey information regarding the functional integrity of the descending motor pathways. Absence of the MEP has been used as a neurophysiological marker to suggest cortico-spinal abnormalities in the operating room. Due to their high variability and sensitivity, detailed quantitative studies of MEPs are lacking. This paper applies a statistical method to characterize MEPs by estimating the number of motor units and single motor unit potential amplitudes. A clearly increasing trend of single motor unit potential amplitudes in the MEPs after each pulse of the stimulation pulse train is revealed by this method. This statistical method eliminates the effects of anesthesia, and provides an objective assessment of MEPs. Consequently this statistical method has high potential to be useful in future quantitative MEPs analysis.

  5. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.

  6. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W. . Dept. of Computer Sciences); Noordewier, M.O. . Dept. of Computer Science)

    1992-01-01

    We are primarily developing a machine learning (ML) system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information, our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, our KBANN algorithm maps inference rules about a given recognition task into a neural network. Neural network training techniques then use the training examples to refine these inference rules. We call these rules a domain theory, following the convention in the machine learning community. We have been applying this approach to several problems in DNA sequence analysis. In addition, we have been extending the capabilities of our learning system along several dimensions. We have also been investigating parallel algorithms that perform sequence alignments in the presence of frameshift errors.
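
    The core KBANN idea of compiling a symbolic rule into network weights can be sketched simply: a rule becomes a threshold unit whose weights and bias encode it, giving training an informed starting point instead of random weights. The consensus string and the "at least 4 of 6 positions must match" requirement below are illustrative, not taken from KBANN itself.

```python
import numpy as np

BASES = "ACGT"
CONSENSUS = "TATAAT"   # hypothetical consensus-sequence rule
REQUIRED = 4           # hypothetical match requirement

def one_hot(seq):
    """Encode a DNA string as a flat one-hot vector, 4 inputs per position."""
    x = np.zeros(len(seq) * 4)
    for i, b in enumerate(seq):
        x[i * 4 + BASES.index(b)] = 1.0
    return x

# Rule -> initial weights: +1 on each (position, consensus base) input.
w = one_hot(CONSENSUS)
bias = -(REQUIRED - 0.5)           # unit fires when >= REQUIRED positions match

def unit(seq):
    return float(w @ one_hot(seq) + bias > 0)

print(unit("TATAAT"), unit("TATCAT"), unit("GGGCCC"))   # 1.0 1.0 0.0
```

    In the full method, units like this are embedded in a network and the weights are then refined by backpropagation on the training examples.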

  7. Moving beyond regression techniques in cardiovascular risk prediction: applying machine learning to address analytic challenges.

    PubMed

    Goldstein, Benjamin A; Navar, Ann Marie; Carter, Rickey E

    2016-07-19

    Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for development of risk prediction models. Typically presented as black box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider trying to predict mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction for those working on risk modelling to approach the diffuse field of machine learning.

  8. Terrestrial ecological risk evaluation for triclosan in land-applied biosolids.

    PubMed

    Fuchsman, Phyllis; Lyndall, Jennifer; Bock, Michael; Lauren, Darrel; Barber, Timothy; Leigh, Katrina; Perruchon, Elyse; Capdevielle, Marie

    2010-07-01

    Triclosan is an antimicrobial compound found in many consumer products including soaps and personal care products. Most triclosan is disposed of down household drains, whereupon it is conveyed to wastewater treatment plants. Although a high percentage of triclosan biodegrades during wastewater treatment, most of the remainder is adsorbed to sludge, which may ultimately be applied to land as biosolids. We evaluated terrestrial ecological risks related to triclosan in land-applied biosolids for soil microbes, plants, soil invertebrates, mammals, and birds. Exposures are estimated using a probabilistic fugacity-based model. Triclosan concentrations in biosolids and reported biosolids application rates are compiled to support estimation of triclosan concentrations in soil. Concentrations in biota tissue are estimated using an equilibrium partitioning model for plants and worms and a steady-state model for small mammals; the resulting tissue concentrations are used to model mammalian and avian dietary exposures. Toxicity benchmarks are identified from a review of published and proprietary studies. The results indicate that adverse effects related to soil fertility (i.e., disruption of nitrogen cycling) would be expected only under "worst-case" exposures, under certain soil conditions and would likely be transient. The available data indicate that adverse effects on plants, invertebrates, birds, and mammals due to triclosan in land-applied biosolids are unlikely.

  9. Foodborne zoonoses due to meat: a quantitative approach for a comparative risk assessment applied to pig slaughtering in Europe.

    PubMed

    Fosse, Julien; Seegers, Henri; Magras, Catherine

    2008-01-01

    Foodborne zoonoses have a major health impact in industrialised countries. New European food safety regulations were issued to apply risk analysis to the food chain. The severity of foodborne zoonoses and the exposure of humans to biological hazards transmitted by food must be assessed. For meat, inspection at the slaughterhouse is historically the main means of control to protect consumers. However, the levels of detection of biological hazards during meat inspection have not been established in quantitative terms yet. Pork is the most frequently consumed meat in Europe. The aim of this study was to provide elements for quantifying levels of risk for pork consumers and lack of detection by meat inspection. Information concerning hazard identification and characterisation was obtained by the compilation and statistical analysis of data from 440 literature references. The incidence and severity of human cases due to pork consumption in Europe were assessed in order to calculate risk scores. A ratio of non-control was calculated for each biological hazard identified as currently established in Europe, i.e. the incidence of human cases divided by the prevalence of hazards on pork. Salmonella enterica, Yersinia enterocolitica and Campylobacter spp. were characterised by high incidence rates. Listeria monocytogenes, Clostridium botulinum and Mycobacterium spp. showed the highest severity scores. The three main high risk hazards involved in foodborne infections, Y. enterocolitica, S. enterica and Campylobacter spp. are characterised by high non-control ratios and cannot be detected by macroscopic examination of carcasses. New means of hazard control are needed to complement the classical macroscopic examination.
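
    The ranking logic described above is a simple ratio. Below is an illustrative computation of the "non-control ratio" (incidence of human cases divided by prevalence of the hazard on pork); the figures are invented for the sketch, not taken from the study.

```python
# Hazards with a high ratio cause many human cases relative to how often the
# hazard is present (and thus detectable) on carcasses -- i.e. poorly controlled.
def non_control_ratio(cases_per_100k, prevalence_fraction):
    return cases_per_100k / prevalence_fraction

hazards = {  # hypothetical: (human cases per 100,000, prevalence on carcasses)
    "Salmonella enterica": (15.0, 0.10),
    "Yersinia enterocolitica": (8.0, 0.05),
    "Mycobacterium spp.": (0.1, 0.02),
}
ranked = sorted(hazards, key=lambda h: non_control_ratio(*hazards[h]), reverse=True)
print(ranked)   # hazards ordered from least to most controlled
```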

  10. Extended Kramers-Moyal analysis applied to optical trapping.

    PubMed

    Honisch, Christoph; Friedrich, Rudolf; Hörner, Florian; Denz, Cornelia

    2012-08-01

    The Kramers-Moyal analysis is a well-established approach to analyze stochastic time series from complex systems. If the sampling interval of a measured time series is too low, systematic errors occur in the analysis results. These errors are labeled as finite time effects in the literature. In the present article, we present some new insights about these effects and discuss the limitations of a previously published method to estimate Kramers-Moyal coefficients at the presence of finite time effects. To increase the reliability of this method and to avoid misinterpretations, we extend it by the computation of error estimates for estimated parameters using a Monte Carlo error propagation technique. Finally, the extended method is applied to a data set of an optical trapping experiment yielding estimations of the forces acting on a Brownian particle trapped by optical tweezers. We find an increased Markov-Einstein time scale of the order of the relaxation time of the process, which can be traced back to memory effects caused by the interaction of the particle and the fluid. Above the Markov-Einstein time scale, the process can be very well described by the classical overdamped Markov model for Brownian motion.
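
    The basic estimation step behind the analysis can be sketched with a simulated example. The first Kramers-Moyal coefficient (the drift) is the conditional average D1(x) = ⟨x(t+Δt) − x(t) | x(t) = x⟩ / Δt; for the overdamped Ornstein-Uhlenbeck process used below as a stand-in for a particle in a harmonic trap, the true drift is D1(x) = −θx. All parameters are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an OU process: dx = -theta*x dt + sigma dW (Euler-Maruyama).
theta, sigma, dt, n = 1.0, 0.5, 0.01, 200_000
noise = rng.normal(0.0, np.sqrt(dt), n - 1)
xs = [0.0]
for w in noise:
    xs.append(xs[-1] - theta * xs[-1] * dt + sigma * w)
x = np.array(xs)

# Estimate D1 by conditional averaging of increments in bins of x.
dx = np.diff(x)
edges = np.linspace(-1.0, 1.0, 21)
idx = np.digitize(x[:-1], edges)
centers, d1 = [], []
for k in range(1, len(edges)):
    sel = idx == k
    if sel.sum() > 500:                      # skip sparsely populated bins
        centers.append(x[:-1][sel].mean())
        d1.append(dx[sel].mean() / dt)

slope = np.polyfit(centers, d1, 1)[0]        # should recover roughly -theta
print(slope)
```

    The finite time effects discussed in the article enter because Δt here is not infinitesimal; the estimated slope is biased toward −(1 − e^(−θΔt))/Δt rather than exactly −θ.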

  11. Advanced Risk Analysis for High-Performing Organizations

    DTIC Science & Technology

    2006-01-01

    The operational environment for many types of organizations is changing. Changes in operational environments are driving the need for advanced risk analysis techniques. Many types of risk prevalent in today's operational environments (e.g., event risks, inherited risk) are not readily identified using traditional risk analysis techniques. Mission Assurance Analysis Protocol (MAAP) is one technique that high performers can use to identify and mitigate the risks arising from operational complexity.

  12. Multitaper Spectral Analysis and Wavelet Denoising Applied to Helioseismic Data

    NASA Technical Reports Server (NTRS)

    Komm, R. W.; Gu, Y.; Hill, F.; Stark, P. B.; Fodor, I. K.

    1999-01-01

    Estimates of solar normal mode frequencies from helioseismic observations can be improved by using Multitaper Spectral Analysis (MTSA) to estimate spectra from the time series, then using wavelet denoising of the log spectra. MTSA leads to a power spectrum estimate with reduced variance and better leakage properties than the conventional periodogram. Under the assumption of stationarity and mild regularity conditions, the log multitaper spectrum has a statistical distribution that is approximately Gaussian, so wavelet denoising is asymptotically an optimal method to reduce the noise in the estimated spectra. We find that a single m-ν spectrum benefits greatly from MTSA followed by wavelet denoising, and that wavelet denoising by itself can be used to improve m-averaged spectra. We compare estimates using two different 5-taper estimates (Slepian and sine tapers) and the periodogram estimate, for GONG time series at selected angular degrees l. We compare those three spectra with and without wavelet-denoising, both visually, and in terms of the mode parameters estimated from the pre-processed spectra using the GONG peak-fitting algorithm. The two multitaper estimates give equivalent results. The number of modes fitted well by the GONG algorithm is 20% to 60% larger (depending on l and the temporal frequency) when applied to the multitaper estimates than when applied to the periodogram. The estimated mode parameters (frequency, amplitude and width) are comparable for the three power spectrum estimates, except for modes with very small mode widths (a few frequency bins), where the multitaper spectra broadened the modes compared with the periodogram. We tested the influence of the number of tapers used and found that narrow modes at low n values are broadened to the extent that they can no longer be fit if the number of tapers is too large. For helioseismic time series of this length and temporal resolution, the optimal number of tapers is less than 10.
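
    A multitaper estimate is just an average of several tapered periodograms. The pure-numpy sketch below uses sine tapers (one of the two taper families compared above) on a synthetic oscillation in noise; the signal and taper count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def sine_tapers(n, k):
    """The k lowest-order sine tapers of length n."""
    t = np.arange(1, n + 1)
    return np.array([np.sqrt(2.0 / (n + 1)) * np.sin(np.pi * j * t / (n + 1))
                     for j in range(1, k + 1)])

def multitaper_psd(x, k=5):
    tapers = sine_tapers(len(x), k)          # shape (k, n)
    # Average k nearly independent tapered periodograms to reduce variance.
    return (np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2).mean(axis=0)

n = 4096
t = np.arange(n)
x = np.sin(2 * np.pi * 0.1 * t) + rng.normal(0.0, 1.0, n)

psd = multitaper_psd(x, k=5)
peak_freq = np.argmax(psd) / n               # recovers the 0.1 cycles/sample line
print(peak_freq)
```

    The trade-off noted in the abstract is visible here: more tapers means lower variance but a wider effective bandwidth, which broadens (and can merge) narrow spectral lines.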

  13. The Evidence-Based Practice of Applied Behavior Analysis.

    PubMed

    Slocum, Timothy A; Detrich, Ronnie; Wilczynski, Susan M; Spencer, Trina D; Lewis, Teri; Wolfe, Katie

    2014-05-01

    Evidence-based practice (EBP) is a model of professional decision-making in which practitioners integrate the best available evidence with client values/context and clinical expertise in order to provide services for their clients. This framework provides behavior analysts with a structure for pervasive use of the best available evidence in the complex settings in which they work. This structure recognizes the need for clear and explicit understanding of the strength of evidence supporting intervention options, the important contextual factors including client values that contribute to decision making, and the key role of clinical expertise in the conceptualization, intervention, and evaluation of cases. Opening the discussion of EBP in this journal, Smith (The Behavior Analyst, 36, 7-33, 2013) raised several key issues related to EBP and applied behavior analysis (ABA). The purpose of this paper is to respond to Smith's arguments and extend the discussion of the relevant issues. Although we support many of Smith's (The Behavior Analyst, 36, 7-33, 2013) points, we contend that Smith's definition of EBP is significantly narrower than definitions that are used in professions with long histories of EBP and that this narrowness conflicts with the principles that drive applied behavior analytic practice. We offer a definition and framework for EBP that aligns with the foundations of ABA and is consistent with well-established definitions of EBP in medicine, psychology, and other professions. In addition to supporting the systematic use of research evidence in behavior analytic decision making, this definition can promote clear communication about treatment decisions across disciplines and with important outside institutions such as insurance companies and granting agencies.

  14. First Attempt of Applying Factor Analysis in Moving Base Gravimetry

    NASA Astrophysics Data System (ADS)

    Li, X.; Roman, D. R.

    2014-12-01

    For gravimetric observation systems on mobile platforms (land/sea/airborne), the Low Signal to Noise Ratio (SNR) issue is the main barrier to achieving an accurate, high resolution gravity signal. Normally, low-pass filters (Childers et al 1999, Forsberg et al 2000, Kwon and Jekeli 2000, Hwang et al 2006) are applied to smooth or remove the high frequency "noise" - even though some of the high frequency component is not necessarily noise. This is especially true for aerogravity surveys such as those from the Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project. These gravity survey flights have a spatial resolution of 10 km between tracks but higher resolution along track. The along track resolution is improved due to the lower flight height (6.1 km), equipment sensitivity, and improved modeling of potential errors. Additionally, these surveys suffer from a loss of signal power due to the increased flight elevation. Hence, application of a low-pass filter removes possible signal sensed in the along-track direction that might otherwise prove useful for various geophysical and geodetic applications. Some cutting-edge developments in Wavelets and Artificial Neural Networks have been successfully applied to obtain improved results (Li 2008 and 2011, Liang and Liu 2013). However, a clearer and fundamental understanding of the error characteristics will further improve the quality of the gravity estimates out of these gravimetric systems. Here, instead of using any predefined basis function or any a priori model, the idea of Factor Analysis is first employed to try to extract the underlying factors of the noise in these systems. Real data sets collected by both a land vehicle and an aircraft will be processed as examples.

  15. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  16. Risk assessment and its application to flight safety analysis

    SciTech Connect

    Keese, D.L.; Barton, W.R.

    1989-12-01

    Potentially hazardous test activities have historically been a part of Sandia National Labs' mission to design, develop, and test new weapons systems. These test activities include high speed air drops for parachute development, sled tests for component and system level studies, multiple stage rocket experiments, and artillery firings of various projectiles. Due to the nature of Sandia's test programs, the risk associated with these activities can never be totally eliminated. However, a consistent set of policies should be available to provide guidance on the level of risk that is acceptable in these areas. This report presents a general set of guidelines for addressing safety issues related to rocket flight operations at Sandia National Laboratories. Even though the majority of this report deals primarily with rocket flight safety, these same principles could be applied to other hazardous test activities. The basic concepts of risk analysis have a wide range of applications into many of Sandia's current operations. 14 refs., 1 tab.

  17. Risk Analysis of the Supply-Handling Conveyor System.

    DTIC Science & Technology

    The report documents the risk analysis that was performed on a supply-handling conveyor system. The risk analysis was done to quantify the risks involved for project development, in addition to compliance with the draft AMC regulation on risk analysis. The conveyor system is in the final phase of

  18. Applying quality criteria to exposure in asbestos epidemiology increases the estimated risk.

    PubMed

    Burdorf, Alex; Heederik, Dick

    2011-07-01

    Mesothelioma deaths due to environmental exposure to asbestos in The Netherlands led to parliamentary concern that exposure guidelines were not strict enough. The Health Council of the Netherlands was asked for advice. Its report has recently been published. The question of quality of the exposure estimates was studied more systematically than in previous asbestos meta-analyses. Five criteria of quality of exposure information were applied, and cohort studies that failed to meet these were excluded. For lung cancer, this decreased the number of cohorts included from 19 to 3 and increased the risk estimate 3- to 6-fold, with the requirements for good historical data on exposure and job history having the largest effects. It also suggested that the apparent differences in lung cancer potency between amphiboles and chrysotile may be produced by lower quality studies. A similar pattern was seen for mesothelioma. As a result, the Health Council has proposed that the occupational exposure limit be reduced from 10 000 fibres m(-3) (all types) to 250 f m(-3) (amphiboles), 1300 f m(-3) (mixed fibres), and 2000 f m(-3) (chrysotile). The process illustrates the importance of evaluating quality of exposure in epidemiology since poor quality of exposure data will lead to underestimated risk.

  19. Risk-driven security testing using risk analysis with threat modeling approach.

    PubMed

    Palanivel, Maragathavalli; Selvadurai, Kanmani

    2014-01-01

    Security testing is a process of determining risks present in the system states and protecting them from vulnerabilities. However, security testing typically does not give due importance to threat modeling and risk analysis simultaneously, which affects the confidentiality and integrity of the system. Risk analysis includes identification, evaluation and assessment of risks. Threat modeling is an approach for identifying threats associated with the system. Risk-driven security testing uses risk analysis results in test case identification, selection and assessment to prioritize and optimize the testing process. The threat modeling approach STRIDE is generally used to identify both technical and non-technical threats present in the system. Thus, a security testing mechanism based on risk analysis results using the STRIDE approach has been proposed for identifying high-risk states. Risk metrics considered for testing include risk impact, risk possibility and risk threshold. The risk threshold value is directly proportional to risk impact and risk possibility. Risk-driven security testing results in a reduced test suite, which in turn reduces test case selection time. Risk analysis optimizes the test case selection and execution process. For experimentation, the system models LMS, ATM, OBS, OSS and MTRS are considered. The performance of the proposed system is analyzed using Test Suite Reduction Rate (TSRR) and FSM coverage. TSRR varies from 13.16 to 21.43% whereas FSM coverage is achieved up to 91.49%. The results show that the proposed method combining risk analysis with threat modeling identifies states with high risks to improve the testing efficiency.
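
    The risk metrics described above reduce to a simple scoring rule: take each state's score as risk impact times risk possibility, and prioritize states whose score exceeds the threshold. The state names, scales, and values below are invented for the sketch.

```python
def risk_score(impact, possibility):
    return impact * possibility              # both on a 0-1 scale here

states = {                                   # hypothetical FSM states
    "login_unvalidated": (0.9, 0.7),         # (impact, possibility)
    "session_expired":   (0.4, 0.5),
    "report_view":       (0.2, 0.1),
}
threshold = 0.3
high_risk = [s for s, (i, p) in states.items() if risk_score(i, p) > threshold]
print(high_risk)
```

    Only the high-risk states are carried into test case selection, which is what shrinks the test suite.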

  20. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    PubMed

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  1. To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits

    PubMed Central

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads. PMID:25738742

  2. Low-thrust mission risk analysis.

    NASA Technical Reports Server (NTRS)

    Yen, C. L.; Smith, D. B.

    1973-01-01

    A computerized multi-stage failure process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust subsystem burn operation, the system failure processes, and the retargeting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to Comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that the system component failure rate is the limiting factor in attaining high mission reliability. It is also shown, however, that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.

  3. A comprehensive risk analysis of coastal zones in China

    NASA Astrophysics Data System (ADS)

    Wang, Guanghui; Liu, Yijun; Wang, Hongbing; Wang, Xueying

    2014-03-01

    Although coastal zones occupy an important position in world development, they face high risks and vulnerability to natural disasters because of their special locations and high population density. Various models have been established to estimate their crisis-response capability. However, those studies mainly focused on natural factors or conditions, which could not reflect the social vulnerability and regional disparities of coastal zones. Drawing lessons from the experience of the United Nations Environment Programme (UNEP), this paper presents a comprehensive assessment strategy based on the mechanism of the Risk Matrix Approach (RMA), which includes two aspects that are further composed of five second-class indicators. The first aspect, the probability phase, consists of indicators of economic conditions, social development, and living standards, while the second, the severity phase, comprises geographic exposure and natural disasters. After weighting all of the above indicators by applying the Analytic Hierarchy Process (AHP) and the Delphi Method, the paper uses the comprehensive assessment strategy to analyze the risk indices of 50 coastal cities in China. The analytical results are presented in ESRI ArcGIS 10.1, which generates six different risk maps covering the aspects of economy, society, life, environment, disasters, and an overall assessment of the five areas. Furthermore, the study investigates the spatial pattern of these risk maps, with detailed discussion and analysis of the different risks in coastal cities.
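The AHP weighting step the abstract mentions derives indicator weights from the principal eigenvector of a pairwise comparison matrix. A minimal sketch follows; the 3×3 comparison matrix is invented for illustration and is not taken from the study.

```python
# Sketch of AHP weight derivation via power iteration on a pairwise
# comparison matrix (reciprocal, Saaty-style 1-9 judgments). The matrix
# below is hypothetical, not the paper's actual judgments.

def ahp_weights(matrix, iters=100):
    """Approximate the principal eigenvector of a positive pairwise
    comparison matrix by power iteration, normalised to sum to 1."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w_new)
        w = [x / total for x in w_new]
    return w

# Example: "economy" judged 3x as important as "society", 5x as "living
# standards"; lower-triangle entries are the reciprocals.
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(pairwise)  # weights sum to 1, largest for "economy"
```

In a full AHP application one would also compute the consistency ratio of the judgments; this sketch covers only the weight-extraction step.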

  4. Risk analysis for renewable energy projects due to constraints arising

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.

    2016-02-01

    Starting from the European Union (EU) binding target of 20% renewable energy in final energy consumption by 2020, this article illustrates the identification of risks in implementing wind energy projects in Romania, risks with complex technical, social, and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain were identified, along with the time periods over which they may arise. Renewable energy technologies face a number of constraints that delay the scaling-up of their production processes, their transport processes, their equipment reliability, etc., so implementing these types of projects requires complex specialized teams whose coordination also involves specific risks. The research team applied an analytical risk approach to identify the major risks encountered within a wind farm project developed in isolated regions of Romania with different particularities, configured for different geographical areas (hill and mountain locations). Identification of the major risks was based on a conceptual model set up for the entire project implementation process, throughout which specific constraints of the process were identified. Integration risks were examined in an empirical study based on the HAZOP (Hazard and Operability) method. The discussion analyzes our results in the implementation context of renewable energy projects in Romania and creates a framework for assessing energy supply to any entity from renewable sources.

  5. Applying DNA computation to intractable problems in social network analysis.

    PubMed

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields such as Web 2.0 for Web applications and product developments in industries, etc. However, some definitions of SNA, such as finding a clique, N-clique, N-clan, N-club and K-plex, are NP-complete problems, which are not easily solved via traditional computer architecture. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. Their accuracy and feasible time complexities discussed in the paper will demonstrate that DNA computing can be used to facilitate the development of SNA.

  6. Applying microscopy to the analysis of nuclear structure and function.

    PubMed

    Iborra, Francisco; Cook, Peter R; Jackson, Dean A

    2003-02-01

    One of the ultimate goals of biological research is to understand mechanisms of cell function within living organisms. With this in mind, many sophisticated technologies that allow us to inspect macromolecular structure in exquisite detail have been developed. Although knowledge of structure derived from techniques such as X-ray crystallography and nuclear magnetic resonance is of vital importance, these approaches cannot reveal the remarkable complexity of molecular interactions that exists in vivo. With this in mind, this review focuses on the use of microscopy techniques to analyze cell structure and function. We describe the different basic microscopic methodologies and how the routine techniques are best applied to particular biological problems. We also emphasize the specific capabilities and uses of light and electron microscopy and highlight their individual advantages and disadvantages. For completion, we also comment on the alternative possibilities provided by a variety of advanced imaging technologies. We hope that this brief analysis of the undoubted power of microscopy techniques will be enough to stimulate a wider participation in this rapidly developing area of biological discovery.

  7. RISK ASSESSMENT AND EPIDEMIOLOGICAL INFORMATION FOR PATHOGENIC MICROORGANISMS APPLIED TO SOIL

    EPA Science Inventory

    There is increasing interest in the development of a microbial risk assessment methodology for regulatory and operational decision making. Initial interests in microbial risk assessments focused on drinking, recreational, and reclaimed water issues. More recently risk assessmen...

  8. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    NASA Astrophysics Data System (ADS)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports is available that details the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that the ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer) or introduced into user-friendly software
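The link between an FSO and in-process control measures is conventionally expressed by the ICMSF inequality H0 − ΣR + ΣI ≤ FSO, where H0 is the initial hazard level, ΣR the cumulative reductions, and ΣI the cumulative increases, all in log10 units. A minimal sketch with illustrative numbers (not from the abstract):

```python
# Hedged sketch of the standard ICMSF food-safety inequality; all values
# below (contamination level, kill step, growth, FSO) are invented.

def meets_fso(h0: float, reductions: float, increases: float, fso: float) -> bool:
    """Check H0 - sum(R) + sum(I) <= FSO, all terms in log10 cfu/g."""
    return h0 - reductions + increases <= fso

# e.g. a raw-material level of 3 log cfu/g, a 6-log cook kill, and 1 log of
# growth during storage, checked against an FSO of -2 log cfu/g:
ok = meets_fso(h0=3.0, reductions=6.0, increases=1.0, fso=-2.0)
# 3 - 6 + 1 = -2 <= -2, so the process design meets the objective
```

Each performance objective or performance criterion in the abstract's hierarchy corresponds to bounding one of these terms at a particular point in the food chain.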

  9. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    SciTech Connect

    Skandamis, Panagiotis N. Andritsos, Nikolaos Psomas, Antonios Paramythiotis, Spyridon

    2015-01-22

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports is available that details the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that the ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer), or introduced into user

  10. Integrated Reliability and Risk Analysis System (IRRAS)

    SciTech Connect

    Russell, K D; McKay, M K; Sattison, M.B. Skinner, N.L.; Wood, S T; Rasmuson, D M

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool that addresses key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February 1987. Since that time, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational database facility for managing the data, improved functionality, and improved algorithm performance.
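The cut-set quantification step that such tools perform can be sketched generically: once the minimal cut sets of a fault tree are known, the top-event probability is commonly approximated by summing the cut-set probabilities (the rare-event approximation). This is a textbook illustration, not IRRAS's actual algorithm, and the event probabilities are invented.

```python
# Generic rare-event quantification of a fault tree from its minimal cut
# sets: P(top) ~= sum over cut sets of the product of basic-event
# probabilities. Valid when all probabilities are small.

def cut_set_prob(cut_set, event_probs):
    """Product of basic-event probabilities in one minimal cut set."""
    p = 1.0
    for event in cut_set:
        p *= event_probs[event]
    return p

def top_event_prob(cut_sets, event_probs):
    """Rare-event approximation of the top-event probability."""
    return sum(cut_set_prob(cs, event_probs) for cs in cut_sets)

# Hypothetical basic events and cut sets for illustration:
probs = {"pump_fails": 1e-3, "valve_stuck": 5e-4, "power_loss": 1e-4}
cut_sets = [{"pump_fails", "valve_stuck"}, {"power_loss"}]
p_top = top_event_prob(cut_sets, probs)  # 1e-3 * 5e-4 + 1e-4
```

The single-event cut set ("power_loss") dominates here, which is the kind of insight cut-set quantification is used to surface.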

  11. Health risk analysis of atmospheric polycyclic aromatic hydrocarbons in big cities of China.

    PubMed

    Wang, Yonghua; Hu, Liangfeng; Lu, Guanghua

    2014-05-01

    A probabilistic carcinogenic risk assessment of atmospheric polycyclic aromatic hydrocarbons (PAHs) in four big cities of China (Beijing, Shanghai, Guangzhou, Xiamen) was carried out. PAH levels in these cities were collected from the published literature and converted into BaP-equivalent (BaPeq) concentrations. The health risk assessment models recommended by the US EPA were applied to quantitatively characterize the health risk values of PAHs. Monte Carlo simulation and sensitivity analysis were applied to quantify the uncertainties of the risk assessment. The results showed that the BaPeq concentrations in all four cities were higher than the newest limit value (1 ng/m³) for China. The health risk assessment indicated that atmospheric PAHs in Guangzhou and Xiamen posed little or no carcinogenic risk to local residents. However, the PAHs in Beijing and Shanghai posed a potential carcinogenic risk for adult and lifetime exposure. Notwithstanding the uncertainties, this study provides primary information on the carcinogenic risk of atmospheric PAHs in the studied cities of China.
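The Monte Carlo risk characterization described above can be sketched in the style of a US EPA inhalation model: sample uncertain inputs, compute a lifetime dose, multiply by a cancer slope factor, and read percentiles off the simulated risk distribution. All parameter values and distributions below are invented for illustration and are not the study's inputs.

```python
import random

# Hedged Monte Carlo sketch of an inhalation cancer-risk calculation.
# Dose (mg/kg/day) = C * IR * EF * ED / (BW * AT); risk = dose * CSF.
# Every number here is hypothetical.

def ilcr_sample(rng: random.Random) -> float:
    c = rng.lognormvariate(0.0, 0.5)     # BaPeq air concentration, ng/m^3
    ir, ef, ed = 20.0, 350.0, 70.0       # inhalation m^3/day, days/yr, years
    csf = 3.14                           # slope factor, (mg/kg/day)^-1 (assumed)
    bw = rng.gauss(60.0, 5.0)            # body weight, kg
    at = 70.0 * 365.0                    # averaging time, days
    dose = c * 1e-6 * ir * ef * ed / (bw * at)  # ng -> mg conversion
    return dose * csf

rng = random.Random(42)
risks = sorted(ilcr_sample(rng) for _ in range(10_000))
p95 = risks[int(0.95 * len(risks))]      # 95th-percentile incremental risk
```

In practice the 95th-percentile risk is compared against a benchmark such as 1e-6 or 1e-4 to judge acceptability, and a sensitivity analysis identifies which sampled input drives the spread.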

  12. Fire Risk Analysis for Armenian NPP Confinement

    SciTech Connect

    Poghosyan, Shahen; Malkhasyan, Albert; Bznuni, Surik; Amirjanyan, Armen

    2006-07-01

    A major fire at the Armenian NPP (ANPP) in October 1982 showed that fire-induced initiating events (IE) can make a dominant contribution to the overall risk of core damage. A Probabilistic Safety Assessment study of fire-induced initiating events for ANPP was initiated in 2002. The analysis covered compartment fires which could result in failure of components necessary for reactor cold shutdown. The analysis shows that the main fire risk at ANPP comes from fire in cable tunnels 61-64, while fires in confinement compartments do not contribute significantly to the overall risk of core damage. The exception is the so-called 'confinement valves compartment' (room no. A-013/2), fire in which (more than 7.5% of CDF) could result in a loss-of-coolant accident with unavailability of the primary makeup system, which leads directly to core damage. A detailed analysis of this problem, which is common to typical WWER-440/230 reactors with non-hermetic MCPs, and recommendations for its solution are presented in this paper. (authors)

  13. Acquaintance Rape: Applying Crime Scene Analysis to the Prediction of Sexual Recidivism.

    PubMed

    Lehmann, Robert J B; Goodwill, Alasdair M; Hanson, R Karl; Dahle, Klaus-Peter

    2016-10-01

    The aim of the current study was to enhance the assessment and predictive accuracy of risk assessments for sexual offenders by utilizing detailed crime scene analysis (CSA). CSA was conducted on a sample of 247 male acquaintance rapists from Berlin (Germany) using a nonmetric, multidimensional scaling (MDS) Behavioral Thematic Analysis (BTA) approach. The age of the offenders at the time of the index offense ranged from 14 to 64 years (M = 32.3; SD = 11.4). The BTA procedure revealed three behavioral themes of hostility, criminality, and pseudo-intimacy, consistent with previous CSA research on stranger rape. The construct validity of the three themes was demonstrated through correlational analyses with known sexual offending measures and criminal histories. The themes of hostility and pseudo-intimacy were significant predictors of sexual recidivism. In addition, the pseudo-intimacy theme led to a significant increase in the incremental validity of the Static-99 actuarial risk assessment instrument for the prediction of sexual recidivism. The results indicate the potential utility and validity of crime scene behaviors in the applied risk assessment of sexual offenders.

  14. Global Human Settlement Analysis for Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Pesaresi, M.; Ehrlich, D.; Ferri, S.; Florczyk, A.; Freire, S.; Haag, F.; Halkia, M.; Julea, A. M.; Kemper, T.; Soille, P.

    2015-04-01

    The Global Human Settlement Layer (GHSL) is supported by the European Commission Joint Research Center (JRC) in the frame of its institutional research activities. The scope of the GHSL is to develop, test, and apply the technologies and analysis methods integrated in the JRC Global Human Settlement analysis platform for applications in support of global disaster risk reduction (DRR) initiatives and regional analysis in the frame of the European Cohesion policy. The GHSL analysis platform uses geo-spatial data, primarily remotely sensed imagery and population data. The GHSL also cooperates with the Group on Earth Observation on SB-04 Global Urban Observation and Information, and with various international partners, the World Bank, and United Nations agencies. Some preliminary results integrating global human settlement information extracted from Landsat data records of the last 40 years with population data are presented.

  15. Meta-analysis of osteoporosis: fracture risks, medication and treatment.

    PubMed

    Liu, W; Yang, L-H; Kong, X-C; An, L-K; Wang, R

    2015-08-01

    Osteoporosis is a brittle-bone disease that causes fractures, mostly in older men and women. Meta-analysis is the statistical method applied in the framework for assessing results obtained from research studies conducted over several years. A meta-analysis of osteoporotic fracture risk with medication non-adherence has been described by many researchers, assessing bone fracture risk among patients non-adherent versus adherent to osteoporosis therapy. Osteoporosis therapy reduces the risk of fracture in clinical trials, but real-world adherence to therapy is suboptimal and can reduce the effectiveness of the intervention. Medline, Embase, and CINAHL were searched for these observational studies from 1998 to 2009, and up to 2015. The results of meta-analyses of osteoporosis research on fractures in postmenopausal women and men are presented. The use of bisphosphonate therapy for osteoporosis is described alongside other drugs. The authors, design, studies (women %), years (data), follow-up (wks), fracture types, and compliance or persistence results from 2004 to 2009 are shown in a brief table. Meta-analysis studies from other researchers on osteoporosis and fractures, medications, and treatments have been reviewed.

  16. Metabolic and Dynamic Profiling for Risk Assessment of Fluopyram, a Typical Phenylamide Fungicide Widely Applied in Vegetable Ecosystem

    PubMed Central

    Wei, Peng; Liu, Yanan; Li, Wenzhuo; Qian, Yuan; Nie, Yanxia; Kim, Dongyeop; Wang, Mengcen

    2016-01-01

    Fluopyram, a typical phenylamide fungicide, is widely applied to protect fruit vegetables from yield loss caused by fungal pathogens. Although highly linked to ecological and dietary risks, its residual and metabolic profiles in the fruit-vegetable ecosystem have remained obscure. Here, an approach using modified QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) extraction combined with GC-MS/MS analysis was developed to investigate the fate of fluopyram in typical fruit vegetables, including tomato, cucumber and pepper, under the greenhouse environment. Fluopyram dissipated in accordance with the first-order rate dynamics equation, with a maximum half-life of 5.7 d. Cleavage of fluopyram into 2-trifluoromethyl benzamide and subsequent formation of 3-chloro-5-(trifluoromethyl) pyridine-2-acetic acid and 3-chloro-5-(trifluoromethyl) picolinic acid was elucidated to be its ubiquitous metabolic pathway. Moreover, the residue of fluopyram at a pre-harvest interval (PHI) of 7–21 d was between 0.0108 and 0.1603 mg/kg, and the Hazard Quotients (HQs) were calculated to be less than 1, indicating temporary safety of consumption of the fruit vegetables bearing fluopyram residues, irrespective of the uncertain toxicity of the metabolites. Taken together, our findings reveal the residual profile of fluopyram in a typical agricultural ecosystem and advance insight into the ecological risk posed by this fungicide and its metabolites. PMID:27654708
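The first-order dissipation kinetics reported above imply C(t) = C0·e^(−kt) with rate constant k = ln 2 / t_half. A small sketch using the abstract's 5.7-day maximum half-life; the initial residue value is invented for illustration.

```python
import math

# First-order residue decay sketch. The 5.7-day half-life comes from the
# abstract; the 0.5 mg/kg initial residue is a hypothetical example.

def residue(c0: float, t_half: float, t: float) -> float:
    """Residue after time t (days) under first-order decay:
    C(t) = C0 * exp(-k t), with k = ln(2) / t_half."""
    k = math.log(2) / t_half
    return c0 * math.exp(-k * t)

# Residue remaining after a 7-day pre-harvest interval:
c7 = residue(c0=0.5, t_half=5.7, t=7.0)
```

A Hazard Quotient would then be formed by dividing the estimated dietary exposure derived from such a residue by the compound's reference dose; HQ < 1 is the acceptability criterion the abstract applies.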

  17. Metabolic and Dynamic Profiling for Risk Assessment of Fluopyram, a Typical Phenylamide Fungicide Widely Applied in Vegetable Ecosystem

    NASA Astrophysics Data System (ADS)

    Wei, Peng; Liu, Yanan; Li, Wenzhuo; Qian, Yuan; Nie, Yanxia; Kim, Dongyeop; Wang, Mengcen

    2016-09-01

    Fluopyram, a typical phenylamide fungicide, is widely applied to protect fruit vegetables from yield loss caused by fungal pathogens. Although highly linked to ecological and dietary risks, its residual and metabolic profiles in the fruit-vegetable ecosystem have remained obscure. Here, an approach using modified QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) extraction combined with GC-MS/MS analysis was developed to investigate the fate of fluopyram in typical fruit vegetables, including tomato, cucumber and pepper, under the greenhouse environment. Fluopyram dissipated in accordance with the first-order rate dynamics equation, with a maximum half-life of 5.7 d. Cleavage of fluopyram into 2-trifluoromethyl benzamide and subsequent formation of 3-chloro-5-(trifluoromethyl) pyridine-2-acetic acid and 3-chloro-5-(trifluoromethyl) picolinic acid was elucidated to be its ubiquitous metabolic pathway. Moreover, the residue of fluopyram at a pre-harvest interval (PHI) of 7–21 d was between 0.0108 and 0.1603 mg/kg, and the Hazard Quotients (HQs) were calculated to be less than 1, indicating temporary safety of consumption of the fruit vegetables bearing fluopyram residues, irrespective of the uncertain toxicity of the metabolites. Taken together, our findings reveal the residual profile of fluopyram in a typical agricultural ecosystem and advance insight into the ecological risk posed by this fungicide and its metabolites.

  18. Factor Analysis Applied the VFY-218 RCS Data

    NASA Technical Reports Server (NTRS)

    Woo, Alex; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    Presents a statistical factor analysis of computer simulations and measurement data for the VFY-218 configuration. Factor analysis attempts to quantify the statistical grouping of measurements and simulations.

  19. Improving causal inferences in risk analysis.

    PubMed

    Cox, Louis Anthony Tony

    2013-10-01

    Recent headlines and scientific articles projecting significant human health benefits from changes in exposures too often depend on unvalidated subjective expert judgments and modeling assumptions, especially about the causal interpretation of statistical associations. Some of these assessments are demonstrably biased toward false positives and inflated effects estimates. More objective, data-driven methods of causal analysis are available to risk analysts. These can help to reduce bias and increase the credibility and realism of health effects risk assessments and causal claims. For example, quasi-experimental designs and analysis allow alternative (noncausal) explanations for associations to be tested, and refuted if appropriate. Panel data studies examine empirical relations between changes in hypothesized causes and effects. Intervention and change-point analyses identify effects (e.g., significant changes in health effects time series) and estimate their sizes. Granger causality tests, conditional independence tests, and counterfactual causality models test whether a hypothesized cause helps to predict its presumed effects, and quantify exposure-specific contributions to response rates in differently exposed groups, even in the presence of confounders. Causal graph models let causal mechanistic hypotheses be tested and refined using biomarker data. These methods can potentially revolutionize the study of exposure-induced health effects, helping to overcome pervasive false-positive biases and move the health risk assessment scientific community toward more accurate assessments of the impacts of exposures and interventions on public health.

  20. Approach to uncertainty in risk analysis

    SciTech Connect

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  1. Nutrient Status and Contamination Risks from Digested Pig Slurry Applied on a Vegetable Crops Field

    PubMed Central

    Zhang, Shaohui; Hua, Yumei; Deng, Liangwei

    2016-01-01

    The effects of applied digested pig slurry on a vegetable crops field were studied. The study included a 3-year investigation on nutrient characteristics, heavy metals contamination and hygienic risks of a vegetable crops field in Wuhan, China. The results showed that, after anaerobic digestion, abundant N, P and K remained in the digested pig slurry while fecal coliforms, ascaris eggs, schistosoma eggs and hookworm eggs were highly reduced. High Cr, Zn and Cu contents in the digested pig slurry were found in spring. Digested pig slurry application to the vegetable crops field led to improved soil fertility. Plant-available P in the fertilized soils increased due to considerable increase in total P content and decrease in low-availability P fraction. The As content in the fertilized soils increased slightly but significantly (p = 0.003) compared with control. The Hg, Zn, Cr, Cd, Pb, and Cu contents in the fertilized soils did not exceed the maximum permissible contents for vegetable crops soils in China. However, high Zn accumulation should be of concern due to repeated applications of digested pig slurry. No fecal coliforms, ascaris eggs, schistosoma eggs or hookworm eggs were detected in the fertilized soils. PMID:27058548

  2. Phase plane analysis: applying chaos theory in health care.

    PubMed

    Priesmeyer, H R; Sharp, L F

    1995-01-01

    This article applies the new science of nonlinearity to administrative issues and accounts receivable management in health care, and it provides a new perspective on common operating and quality control measures.

  3. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in the case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for evaluating the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: What is the probability of radionuclide atmospheric transport and impact on different neighbouring regions and countries in case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems, with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups, taking into account social-geophysical factors and probabilities, and using demographic databases based on GIS analysis.

  4. Multi-Criteria Analysis for Biomass Utilization Applying Data Envelopment Analysis

    NASA Astrophysics Data System (ADS)

    Morimoto, Hidetsugu; Hoshino, Satoshi; Kuki, Yasuaki

    This paper aims to consider material recycling, prevention of global warming, and economic efficiency in 195 existing and planned Biomass Towns, applying DEA (Data Envelopment Analysis), which can evaluate the operational efficiency of entities such as private companies or projects. The results show that although the Biomass Towns can recycle material efficiently, prevention of global warming and business profitability received little attention in the Biomass Town designs. Moreover, from the point of view of operational efficiency, we suggest an improvement of the Biomass Town scale for greater efficiency by applying DEA. We found that applying DEA was able to capture more improvements and indicators compared with cost-benefit analysis and cost-effectiveness analysis.
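In the special case of one input and one output, DEA's CCR efficiency reduces to each unit's output/input ratio normalized by the best observed ratio. A minimal sketch with invented figures follows; real DEA with multiple inputs and outputs requires solving a linear program per decision-making unit.

```python
# One-input / one-output DEA sketch: efficiency of each decision-making
# unit (DMU) is its output/input ratio divided by the best ratio in the
# sample, so the frontier unit scores exactly 1.0. Town figures invented.

def dea_efficiency_1x1(units):
    """units: {name: (input, output)} -> {name: efficiency in (0, 1]}"""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# Hypothetical Biomass Towns: (operating cost, recycled biomass output)
towns = {"A": (100.0, 80.0), "B": (120.0, 60.0), "C": (90.0, 81.0)}
eff = dea_efficiency_1x1(towns)  # "C" has the best ratio, so eff["C"] == 1.0
```

Scores below 1.0 indicate how far a town lies from the efficiency frontier, which is the kind of scale-improvement indicator the abstract says DEA surfaces beyond cost-benefit analysis.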

  5. Anticipating risk for human subjects participating in clinical research: application of Failure Mode and Effects Analysis.

    PubMed

    Cody, Robert J

    2006-03-01

    Failure Mode and Effects Analysis (FMEA) is a method applied in various industries to anticipate and mitigate risk. This methodology can be more systematically applied to the protection of human subjects in research. The purpose of FMEA is simple: prevent problems before they occur. By applying FMEA process analysis to the elements of a specific research protocol, the failure severity, occurrence, and detection rates can be estimated for calculation of a "risk priority number" (RPN). Methods can then be identified to reduce the RPN to levels where the risk/benefit ratio favors human subject benefit, to a greater magnitude than existed in the pre-analysis risk profile. At the very least, the approach provides a checklist of issues that can be individualized for specific research protocols or human subject populations.
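The RPN arithmetic described above is simply severity × occurrence × detection, each conventionally scored on a 1-10 scale, with mitigation aimed at lowering the product. A minimal sketch with invented scores for a hypothetical protocol element:

```python
# FMEA risk priority number sketch: RPN = severity * occurrence * detection.
# The scores below are hypothetical, not from the article.

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk priority number from three 1-10 FMEA scores."""
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores are conventionally rated 1-10")
    return severity * occurrence * detection

# e.g. an ambiguous consent procedure, before and after mitigation
# (clearer wording lowers occurrence; an added review step lowers detection):
before = rpn(8, 6, 7)
after = rpn(8, 2, 3)
```

Note that severity typically cannot be reduced by process changes; mitigation works on occurrence and detection, which is why the "after" product drops even though severity is unchanged.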

  6. Application of Risk Analysis: Response from a Systems Division,

    DTIC Science & Technology

    A review of theoretical literature reveals that most technical aspects of risk analysis have become a reasonably well-defined process with many... risk analysis in order to enhance its application. Also needed are better tools to enhance use of both subjective judgment and group decision processes...hope that it would lead to increased application of risk analysis in the acquisition process.

  7. Examples of Communicating Uncertainty Applied to Earthquake Hazard and Risk Products

    NASA Astrophysics Data System (ADS)

    Wald, D. J.

    2013-12-01

    When is communicating scientific modeling uncertainty effective? One viewpoint is that the answer depends on whether one is communicating hazard or risk: hazards have quantifiable uncertainties (which, granted, are often ignored), yet risk uncertainties compound uncertainties inherent in the hazard with those of the risk calculations, and are thus often larger. Larger, yet more meaningful: since risk entails societal impact of some form, consumers of such information tend to have a better grasp of the potential uncertainty ranges for loss information than they do for less-tangible hazard values (like magnitude, peak acceleration, or stream flow). I present two examples that compare and contrast communicating uncertainty for earthquake hazard and risk products. The first example is the U.S. Geological Survey's (USGS) ShakeMap system, which portrays the uncertain, best estimate of the distribution and intensity of shaking over the potentially impacted region. The shaking intensity is well constrained at seismograph locations yet is uncertain elsewhere, so shaking uncertainties are quantified and presented spatially. However, with ShakeMap, it seems that users tend to believe what they see is accurate in part because (1) considering the shaking uncertainty complicates the picture, and (2) it would not necessarily alter their decision-making. In contrast, when it comes to making earthquake-response decisions based on uncertain loss estimates, actions tend to be made only after analysis of the confidence in (or source of) such estimates. Uncertain ranges of loss estimates instill tangible images for users, and when such uncertainties become large, intuitive reality-check alarms go off, for example, when the range of losses presented become too wide to be useful. The USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, which in near-real time alerts users to the likelihood of ranges of potential fatalities and economic impact, is aimed at

  8. Modeling Opponents in Adversarial Risk Analysis.

    PubMed

    Rios Insua, David; Banks, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decisionmakers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms, for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decision by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents.
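    The Bayesian model averaging step described above can be sketched as a posterior update over candidate rationality models after observing one opponent decision; the model names, priors, and likelihood values below are illustrative assumptions:

    ```python
    # Sketch of learning an opponent's rationality paradigm via Bayes' rule.
    # Priors and likelihoods are illustrative, not taken from the paper.
    def update_model_posteriors(priors, likelihoods):
        """priors: {model: P(model)}; likelihoods: {model: P(action | model)}.
        Returns normalised posterior probabilities for each model."""
        unnorm = {m: priors[m] * likelihoods[m] for m in priors}
        total = sum(unnorm.values())
        return {m: v / total for m, v in unnorm.items()}

    priors = {"random": 1 / 3, "nash": 1 / 3, "level_k": 1 / 3}
    # Probability each model assigns to the action the opponent actually took:
    likelihoods = {"random": 0.25, "nash": 0.70, "level_k": 0.50}
    posteriors = update_model_posteriors(priors, likelihoods)
    ```

    Repeating the update over a sequence of observed decisions concentrates probability on the rationality model that best predicts the opponent, which is the validity measure the abstract describes.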

  9. Multimodel Bayesian analysis of data-worth applied to unsaturated fractured tuffs

    NASA Astrophysics Data System (ADS)

    Lu, Dan; Ye, Ming; Neuman, Shlomo P.; Xue, Liang

    2012-01-01

    To manage water resource and environmental systems effectively requires suitable data. The worth of collecting such data depends on their potential benefit and cost, including the expected cost (risk) of failing to take an appropriate decision. Evaluating this risk calls for a probabilistic approach to data-worth assessment. Recently we [39] developed a multimodel approach to optimum value-of-information or data-worth analysis based on model averaging within a maximum likelihood Bayesian framework. Adopting a two-dimensional synthetic example, we implemented our approach using Monte Carlo (MC) simulations with and without lead order approximations, finding that the former approach was almost equally accurate but computationally more efficient. Here we apply our methodology to pneumatic permeability data from vertical and inclined boreholes drilled into unsaturated fractured tuff near Superior, Arizona. In an attempt to improve computational efficiency, we introduce three new approximations that require less computational effort and compare results with those obtained by the original Monte Carlo method. The first approximation disregards uncertainty in model parameter estimates, the second does so for estimates of potential new data, and the third disregards both uncertainties. We find that only the first approximation yields reliable quantitative assessments of reductions in predictive uncertainty brought about by the collection of new data. We conclude that, whereas parameter uncertainty may sometimes be disregarded for purposes of analyzing data worth, the same does not generally apply to uncertainty in estimates of potential new data.

  10. Risk Analysis Related to Quality Management Principles

    NASA Astrophysics Data System (ADS)

    Vykydal, David; Halfarová, Petra; Nenadál, Jaroslav; Plura, Jiří; Hekelová, Edita

    2012-12-01

    Efficient and effective implementation of quality management principles demands a responsible approach from top managers. A study of the current state of affairs in Czech organizations reveals many shortcomings in this field that can translate into various managerial risks. The article identifies and analyses some of them and gives brief guidance for appropriate treatment. The text reflects the authors' experience as well as knowledge obtained from systematic analysis of industrial companies' environments.

  11. Applying the Land Use Portfolio Model with Hazus to analyse risk from natural hazard events

    USGS Publications Warehouse

    Dinitz, Laura B.; Taketa, Richard A.

    2013-01-01

    This paper describes and demonstrates the integration of two geospatial decision-support systems for natural-hazard risk assessment and management. Hazus is a risk-assessment tool developed by the Federal Emergency Management Agency to identify risks and estimate the severity of risk from natural hazards. The Land Use Portfolio Model (LUPM) is a risk-management tool developed by the U.S. Geological Survey to evaluate plans or actions intended to reduce risk from natural hazards. We analysed three mitigation policies for one earthquake scenario in the San Francisco Bay area to demonstrate the added value of using Hazus and the LUPM together. The demonstration showed that Hazus loss estimates can be input to the LUPM to obtain estimates of losses avoided through mitigation, rates of return on mitigation investment, and measures of uncertainty. Together, they offer a more comprehensive approach to help with decisions for reducing risk from natural hazards.

  12. Applied Missing Data Analysis. Methodology in the Social Sciences Series

    ERIC Educational Resources Information Center

    Enders, Craig K.

    2010-01-01

    Walking readers step by step through complex concepts, this book translates missing data techniques into something that applied researchers and graduate students can understand and utilize in their own research. Enders explains the rationale and procedural details for maximum likelihood estimation, Bayesian estimation, multiple imputation, and…

  13. Building a Better Model: A Comprehensive Breast Cancer Risk Model Incorporating Breast Density to Stratify Risk and Apply Resources

    DTIC Science & Technology

    2014-10-01

    assessment model that includes automated measurement of breast density. Scope: Assemble a cohort of women with known breast cancer risk factors and...digital mammogram files for women diagnosed with breast cancer using existing data sources and match them to controls (Harvey/Knaus). Validate and...density will translate to changes in breast cancer risk. Therefore, noise in measurement should be minimal. Thirty women were recruited under this

  14. Building a Better Model: A Comprehensive Breast Cancer Risk Model Incorporating Breast Density to Stratify Risk and Apply Resources

    DTIC Science & Technology

    2012-10-01

    methods (CumulusV, Volpara), developed an automated area based... Subject terms: breast cancer; risk model; mammography; breast density... recommendations based on an individual's risk beginning with personalized mammography screening decisions. This will be done by increasing the ability... mammography machine vendor. Once the model is complete, tested nationally, and proven accurate, it will be available for widespread use within five to six

  15. Applying the skin sensitisation adverse outcome pathway (AOP) to quantitative risk assessment.

    PubMed

    Maxwell, Gavin; MacKay, Cameron; Cubberley, Richard; Davies, Michael; Gellatly, Nichola; Glavin, Stephen; Gouin, Todd; Jacquoilleot, Sandrine; Moore, Craig; Pendlington, Ruth; Saib, Ouarda; Sheffield, David; Stark, Richard; Summerfield, Vicki

    2014-02-01

    As documented in the recent OECD report 'the adverse outcome pathway for skin sensitisation initiated by covalent binding to proteins' (OECD, 2012), the chemical and biological events driving the induction of human skin sensitisation have been investigated for many years and are now well understood. Several non-animal test methods have been developed to predict sensitiser potential by measuring the impact of chemical sensitisers on these key events (Adler et al., 2011; Maxwell et al., 2011); however our ability to use these non-animal datasets for risk assessment decision-making (i.e. to establish a safe level of human exposure for a sensitising chemical) remains limited and a more mechanistic approach to data integration is required to address this challenge. Informed by our previous efforts to model the induction of skin sensitisation (Maxwell and MacKay, 2008) we are now developing two mathematical models ('total haptenated protein' model and 'CD8(+) T cell response' model) that will be linked to provide predictions of the human CD8(+) T cell response for a defined skin exposure to a sensitising chemical. Mathematical model development is underpinned by focussed clinical or human-relevant research activities designed to inform/challenge model predictions whilst also increasing our fundamental understanding of human skin sensitisation. With this approach, we aim to quantify the relationship between the dose of sensitiser applied to the skin and the extent of the hapten-specific T cell response that would result. Furthermore, by benchmarking our mathematical model predictions against clinical datasets (e.g. human diagnostic patch test data), instead of animal test data, we propose that this approach could represent a new paradigm for mechanistic toxicology.

  16. INDICATORS OF RISK: AN ANALYSIS APPROACH FOR IMPROVED RIVER MANAGEMENT

    EPA Science Inventory

    A risk index is an approach to measuring the level of risk to the plants and/or animals (biota) in a certain area using water and habitat quality information. A new technique for developing risk indices was applied to data collected from Mid-Atlantic streams of the U.S. during 1...

  17. How Has Applied Behavior Analysis and Behavior Therapy Changed?: An Historical Analysis of Journals

    ERIC Educational Resources Information Center

    O'Donohue, William; Fryling, Mitch

    2007-01-01

    Applied behavior analysis and behavior therapy are now nearly a half century old. It is interesting to ask if and how these disciplines have changed over time, particularly regarding some of their key internal controversies (e.g., role of cognitions). We examined the first five years and the 2000-2004 five year period of the "Journal of Applied…

  18. Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation

    ERIC Educational Resources Information Center

    Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom

    2014-01-01

    Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…

  19. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    PubMed

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009.
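    The nonparametric Kaplan-Meier estimator mentioned above can be sketched in a few lines. The fault data here are hypothetical, not Autosub3's record, and tied observations are handled naively:

    ```python
    # Minimal Kaplan-Meier survival estimator (sketch; hypothetical data).
    def kaplan_meier(times, events):
        """times: mission distance at each fault/censoring point;
        events: 1 = vehicle lost, 0 = censored (mission survived)."""
        order = sorted(range(len(times)), key=lambda i: times[i])
        at_risk = len(times)
        survival = 1.0
        curve = []  # (distance, estimated survival probability)
        for i in order:
            if events[i] == 1:  # a loss steps the survival estimate down
                survival *= (at_risk - 1) / at_risk
                curve.append((times[i], survival))
            at_risk -= 1  # censored or not, the unit leaves the risk set
        return curve

    curve = kaplan_meier([5, 10, 10, 20, 30], [1, 0, 1, 1, 0])
    ```

    Censored missions still shrink the risk set, so later losses carry more weight; this is what lets the estimator use missions that ended without a loss.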

  20. Improving Patient Prostate Cancer Risk Assessment: Moving From Static, Globally-Applied to Dynamic, Practice-Specific Cancer Risk Calculators

    PubMed Central

    Strobl, Andreas N.; Vickers, Andrew J.; Van Calster, Ben; Steyerberg, Ewout; Leach, Robin J.; Thompson, Ian M.; Ankerst, Donna P.

    2015-01-01

    Clinical risk calculators are now widely available but have generally been implemented in a static and one-size-fits-all fashion. The objective of this study was to challenge these notions and show via a case study concerning risk-based screening for prostate cancer how calculators can be dynamically and locally tailored to improve on-site patient accuracy. Yearly data from five international prostate biopsy cohorts (3 in the US, 1 in Austria, 1 in England) were used to compare 6 methods for annual risk prediction: static use of the online US-developed Prostate Cancer Prevention Trial Risk Calculator (PCPTRC); recalibration of the PCPTRC; revision of the PCPTRC; building a new model each year using logistic regression, Bayesian prior-to-posterior updating, or random forests. All methods performed similarly with respect to discrimination, except for random forests, which were worse. All methods except for random forests greatly improved calibration over the static PCPTRC in all cohorts except for Austria, where the PCPTRC had the best calibration followed closely by recalibration. The case study shows that a simple annual recalibration of a general online risk tool for prostate cancer can improve its accuracy with respect to the local patient practice at hand. PMID:25989018
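    One simple form of the annual recalibration compared above is an intercept-only update: keep the published model's linear predictor and shift it so predicted risks match the local event rate. A sketch under that assumption (not the study's exact method; the cohort data are invented):

    ```python
    import math

    # Intercept-only recalibration sketch: find the shift delta for the
    # published model's linear predictor that matches the local event rate,
    # via Newton's method on the logistic log-likelihood.
    def recalibrate_intercept(linear_predictors, outcomes, iters=50):
        delta = 0.0
        for _ in range(iters):
            p = [1 / (1 + math.exp(-(lp + delta))) for lp in linear_predictors]
            grad = sum(y - pi for y, pi in zip(outcomes, p))   # score
            hess = sum(pi * (1 - pi) for pi in p)              # information
            delta += grad / hess
        return delta

    # Toy local cohort: the published model predicted 50% risk for everyone
    # (linear predictor 0), but locally 4 of 5 biopsies were positive.
    delta = recalibrate_intercept([0.0] * 5, [1, 1, 1, 1, 0])
    ```

    Fuller revision methods additionally rescale the slope or refit coefficients; the abstract's finding is that even this kind of light-touch recalibration recovers most of the calibration lost by using a static tool.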

  1. Audio spectrum analysis of umbilical artery Doppler ultrasound signals applied to a clinical material.

    PubMed

    Thuring, Ann; Brännström, K Jonas; Jansson, Tomas; Maršál, Karel

    2014-12-01

    Analysis of umbilical artery flow velocity waveforms characterized by the pulsatility index (PI) is used to evaluate fetoplacental circulation in high-risk pregnancies. However, an experienced sonographer may be able to further differentiate between various timbres of the Doppler audio signals. Recently, we developed a method for objective audio signal characterization; the method has been tested in an animal model. In the present pilot study, the method was applied to human pregnancies for the first time. Doppler umbilical artery velocimetry was performed in 13 preterm fetuses before and after two doses of 12 mg betamethasone. The auditory measure, defined by the frequency band where the spectral energy had dropped 15 dB from its maximum level (MAXpeak-15 dB), increased two days after betamethasone administration (p = 0.001), in parallel with a less pronounced decrease in PI (p = 0.04). The new auditory parameter MAXpeak-15 dB reflected the changes more sensitively than the PI did.

  2. Neutron-activation analysis applied to copper ores and artifacts

    NASA Technical Reports Server (NTRS)

    Linder, N. F.

    1970-01-01

    Neutron activation analysis is used for quantitative identification of trace metals in copper. Establishing a unique fingerprint of impurities in Michigan copper would enable identification of artifacts made from this copper.

  3. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control the use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on Fuzzy Mathematics and Grey Relational Theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method is described for a particular medical device (a C-arm X-ray machine).

  4. Analysis and classification of the tools for assessing the risks associated with industrial machines.

    PubMed

    Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro

    2007-01-01

    To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines or with other sectors such as the military, and the nuclear and aeronautics industries, etc., were collected. These documents were in the format of published books or papers, standards, technical guides and company procedures collected throughout industry. From the collected documents, 112 documents were selected for analysis; 108 methods applied or potentially applicable for assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of the analysis of the methods and tools.

  5. Big Data Usage Patterns in the Health Care Domain: A Use Case Driven Approach Applied to the Assessment of Vaccination Benefits and Risks

    PubMed Central

    Liyanage, H.; Liaw, S-T.; Kuziemsky, C.; Mold, F.; Krause, P.; Fleming, D.; Jones, S.

    2014-01-01

    Summary. Background: Generally the benefits and risks of vaccines can be determined from studies carried out as part of regulatory compliance, followed by surveillance of routine data; however, there are some rarer and more long-term events that require new methods. Big data generated by increasingly affordable personalised computing and by pervasive computing devices is growing rapidly, and low-cost, high-volume cloud computing makes processing these data inexpensive. Objective: To describe how big data and related analytical methods might be applied to assess the benefits and risks of vaccines. Method: We reviewed the literature on the use of big data to improve health, applied to generic vaccine use cases that illustrate the benefits and risks of vaccination. We defined a use case as the interaction between a user and an information system to achieve a goal. We used flu vaccination and pre-school childhood immunisation as exemplars. Results: We reviewed three big data use cases relevant to assessing vaccine benefits and risks: (i) big data processing using crowd-sourcing, distributed big data processing, and predictive analytics; (ii) data integration from heterogeneous big data sources, e.g. the increasing range of devices in the "internet of things"; and (iii) real-time monitoring, for the direct monitoring of epidemics as well as vaccine effects via social media and other data sources. Conclusions: Big data raises new ethical dilemmas, though its analysis methods can bring complementary real-time capabilities for monitoring epidemics and assessing vaccine benefit-risk balance. PMID:25123718

  6. PIXE measurement applied to trace elemental analysis of human tissues

    NASA Astrophysics Data System (ADS)

    Tanaka, M.; Matsugi, E.; Miyasaki, K.; Yamagata, T.; Inoue, M.; Ogata, H.; Shimoura, S.

    1987-03-01

    PIXE measurement was applied for trace elemental analyses of 40 autoptic human kidneys. To investigate the reproducibility of the PIXE data, 9 targets obtained from one human liver were examined. The targets were prepared by wet-digestion using nitric and sulfuric acid. Yttrium was used as an internal standard. The extracted elemental concentrations for K, Fe, Cu, Zn, and Cd were in reasonable agreement with those obtained by atomic absorption spectrometry (AAS) and flame photometry (FP). Various correlations among the elements K, Ca, Cr, Mn, Fe, Ni, Cu, Zn, Rb, and Cd were examined individually for the renal cortex and renal medulla.

  7. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  8. Applying a Generic Juvenile Risk Assessment Instrument to a Local Context: Some Practical and Theoretical Lessons

    ERIC Educational Resources Information Center

    Miller, Joel; Lin, Jeffrey

    2007-01-01

    This article examines issues raised by the application of a generic actuarial juvenile risk instrument (the Model Risk Assessment Instrument) to New York City, a context different from the one in which it was developed. It describes practical challenges arising from the constraints of locally available data and local sensibilities and highlights…

  9. Applying a forensic actuarial assessment (the Violence Risk Appraisal Guide) to nonforensic patients.

    PubMed

    Harris, Grant T; Rice, Marnie E; Camilleri, Joseph A

    2004-09-01

    The actuarial Violence Risk Appraisal Guide (VRAG) was developed for male offenders where it has shown excellent replicability in many new forensic samples using officially recorded outcomes. Clinicians also make decisions, however, about the risk of interpersonal violence posed by nonforensic psychiatric patients of both sexes. Could an actuarial risk assessment developed for male forensic populations be used for a broader clientele? We modified the VRAG to permit evaluation using data from the MacArthur Violence Risk Assessment Study that included nonforensic male and female patients and primarily self-reported violence. The modified VRAG yielded a large effect size in the prediction of dichotomous postdischarge severe violence over 20 and 50 weeks. Accuracy of VRAG predictions was unrelated to sex. The results provide evidence about the robustness of comprehensive actuarial risk assessments and the generality of the personal factors that underlie violent behavior.

  10. IT-OSRA: applying ensemble simulations to estimate the oil spill risk associated to operational and accidental oil spills

    NASA Astrophysics Data System (ADS)

    Sepp Neves, Antonio Augusto; Pinardi, Nadia; Martins, Flavio

    2016-08-01

    Oil Spill Risk Assessments (OSRAs) are widely employed to support decision making regarding oil spill risks. This article adapts the ISO-compliant OSRA framework developed by Sepp Neves et al. (J Environ Manag 159:158-168, 2015) to estimate risks in a complex scenario with uncertainties about the meteo-oceanographic conditions and about where and how a spill could happen, and where the risk computation methodology (ensemble oil spill modelling) is not yet well established. The improved method was applied to the Algarve coast, Portugal. Over 50,000 simulations were performed in 2 ensemble experiments to estimate the risks of operational and accidental spill scenarios associated with maritime traffic. The level of risk was found to be significant for both types of scenarios, with marked seasonal variations due to the variability of currents and waves. Higher-frequency variability in the meteo-oceanographic variables was also found to contribute to the level of risk. The ensemble results show that the distribution of oil concentrations found on the coast is not Gaussian, opening up new fields of research on how to deal with oil spill risks and related uncertainties.

  11. GPS ensemble analysis applied to Antarctic vertical velocities

    NASA Astrophysics Data System (ADS)

    Petrie, E. J.; Clarke, P. J.; King, M. A.; Williams, S. D. P.

    2014-12-01

    GPS data are used to provide estimates of vertical land motion caused by, e.g., glacial isostatic adjustment (GIA) and hydrologic loading. The vertical velocities estimated from the GPS data are often assimilated into GIA models or used for comparison purposes. GIA models are very important as they provide the time-variable gravity corrections needed to estimate ice mass change over Greenland and Antarctica. While state-of-the-art global GPS analysis has previously been performed for many Antarctic sites, formal errors in the resulting site velocities are typically obtained from noise analysis of each individual time series, without consideration of processing or metadata issues. Here we present analysis of the results from two full global runs including a variety of parameter and reference frame alignment choices, and compare the results to previous work with a view to assessing whether the size of the formal errors from the standard method is truly representative.

  12. Concentration of Risk Model (CORM) Verification and Analysis

    DTIC Science & Technology

    2014-06-15

    Mental Health and using data from a repository at the University of Michigan, had attempted to identify soldiers at higher-than-average risk of suicide... TRAC-M-TR-14-023, 15 June 2014, Concentration of Risk Model (CORM) Verification and Analysis, TRADOC Analysis Center - Monterey, 700 Dyer Road, Monterey... Edward M. Masotti, Sam Buttrey, TRADOC Analysis Center

  13. Systems design analysis applied to launch vehicle configuration

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost-improvement techniques from design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  14. Overcoming barriers to integrating economic analysis into risk assessment.

    PubMed

    Hoffmann, Sandra

    2011-09-01

    Regulatory risk analysis is designed to provide decision makers with a clearer understanding of how policies are likely to affect risk. The systems that produce risk are biological, physical, social, and economic. As a result, risk analysis is an inherently interdisciplinary task. Yet in practice, risk analysis has been interdisciplinary in only limited ways. Risk analysis could provide more accurate assessments of risk if economics and other social sciences were better integrated into risk assessment itself. This essay examines how discussions about risk analysis policy have influenced the roles of various disciplines in risk analysis. It explores ways in which integrated bio/physical-economic modeling could contribute to more accurate assessments of risk. It reviews examples of the kind of integrated economic-bio/physical modeling that could be used to enhance risk assessment. The essay ends with a discussion of institutional barriers to greater integration of economic modeling into risk assessment and provides suggestions on how these might be overcome.

  15. Applying Costs, Risks and Values Evaluation (CRAVE) methodology to Engineering Support Request (ESR) prioritization

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla N.

    1994-01-01

    Given a limited budget, the problem of prioritization among Engineering Support Requests (ESRs) of varied sizes, shapes, and colors is a difficult one. At the Kennedy Space Center (KSC), the recently developed 4-Matrix (4-M) method represents a step in the right direction as it attempts to combine the traditional criterion of technical merit with the new concern for cost-effectiveness. However, the 4-M method was not adequately successful in the actual prioritization of ESRs for fiscal year 1995 (FY95). This research identifies a number of design issues that should help us develop better methods. It emphasizes that, given the variety and diversity of ESRs, one should not expect a single method to help in the assessment of all ESRs. One conclusion is that a methodology such as Costs, Risks, and Values Evaluation (CRAVE) should be adopted. It is also clear that the development of methods such as 4-M requires input not only from engineers with technical expertise in ESRs but also from personnel with an adequate background in the theory and practice of cost-effectiveness analysis. At KSC, ESR prioritization is one part of the Ground Support Working Teams (GSWT) Integration Process. It was discovered that the more important barriers to the incorporation of cost-effectiveness considerations in ESR prioritization lie in this process. The culture of integration, and the corresponding structure of review by a committee of peers, is not conducive to the analysis and confrontation necessary in the assessment and prioritization of ESRs. Without assistance from appropriately trained analysts charged with the responsibility to analyze and be confrontational about each ESR, the GSWT steering committee will continue to make its decisions based on incomplete understanding, inconsistent numbers, and at times, colored facts. The current organizational separation of the prioritization and the funding processes is also identified as an important barrier to the

  16. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  17. Action, Content and Identity in Applied Genre Analysis for ESP

    ERIC Educational Resources Information Center

    Flowerdew, John

    2011-01-01

    Genres are staged, structured, communicative events, motivated by various communicative purposes, and performed by members of specific discourse communities (Swales 1990; Bhatia 1993, 2004; Berkenkotter & Huckin 1995). Since its inception, with the two seminal works on the topic by Swales (1990) and Bhatia (1993), genre analysis has taken pride of…

  18. Applying an Activity System to Online Collaborative Group Work Analysis

    ERIC Educational Resources Information Center

    Choi, Hyungshin; Kang, Myunghee

    2010-01-01

    This study determines whether an activity system provides a systematic framework to analyse collaborative group work. Using an activity system as a unit of analysis, the research examined learner behaviours, conflicting factors and facilitating factors while students engaged in collaborative work via asynchronous computer-mediated communication.…

  19. Applying Adult Learning Theory through a Character Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to analyze the behavior of a character, Celie, in a movie, 'The Color Purple," through the lens of two adult learning theorists to determine the relationships the character has with each theory. The development and portrayal of characters in movies can be explained and understood by the analysis of adult learning…

  20. Applying Skinner's Analysis of Verbal Behavior to Persons with Dementia

    ERIC Educational Resources Information Center

    Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann

    2011-01-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…

  1. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  2. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  3. Best practices: applying management analysis of excellence to immunization.

    PubMed

    Wishner, Amy; Aronson, Jerold; Kohrt, Alan; Norton, Gary

    2005-01-01

    The authors applied business management tools to analyze and promote excellence and to evaluate differences between average and above-average immunization performers in private practices. The authors conducted a pilot study of 10 private practices in Pennsylvania, using tools common in management to assess the practices' organizational climate and managerial style. Authoritative and coaching styles of physician leaders were common to both groups. Managerial styles that emphasized higher levels of clarity and responsibility were evident in the large practices, and rewards and flexibility styles were higher in the small above-average practices. The findings of this pilot study match results seen in high performers in other industries. The study concludes that the authoritative style appears to have the most impact on performance, with interesting implications for training and behavior change to improve immunization rates, alongside traditional medical interventions.

  4. Technical Overview of Ecological Risk Assessment - Analysis Phase: Exposure Characterization

    EPA Pesticide Factsheets

    Exposure Characterization is the second major component of the analysis phase of a risk assessment. For a pesticide risk assessment, the exposure characterization describes the potential or actual contact of a pesticide with a plant, animal, or media.

  5. School attendance, health-risk behaviors, and self-esteem in adolescents applying for working papers.

    PubMed Central

    Suss, A. L.; Tinkelman, B. K.; Freeman, K.; Friedman, S. B.

    1996-01-01

    Since health-risk behaviors are often encountered in clusters among adolescents, it was hypothesized that adolescents with poor school attendance would be associated with more health-risk behaviors (e.g., substance use, violence) than those who attend school regularly. This study assessed the relationship between poor school attendance and health-risk behaviors, and described health-risk behaviors and self-esteem among adolescents seeking employment. In this cross-sectional study, school attendance (poor vs. regular attendance) was related to health-risk behaviors by asking 122 subjects seen at a New York City Working Papers Clinic to complete both a 72-item questionnaire about their health-risk behaviors and the 58-item Coopersmith Self-Esteem School Form Inventory. Chi-square and Fisher's Exact Tests were performed. The poor and regular attenders of school differed significantly in only 5 out of 44 items pertaining to health-risk behaviors. Self-esteem measures for the two groups did not differ from one another or from national norms. In this sample, depression "in general" (global) and "at home," but not "at school," were associated significantly with suicidal thoughts/attempts and serious past life events (e.g. family conflict, sexual abuse). There were no significant associations between depression or self-esteem and illicit substance or alcohol use. We found few associations between poor school attendance and health-risk behaviors in this sample of employment-seeking adolescents. The poor and regular attenders of school were similar in most aspects of their health-risk behaviors and self-esteem. PMID:8982520

  6. Applying the Analytic Hierarchy Process to Oil Sands Environmental Compliance Risk Management

    NASA Astrophysics Data System (ADS)

    Roux, Izak Johannes, III

    Oil companies in Alberta, Canada, invested $32 billion on new oil sands projects in 2013. Despite the size of this investment, there is a demonstrable deficiency in the uniformity and understanding of environmental legislation requirements that manifest into increased project compliance risks. This descriptive study developed 2 prioritized lists of environmental regulatory compliance risks and mitigation strategies and used multi-criteria decision theory for its theoretical framework. Information from compiled lists of environmental compliance risks and mitigation strategies was used to generate a specialized pairwise survey, which was piloted by 5 subject matter experts (SMEs). The survey was validated by a sample of 16 SMEs, after which the Analytic Hierarchy Process (AHP) was used to rank a total of 33 compliance risks and 12 mitigation strategy criteria. A key finding was that the AHP is a suitable tool for ranking of compliance risks and mitigation strategies. Several working hypotheses were also tested regarding how SMEs prioritized 1 compliance risk or mitigation strategy compared to another. The AHP showed that regulatory compliance, company reputation, environmental compliance, and economics ranked the highest and that a multi-criteria mitigation strategy for environmental compliance ranked the highest. The study results will inform Alberta oil sands industry leaders about the ranking and utility of specific compliance risks and mitigation strategies, enabling them to focus on actions that will generate legislative and public trust. Oil sands leaders implementing a risk management program using the risks and mitigation strategies identified in this study will contribute to environmental conservation, economic growth, and positive social change.
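
    The pairwise-comparison ranking at the core of the AHP can be sketched in a few lines. The comparison matrix below is hypothetical, not the study's survey data; the priority vector is the principal eigenvector of the matrix (obtained here by power iteration), with Saaty's consistency ratio as a sanity check.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix for three compliance risks
# (Saaty's 1-9 scale; entry [i, j] = importance of risk i relative to risk j).
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

def ahp_weights(matrix, iters=100):
    """Priority weights via power iteration on the comparison matrix."""
    n = matrix.shape[0]
    w = np.ones(n) / n
    for _ in range(iters):
        w = matrix @ w
        w /= w.sum()
    # Consistency check: lambda_max, consistency index, consistency ratio.
    lam = (matrix @ w / w).mean()
    ci = (lam - n) / (n - 1)
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random indices
    cr = ci / ri if ri else 0.0
    return w, cr

weights, cr = ahp_weights(A)
print(weights)  # priority vector, sums to 1
print(cr)       # consistency ratio; < 0.10 is conventionally acceptable
```

    With 33 risks and 12 strategy criteria, as in the study, the same eigenvector computation applies to a larger matrix assembled from the pairwise survey responses.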

  7. A value analysis model applied to the management of amblyopia.

    PubMed Central

    Beauchamp, G R; Bane, M C; Stager, D R; Berry, P M; Wright, W W

    1999-01-01

    PURPOSE: To assess the value of amblyopia-related services by utilizing a health value model (HVM). Cost and quality criteria are evaluated in accordance with the interests of patients, physicians, and purchasers. METHODS: We applied an HVM to a hypothetical statistical ("median") child with amblyopia whose visual acuity is 20/80 and to a group of children with amblyopia who are managed by our practice. We applied the model to calculate the value of these services by evaluating the responses of patients and physicians and relating these responses to clinical outcomes. RESULTS: The consensus value of care for the hypothetical median child was calculated to be 0.406 (of 1.000). For those children managed in our practice, the calculated value is 0.682. Clinically, 79% achieved 20/40 or better visual acuity, and the mean final visual acuity was 0.2 logMAR (20/32). Value appraisals revealed significant concerns about the financial aspects of amblyopia-related services, particularly among physicians. Patients rated services more positively than did physicians. CONCLUSIONS: Amblyopia care is difficult, sustained, and important work that requires substantial sensitivity to and support of children and families. Compliance and early detection are essential to success. The value of amblyopia services is rated significantly higher by patients than by physicians. Relative to the measured value, amblyopia care is undercompensated. The HVM is useful to appraise clinical service delivery and its variation. The costs of failure and the benefits of success are high; high-value amblyopia care yields substantial dividends and should be commensurately compensated in the marketplace. PMID:10703133

  8. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    SciTech Connect

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by eliminating the need to update hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  9. An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis

    DTIC Science & Technology

    2013-03-01

    Thesis front-matter fragments: "An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis," thesis by Anum Barki, BS (AFIT-ENP-13-M-02), Department of the Air…; approved by Dr. Ronald F. Tuttle (Chairman) and Dr. Kimberly Kendricks.

  10. Applying Cognitive Work Analysis to Time Critical Targeting Functionality

    DTIC Science & Technology

    2004-10-01

    Extracted fragments: a GUI working group brainstormed the order of columns in the Dynamic Target List/Dynamic Target Queue (DTL/DTQ) table, a critical component of the TCTF CUI, with successful results; Figure 4-27 shows the task steps involved in achieving Goal 7. Glossary fragments: CWA, Cognitive Work Analysis; DTD, Display Task Description; DTL/DTQ, Dynamic Target List/Dynamic Target Queue; FDO, Fighter Duty Officer; FEBA, Forward Edge…

  11. Applying the emergency risk management process to tackle the crisis of antibiotic resistance

    PubMed Central

    Dominey-Howes, Dale; Bajorek, Beata; Michael, Carolyn A.; Betteridge, Brittany; Iredell, Jonathan; Labbate, Maurizio

    2015-01-01

    We advocate that antibiotic resistance be reframed as a disaster risk management problem. Antibiotic-resistant infections represent a risk to life as significant as other commonly occurring natural disasters (e.g., earthquakes). Despite efforts by global health authorities, antibiotic resistance continues to escalate. Therefore, new approaches and expertise are needed to manage the issue. In this perspective we: (1) make a call for the emergency management community to recognize the antibiotic resistance risk and join in addressing this problem; (2) suggest using the risk management process to help tackle antibiotic resistance; (3) show why this approach has value and why it is different to existing approaches; and (4) identify public perception of antibiotic resistance as an important issue that warrants exploration. PMID:26388864

  12. [Future built-up area zoning by applying the methodology for assessing the population health risk].

    PubMed

    Bobkova, T E

    2009-01-01

    Using the methodology for assessing the population health risk, proposals are made on the functional zoning of the reorganized area of a plastics works. An area has been allocated for possible house-building.

  13. Applying Risk Society Theory to findings of a scoping review on caregiver safety.

    PubMed

    Macdonald, Marilyn; Lang, Ariella

    2014-03-01

    Chronic illness represents a growing concern in the western world, and individuals living with chronic illness are primarily managed at home by family caregivers. A scoping review of the home-care literature (2004-2009; updated with review articles from 2010 to January 2013) on the topic of the caregiver revealed that this group experiences the following safety-related concerns: caregivers are conscripted to the role, experience economic hardship, risk being abused as well as abusing, and may well become patients themselves. Methodology and methods used in the scoping review are presented as well as a brief overview of the findings. The concepts of risk and safety are defined. Risk Society Theory is introduced and used as a lens to view the findings, and to contribute to an understanding of the construction of risk in contemporary health care.

  14. Applying thiouracil (TU)-tagging for mouse transcriptome analysis

    PubMed Central

    Gay, Leslie; Karfilis, Kate V.; Miller, Michael R.; Doe, Chris Q.; Stankunas, Kryn

    2014-01-01

    Transcriptional profiling is a powerful approach to study mouse development, physiology, and disease models. Here, we describe a protocol for mouse thiouracil-tagging (TU-tagging), a transcriptome analysis technology that includes in vivo covalent labeling, purification, and analysis of cell type-specific RNA. TU-tagging enables 1) the isolation of RNA from a given cell population of a complex tissue, avoiding transcriptional changes induced by cell isolation trauma, and 2) the identification of actively transcribed RNAs and not pre-existing transcripts. Therefore, in contrast to other cell-specific transcriptional profiling methods based on purification of tagged ribosomes or nuclei, TU-tagging provides a direct examination of transcriptional regulation. We describe how to: 1) deliver 4-thiouracil to transgenic mice to thio-label cell lineage-specific transcripts, 2) purify TU-tagged RNA and prepare libraries for Illumina sequencing, and 3) follow a straightforward bioinformatics workflow to identify cell type-enriched or differentially expressed genes. Tissue containing TU-tagged RNA can be obtained in one day, RNA-Seq libraries generated within two days, and, following sequencing, an initial bioinformatics analysis completed in one additional day. PMID:24457332

  15. Elusive Critical Elements of Transformative Risk Assessment Practice and Interpretation: Is Alternatives Analysis the Next Step?

    PubMed

    Francis, Royce A

    2015-11-01

    This article argues that "game-changing" approaches to risk analysis must focus on "democratizing" risk analysis in the same way that information technologies have democratized access to, and production of, knowledge. This argument is motivated by the author's reading of Goble and Bier's analysis, "Risk Assessment Can Be a Game-Changing Information Technology-But Too Often It Isn't" (Risk Analysis, 2013; 33: 1942-1951), in which living risk assessments are shown to be "game changing" in probabilistic risk analysis. In this author's opinion, Goble and Bier's article focuses on living risk assessment's potential for transforming risk analysis from the perspective of risk professionals-yet, the game-changing nature of information technologies has typically achieved a much broader reach. Specifically, information technologies change who has access to, and who can produce, information. From this perspective, the author argues that risk assessment is not a game-changing technology in the same way as the printing press or the Internet because transformative information technologies reduce the cost of production of, and access to, privileged knowledge bases. The author argues that risk analysis does not reduce these costs. The author applies Goble and Bier's metaphor to the chemical risk analysis context, and in doing so proposes key features that transformative risk analysis technology should possess. The author also discusses the challenges and opportunities facing risk analysis in this context. These key features include: clarity in information structure and problem representation, economical information dissemination, increased transparency to nonspecialists, democratized manufacture and transmission of knowledge, and democratic ownership, control, and interpretation of knowledge. The chemical safety decision-making context illustrates the impact of changing the way information is produced and accessed in the risk context. Ultimately, the author concludes that although

  16. Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids (Final Report)

    EPA Science Inventory

    Millions of tons of treated sewage sludges or “biosolids” are applied annually to f...

  17. Quantitative microbial risk assessment applied to irrigation of salad crops with waste stabilization pond effluents.

    PubMed

    Pavione, D M S; Bastos, R K X; Bevilacqua, P D

    2013-01-01

    A quantitative microbial risk assessment model for estimating infection risks arising from consuming crops eaten raw that have been irrigated with effluents from stabilization ponds was constructed. A log-normal probability distribution function was fitted to a large database from a comprehensive monitoring of an experimental pond system to account for variability in Escherichia coli concentration in irrigation water. Crop contamination levels were estimated using predictive models derived from field experiments involving the irrigation of several crops with different effluent qualities. Data on daily intake of salad crops were obtained from a national survey in Brazil. Ten thousand-trial Monte Carlo simulations were used to estimate human health risks associated with the use of wastewater for irrigating low- and high-growing crops. The use of effluents containing 10^3-10^4 E. coli per 100 ml resulted in median rotavirus infection risks of approximately 10^-3 and 10^-4 per person per year (pppy) when irrigating, respectively, low- and high-growing crops; the corresponding 95th percentile risk estimates were around 10^-2 in both scenarios. Sensitivity analyses revealed that variations in effluent quality, in the assumed ratios of pathogens to E. coli, and in the reduction of pathogens between harvest and consumption had great impact upon risk estimates.
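
    The Monte Carlo QMRA chain described above can be sketched in a few lines. All parameter values below are illustrative placeholders, not the paper's fitted values; only the beta-Poisson rotavirus parameters are the commonly cited literature values.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo trials, as in the study

# Assumed illustrative inputs (placeholders, not the paper's fits):
log10_ecoli = rng.normal(loc=3.5, scale=0.5, size=N)  # E. coli per 100 mL, log-normal
ecoli = 10 ** log10_ecoli

ratio = 1e-5          # assumed rotavirus-to-E. coli ratio
retained = 0.0108     # assumed mL of irrigation water retained per g of lettuce
intake_g = 10         # assumed g of salad eaten per day

# Daily ingested dose: pathogens per mL of water x water retained on the crop.
dose = (ecoli * ratio / 100) * retained * intake_g

# Beta-Poisson dose-response, commonly cited rotavirus parameters.
alpha, beta = 0.253, 0.426
p_day = 1 - (1 + dose / beta) ** (-alpha)

# Annual infection risk assuming 365 exposure days.
p_year = 1 - (1 - p_day) ** 365

print(f"median annual risk:  {np.median(p_year):.2e}")
print(f"95th percentile:     {np.percentile(p_year, 95):.2e}")
```

    Sensitivity analysis then amounts to varying one input distribution at a time (effluent quality, pathogen ratio, die-off between harvest and consumption) and observing the shift in these percentiles.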

  18. The acquired preparedness risk model applied to smoking in 5th grade children.

    PubMed

    Combs, Jessica L; Spillane, Nichea S; Caudill, Leann; Stark, Brittany; Smith, Gregory T

    2012-03-01

    The very early onset of smoking predicts numerous health problems. The authors conducted the first test of one risk model for elementary school age smoking, known as the acquired preparedness (AP) model of risk, in a cross-sectional sample of 309 5th grade children. The model posits that (a) impulsivity-related personality traits contribute to risk for a variety of risky, maladaptive behaviors; (b) smoking expectancies confer risk only for smoking; and (c) the personality traits contribute to the formation of high risk expectancies for reinforcement from smoking, which in turn increases the likelihood of early onset smoking. The model was supported: the high-risk personality traits distinguished children engaging in any risky, maladaptive behavior from other children, and the smoking expectancies differentiated smokers from all other children. The relationship between personality tendencies to act rashly when experiencing intense positive or negative emotions and smoker status was partially mediated by expectancies for reinforcement from smoking. This model should be investigated longitudinally.

  19. Sensitivity analysis of a two-dimensional quantitative microbiological risk assessment: keeping variability and uncertainty separated.

    PubMed

    Busschaert, Pieter; Geeraerd, Annemie H; Uyttendaele, Mieke; Van Impe, Jan F

    2011-08-01

    The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and if possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis indicates to which of the input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis in the case where a risk assessment includes a separate characterization of variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods are used, an ANOVA-like model and Sobol sensitivity indices, to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment to estimate the risk of contracting listeriosis due to consumption of deli meats.
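
    The separate characterization of variability and uncertainty described above is typically implemented as a nested, two-dimensional Monte Carlo loop: the outer loop samples uncertain parameters, the inner loop samples variable ones. A toy sketch with an assumed exponential dose-response model and made-up distributions:

```python
import numpy as np

rng = np.random.default_rng(0)
N_U, N_V = 100, 1000  # outer (uncertainty) and inner (variability) samples

# Toy model: risk = 1 - exp(-r * dose).
# r is uncertain (imperfect knowledge); dose varies between servings.
risks = np.empty((N_U, N_V))
for i in range(N_U):
    r = rng.lognormal(mean=np.log(1e-3), sigma=0.3)              # uncertainty draw
    dose = rng.lognormal(mean=np.log(50), sigma=1.0, size=N_V)   # variability draws
    risks[i] = 1 - np.exp(-r * dose)

# Each row is a variability distribution conditional on one uncertainty draw;
# spread across rows reflects uncertainty about the population-level risk.
pop_mean_risk = risks.mean(axis=1)  # one population estimate per uncertainty draw
lo, hi = np.percentile(pop_mean_risk, [2.5, 97.5])
print(f"mean risk: {pop_mean_risk.mean():.3e}")
print(f"95% uncertainty interval on the mean: [{lo:.3e}, {hi:.3e}]")
```

    Keeping the two sampling dimensions distinct is what allows a sensitivity analysis (e.g., Sobol indices) to attribute output spread separately to variability and to uncertainty, as the paper's procedure does.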

  20. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    SciTech Connect

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, "Nuclear Safety Management," Subpart B, "Safety Basis Requirements." Consistent with DOE-STD-3009-94, Change Notice 2, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses" (STD-3009), and DOE-STD-3011-2002, "Guidance for Preparation of Basis for Interim Operation (BIO) Documents" (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, "Integration of Environment, Safety, and Health into Facility Disposition Activities" (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  1. Response deprivation and reinforcement in applied settings: A preliminary analysis

    PubMed Central

    Konarski, Edward A.; Johnson, Moses R.; Crowell, Charles R.; Whitman, Thomas L.

    1980-01-01

    First-grade children engaged in seatwork behaviors under reinforcement schedules established according to the Premack Principle and the Response Deprivation Hypothesis. Across two experiments, schedules were presented to the children in a counter-balanced fashion which fulfilled the conditions of one, both, or neither of the hypotheses. Duration of on-task math and coloring in Experiment 1 and on-task math and reading in Experiment 2 were the dependent variables. A modified ABA-type withdrawal design, including a condition to control for the noncontingent effects of a schedule, indicated an increase of on-task instrumental responding only in those schedules where the condition of response deprivation was present but not where it was absent, regardless of the probability differential between the instrumental and contingent responses. These results were consistent with laboratory findings supporting the necessity of response deprivation for producing the reinforcement effect in single response, instrumental schedules. However, the results of the control procedure were equivocal so the contribution of the contingent relationship between the responses to the increases in instrumental behavior could not be determined. Nevertheless, these results provided tentative support for the Response Deprivation Hypothesis as a new approach to establishing reinforcement schedules while indicating the need for further research in this area. The possible advantages of this technique for applied use were identified and discussed. PMID:16795635

  2. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    SciTech Connect

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of "soft" information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset "attractiveness" to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  3. Applying temporal network analysis to the venture capital market

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.
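    The two kinds of relationships described above can be sketched in code. The following is an illustrative sketch, not the authors' implementation; the investment events are made up, and the centrality and syndication measures shown are only the simplest of the topological quantities such a study would track per time window.

```python
from collections import defaultdict

# Hypothetical investment events: (year, vc_firm, portfolio_company).
events = [
    (2010, "VC_A", "Co_1"), (2010, "VC_B", "Co_1"),
    (2011, "VC_A", "Co_2"), (2011, "VC_C", "Co_2"),
    (2012, "VC_A", "Co_3"), (2012, "VC_B", "Co_3"), (2012, "VC_C", "Co_3"),
]

def yearly_snapshots(events):
    """Group bipartite VC-company edges into one network snapshot per year."""
    snaps = defaultdict(set)
    for year, vc, co in events:
        snaps[year].add((vc, co))
    return dict(snaps)

def degree_centrality(edges):
    """Degree of each VC node within a single snapshot."""
    deg = defaultdict(int)
    for vc, _ in edges:
        deg[vc] += 1
    return dict(deg)

def syndication_ties(edges):
    """VC pairs that co-invested in the same company (syndication network)."""
    by_company = defaultdict(set)
    for vc, co in edges:
        by_company[co].add(vc)
    ties = set()
    for vcs in by_company.values():
        vcs = sorted(vcs)
        for i in range(len(vcs)):
            for j in range(i + 1, len(vcs)):
                ties.add((vcs[i], vcs[j]))
    return ties

snaps = yearly_snapshots(events)
central_2012 = degree_centrality(snaps[2012])
ties_2012 = syndication_ties(snaps[2012])
```

    Tracking how `degree_centrality` and `syndication_ties` change from one snapshot to the next is the kind of temporal evolution the study analyzes.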

  4. Applied analysis/computational mathematics. Final report 1993

    SciTech Connect

    Lax, P.; Berger, M.

    1993-12-01

    This is the final report for the Courant Mathematics and Computing Laboratory (CMCL) research program for the years 1991--1993. Our research efforts encompass the formulation of physical problems in terms of mathematical models (both old and new), the mathematical analysis of such models, and their numerical resolution. This last step involves the development and implementation of efficient methods for large scale computation. Our analytic and numerical work often go hand in hand; new theoretical approaches often have numerical counterparts, while numerical experimentation often suggests avenues for analytical investigation.

  5. Inconsistencies in the harmonic analysis applied to pulsating stars

    NASA Astrophysics Data System (ADS)

    Pascual-Granado, J.; Garrido, R.; Suárez, J. C.

    2015-05-01

    Harmonic analysis is the fundamental mathematical method used for the identification of pulsation frequencies in asteroseismology and other fields of physics. Here we introduce a test to evaluate the validity of the hypothesis on which the Fourier theorem is based: the convergence of the expansion series. The large number of difficulties found in the interpretation of the periodograms of pulsating stars observed by the CoRoT and Kepler satellites led us to test whether the functions underlying these time series are analytic or not. Surprisingly, the main result is that they originate from non-analytic functions; therefore, the condition for Parseval's theorem is not guaranteed.
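    For a finite, well-behaved discrete series, the energy identity underlying the abstract's concern does hold exactly; the paper's question is whether it carries over to the function underlying the observed time series. A minimal numerical illustration of the discrete identity (Parseval's theorem for the DFT, with numpy's default unnormalized convention):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024
x = rng.standard_normal(n)          # a well-behaved (square-summable) series
X = np.fft.fft(x)

# Parseval's theorem for the DFT: sum |x|^2 == (1/N) * sum |X|^2.
time_energy = np.sum(np.abs(x) ** 2)
freq_energy = np.sum(np.abs(X) ** 2) / n
```

    For a non-analytic underlying function, no such equality between the signal's energy and that of its expansion coefficients is guaranteed in the limit.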

  6. Radiation Leukemogenesis: Applying Basic Science of Epidemiological Estimates of Low Dose Risks and Dose-Rate Effects

    SciTech Connect

    Hoel, D. G.

    1998-11-01

    The next stage of work has been to examine more closely the A-bomb leukemia data, which provides the underpinnings of the risk estimation of CML in the above-mentioned manuscript. The paper by Hoel and Li (Health Physics 75:241-50) shows how the linear-quadratic model has basic non-linearities in the low-dose region for the leukemias, including CML. Pierce et al. (Radiation Research 123:275-84) have developed distributions for the uncertainty in the estimated exposures of the A-bomb cohort. Kellerer et al. (Radiation and Environmental Biophysics 36:73-83) have further considered possible errors in the estimated neutron values and, with changing RBE values with dose, have hypothesized that the tumor response due to gamma may not be linear. We have incorporated this neutron model and have constructed new A-bomb doses based on the model adjustments. The Hoel and Li dose-response analysis has also been applied using the Kellerer neutron dose adjustments for the leukemias. Finally, both Pierce's dose uncertainties and the Kellerer neutron adjustments are combined, as well as the varying RBE with dose as suggested by Rossi and Zaider, and used for leukemia dose-response analysis. First, the result of Hoel and Li showing a significantly improved fit of the linear-quadratic dose response by the inclusion of a threshold (i.e., low-dose nonlinearity) persisted. This work has been completed for both solid tumors and leukemia, for both mortality and incidence data. The results are given in the manuscript described below, which has been submitted to Health Physics.
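    The linear-quadratic form with a threshold that the abstract refers to can be written out directly. This is a generic sketch of the model family, with illustrative parameter values, not the fitted coefficients from the cited analyses:

```python
import numpy as np

def lq_excess_risk(dose, alpha, beta, threshold=0.0):
    """Linear-quadratic excess risk with an optional low-dose threshold:
    risk = alpha*(d - d0) + beta*(d - d0)^2 for d > d0, else 0."""
    d = np.asarray(dose, dtype=float) - threshold
    d = np.clip(d, 0.0, None)
    return alpha * d + beta * d ** 2

doses = np.array([0.0, 0.05, 0.5, 1.0, 2.0])   # Gy (illustrative values)
risk_no_thr = lq_excess_risk(doses, alpha=0.1, beta=0.05)
risk_thr = lq_excess_risk(doses, alpha=0.1, beta=0.05, threshold=0.1)
```

    With the threshold, doses below 0.1 Gy contribute zero excess risk, which is exactly the low-dose nonlinearity whose inclusion improved the fit.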

  7. Operational modal analysis applied to the concert harp

    NASA Astrophysics Data System (ADS)

    Chomette, B.; Le Carrou, J.-L.

    2015-05-01

    Operational modal analysis (OMA) methods are useful to extract modal parameters of operating systems. These methods seem to be particularly interesting to investigate the modal basis of string instruments during operation to avoid certain disadvantages due to conventional methods. However, the excitation in the case of string instruments is not optimal for OMA due to the presence of damped harmonic components and low noise in the disturbance signal. Therefore, the present study investigates the least-square complex exponential (LSCE) and the modified least-square complex exponential methods in the case of a string instrument to identify modal parameters of the instrument when it is played. The efficiency of the approach is experimentally demonstrated on a concert harp excited by some of its strings and the two methods are compared to a conventional modal analysis. The results show that OMA allows us to identify modes particularly present in the instrument's response with a good estimation especially if they are close to the excitation frequency with the modified LSCE method.
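    At its core, the LSCE family fits the measured response as a sum of damped complex exponentials via linear prediction. The sketch below is a textbook Prony-type estimator on noiseless synthetic data, not the modified LSCE variant or the harp measurements from the study:

```python
import numpy as np

def prony_poles(h, order, dt):
    """Least-squares complex exponential (Prony-type) pole estimation:
    fit h[n] = -sum_k a_k * h[n-k], then take the roots of the AR polynomial."""
    n = len(h)
    # Linear-prediction system: columns are lagged copies of the response.
    H = np.column_stack([h[order - k - 1 : n - k - 1] for k in range(order)])
    rhs = -h[order:n]
    a, *_ = np.linalg.lstsq(H, rhs, rcond=None)
    poles = np.roots(np.concatenate(([1.0], a)))
    freqs = np.abs(np.angle(poles)) / (2 * np.pi * dt)   # modal frequencies, Hz
    damps = np.log(np.abs(poles)) / dt                   # decay rates, 1/s
    return freqs, damps

# Synthetic impulse response: one damped mode at 5 Hz, decay rate 0.5 1/s.
dt = 0.01
t = np.arange(0, 2, dt)
h = np.exp(-0.5 * t) * np.cos(2 * np.pi * 5.0 * t)
freqs, damps = prony_poles(h, order=2, dt=dt)
```

    One damped real mode corresponds to a conjugate pole pair, so both estimated frequencies coincide at 5 Hz; the practical difficulty the paper addresses is doing this reliably when the excitation contains damped harmonics from the strings themselves.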

  8. Dynamical systems analysis applied to working memory data.

    PubMed

    Gasimova, Fidan; Robitzsch, Alexander; Wilhelm, Oliver; Boker, Steven M; Hu, Yueqin; Hülür, Gizem

    2014-01-01

    In the present paper we investigate weekly fluctuations in the working memory capacity (WMC) assessed over a period of 2 years. We use dynamical system analysis, specifically a second order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement at the MU task is associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influence accuracy more than the number of embedding dimensions.
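    The second-order linear differential equation used in this literature is the damped linear oscillator, x'' = eta*x + zeta*x', where a negative frequency parameter eta produces the cyclical pattern the study reports. A minimal simulation sketch with illustrative parameters (not the fitted values from the 112-student sample):

```python
import numpy as np

def simulate_dlo(eta, zeta, x0, dx0, dt, steps):
    """Simulate the damped linear oscillator x'' = eta*x + zeta*x'.
    A negative frequency parameter eta yields cyclic behaviour."""
    x, dx = x0, dx0
    out = [x0]
    for _ in range(steps):
        ddx = eta * x + zeta * dx
        dx += ddx * dt
        x += dx * dt      # semi-implicit Euler keeps the oscillation stable
        out.append(x)
    return np.array(out)

# eta = -1 gives an angular frequency of about sqrt(1) = 1 rad per time unit.
traj = simulate_dlo(eta=-1.0, zeta=0.0, x0=1.0, dx0=0.0, dt=0.05, steps=400)
```

    In the study, eta is estimated per person from the data (after time-delay embedding), and a multilevel model relates individual eta values to performance level and improvement rate.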

  9. Principles of micellar electrokinetic capillary chromatography applied in pharmaceutical analysis.

    PubMed

    Hancu, Gabriel; Simon, Brigitta; Rusu, Aura; Mircia, Eleonora; Gyéresi, Arpád

    2013-01-01

    Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques have rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution at concentrations above their critical micellar concentration; consequently, micelles are formed, and these micelles undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects regarding separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis.

  10. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
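    Of the two factor-analysis methods named above, PCR is the easier to write down compactly: project the spectra onto their leading principal components, then regress the property of interest on the scores. The sketch below uses synthetic "spectra" driven by two latent components; it illustrates the method, not the phosphosilicate-glass or blood calibrations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic spectra: 40 samples x 50 wavelengths, driven by 2 latent components.
n_samples, n_channels, n_comp = 40, 50, 2
scores = rng.standard_normal((n_samples, n_comp))
loadings = rng.standard_normal((n_comp, n_channels))
X = scores @ loadings + 0.01 * rng.standard_normal((n_samples, n_channels))
y = scores @ np.array([1.5, -0.7])        # "concentration" from the latents

def pcr_fit(X, y, n_comp):
    """Principal component regression: project X onto its top principal
    components, then regress y on the resulting scores."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    T = Xc @ Vt[:n_comp].T                # PC scores
    coef, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
    return X.mean(axis=0), y.mean(), Vt[:n_comp], coef

def pcr_predict(model, Xnew):
    x_mean, y_mean, V, coef = model
    return (Xnew - x_mean) @ V.T @ coef + y_mean

model = pcr_fit(X, y, n_comp)
y_hat = pcr_predict(model, X)
```

    PLS differs in that the factors are chosen to be predictive of y rather than merely to capture spectral variance, which is why it often needs fewer factors.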

  11. Unsupervised feature relevance analysis applied to improve ECG heartbeat clustering.

    PubMed

    Rodríguez-Sotelo, J L; Peluffo-Ordoñez, D; Cuesta-Frau, D; Castellanos-Domínguez, G

    2012-10-01

    The computer-assisted analysis of biomedical records has become an essential tool in clinical settings. However, current devices provide a growing amount of data that often exceeds the processing capacity of normal computers. As this amount of information rises, new demands for more efficient data extracting methods appear. This paper addresses the task of data mining in physiological records using a feature selection scheme. An unsupervised method based on relevance analysis is described. This scheme uses a least-squares optimization of the input feature matrix in a single iteration. The output of the algorithm is a feature weighting vector. The performance of the method was assessed using a heartbeat clustering test on real ECG records. The quantitative cluster validity measures yielded a correctly classified heartbeat rate of 98.69% (specificity), 85.88% (sensitivity) and 95.04% (general clustering performance), which is even higher than the performance achieved by other similar ECG clustering studies. The number of features was reduced on average from 100 to 18, and the temporal cost was 43% lower than in previous ECG clustering schemes.
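    The abstract does not spell out the optimization, so the following is only a generic sketch of one common unsupervised relevance scheme: weight each feature by its loading on the leading eigenvector of the standardized covariance matrix, so that redundant-but-informative features score high and pure noise scores low. The data and the exact weighting rule are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 300

# Three correlated, informative features plus one pure-noise feature.
latent = rng.standard_normal(n)
X = np.column_stack([
    latent + 0.1 * rng.standard_normal(n),
    latent + 0.1 * rng.standard_normal(n),
    latent + 0.1 * rng.standard_normal(n),
    rng.standard_normal(n),
])

def relevance_weights(X):
    """Unsupervised relevance: weight features by their loading on the
    leading eigenvector of the standardized covariance matrix."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    cov = (Z.T @ Z) / len(Z)
    vals, vecs = np.linalg.eigh(cov)
    w = np.abs(vecs[:, -1])       # leading eigenvector (eigh sorts ascending)
    return w / w.sum()

w = relevance_weights(X)
```

    Thresholding or ranking such a weighting vector is what reduces the feature set (here from 4 to 3; in the study, from 100 to 18 on average) before clustering.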

  12. Differential Network Analysis Applied to Preoperative Breast Cancer Chemotherapy Response

    PubMed Central

    Warsow, Gregor; Struckmann, Stephan; Kerkhoff, Claus; Reimer, Toralf; Engel, Nadja; Fuellen, Georg

    2013-01-01

    In silico approaches are increasingly considered to improve breast cancer treatment. One of these treatments, neoadjuvant TFAC chemotherapy, is used in cases where application of preoperative systemic therapy is indicated. Estimating response to treatment allows or improves clinical decision-making and this, in turn, may be based on a good understanding of the underlying molecular mechanisms. Ever increasing amounts of high throughput data become available for integration into functional networks. In this study, we applied our software tool ExprEssence to identify specific mechanisms relevant for TFAC therapy response, from a gene/protein interaction network. We contrasted the resulting active subnetwork to the subnetworks of two other such methods, OptDis and KeyPathwayMiner. We could show that the ExprEssence subnetwork is more related to the mechanistic functional principles of TFAC therapy than the subnetworks of the other two methods despite the simplicity of ExprEssence. We were able to validate our method by recovering known mechanisms and as an application example of our method, we identified a mechanism that may further explain the synergism between paclitaxel and doxorubicin in TFAC treatment: Paclitaxel may attenuate MELK gene expression, resulting in lower levels of its target MYBL2, already associated with doxorubicin synergism in hepatocellular carcinoma cell lines. We tested our hypothesis in three breast cancer cell lines, confirming it in part. In particular, the predicted effect on MYBL2 could be validated, and a synergistic effect of paclitaxel and doxorubicin could be demonstrated in the breast cancer cell lines SKBR3 and MCF-7. PMID:24349128

  13. Environmental risk analysis for indirect coal liquefaction

    SciTech Connect

    Barnthouse, L.W.; Suter, G.W. II; Baes, C.F. III; Bartell, S.M.; Cavendish, M.G.; Gardner, R.H.; O'Neill, R.V.; Rosen, A.E.

    1985-01-01

    This report presents an analysis of the risks to fish, water quality (due to noxious algal blooms), crops, forests, and wildlife of two technologies for the indirect liquefaction of coal: Lurgi and Koppers-Totzek gasification of coal for Fischer-Tropsch synthesis. A variety of analytical techniques were used to make maximum use of the available data to consider effects of effluents on different levels of ecological organization. The most significant toxicants to fish were found to be ammonia, cadmium, and acid gases. An analysis of whole-effluent toxicity indicated that the Lurgi effluent is more acutely toxic than the Koppers-Totzek effluent. Six effluent components appear to pose a potential threat of blue-green algal blooms, primarily because of their effects on higher trophic levels. The most important atmospheric emissions with respect to crops, forests, and wildlife were found to be the conventional combustion products SO₂ and NO₂. Of the materials deposited on the soil, arsenic, cadmium, and nickel appear of greatest concern for phytotoxicity. 147 references, 5 figures, 41 tables.

  14. Multiobjective Risk Partitioning: An Application to Dam Safety Risk Analysis

    DTIC Science & Technology

    1988-04-01

    expectation distorts, and almost eliminates, the distinctive features of many viable alternative policy options that could lead to the reduction of the risk...height of the dam) from 20 to 30 million dollars would contribute to a negligible reduction of 0.1 units of conventional (unconditional) expected social...results could be easily influenced by either a change in the return period of the PMH or by the choice of the distribution. Therefore, it is

  15. The Use and Abuse of Risk Analysis in Policy Debate.

    ERIC Educational Resources Information Center

    Herbeck, Dale A.; Katsulas, John P.

    The best check on the preposterous claims of crisis rhetoric is an appreciation of the nature of risk analysis and how it functions in argumentation. The use of risk analysis is common in policy debate. While the stock issues paradigm focused the debate exclusively on the affirmative case, the advent of policy systems analysis has transformed…

  16. Risk Analysis from a Top-Down Perspective

    DTIC Science & Technology

    1983-07-15

    and focused studies in critical areas. A variety of analyses, such as a localized version of the bottom up risk analysis approach and sensitivity...analysis, focus on these open ended cases to resolve them. Unresolvable decision conflicts include value judgments which risk analysis cannot solve

  17. Relative risk analysis of the use of radiation-emitting medical devices: A preliminary application

    SciTech Connect

    Jones, E.D.

    1996-06-01

    This report describes the development of a risk analysis approach for evaluating the use of radiation-emitting medical devices. This effort was performed by Lawrence Livermore National Laboratory for the US Nuclear Regulatory Commission (NRC). The assessment approach has been applied to understand the risks in using the Gamma Knife, a gamma irradiation therapy device. This effort represents an initial step to evaluate the potential role of risk analysis for developing regulations and quality assurance requirements in the use of nuclear medical devices. The risk approach identifies and assesses the most likely risk contributors and their relative importance for the medical system. The approach uses expert screening techniques and relative risk profiling to incorporate the type, quality, and quantity of data available and to present results in an easily understood form.

  18. On applying continuous wavelet transform in wheeze analysis.

    PubMed

    Taplidou, Styliani A; Hadjileontiadis, Leontios J; Kitsas, Ilias K; Panoulas, Konstantinos I; Penzel, Thomas; Gross, Volker; Panas, Stavros M

    2004-01-01

    The identification of continuous abnormal lung sounds, like wheezes, in the total breathing cycle is of great importance in the diagnosis of obstructive airways pathologies. In this vein, the current work introduces an efficient method for the detection of wheezes, based on the time-scale representation of breath sound recordings. The employed continuous wavelet transform proves to be a valuable tool in this direction when combined with scale-dependent thresholding. Analysis of lung sound recordings from 'wheezing' patients shows promising performance in the detection and extraction of wheezes from the background noise and reveals its potential for data-volume reduction in long-term wheezing screening, such as in sleep laboratories.
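    The core idea, locating narrowband wheeze energy in a time-scale decomposition, can be sketched with a hand-rolled Morlet-style transform. The synthetic "wheeze" (a 400 Hz tone over noise), the wavelet parameters, and the per-scale energy criterion are all illustrative assumptions, not the paper's clinical pipeline:

```python
import numpy as np

def morlet(freq, fs, cycles=8, width=512):
    """Unit-energy Morlet-style wavelet tuned to `freq` Hz."""
    t = (np.arange(width) - width / 2) / fs
    std = cycles / (2 * np.pi * freq)          # envelope width in seconds
    w = np.cos(2 * np.pi * freq * t) * np.exp(-(t ** 2) / (2 * std ** 2))
    return w / np.linalg.norm(w)

def scalogram_energy(x, freqs, fs):
    """Total wavelet-transform energy per analysis frequency (by convolution)."""
    return np.array([np.sum(np.convolve(x, morlet(f, fs), mode="same") ** 2)
                     for f in freqs])

fs = 4000.0
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(3)
# Synthetic "wheeze": a 400 Hz tone over breath-like background noise.
x = np.sin(2 * np.pi * 400 * t) + 0.3 * rng.standard_normal(len(t))

freqs = np.array([100.0, 200.0, 400.0, 800.0])
energy = scalogram_energy(x, freqs, fs)
dominant = freqs[np.argmax(energy)]
```

    Scale-dependent thresholding then amounts to flagging time-scale coefficients whose magnitude exceeds a noise-adapted threshold at each scale, rather than pooling energy as done here.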

  19. Applying Machine Learning to GlueX Data Analysis

    NASA Astrophysics Data System (ADS)

    Boettcher, Thomas

    2014-03-01

    GlueX is a high energy physics experiment with the goal of collecting data necessary for understanding confinement in quantum chromodynamics. Beginning in 2015, GlueX will collect huge amounts of data describing billions of particle collisions. In preparation for data collection, efforts are underway to develop a methodology for analyzing these large data sets. One of the primary challenges in GlueX data analysis is isolating events of interest from a proportionally large background. GlueX has recently begun approaching this selection problem using machine learning algorithms, specifically boosted decision trees. Preliminary studies indicate that these algorithms have the potential to offer vast improvements in both signal selection efficiency and purity over more traditional techniques.
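    Boosted decision trees combine many weak tree classifiers, reweighting the events each round so later trees focus on what earlier ones got wrong. The sketch below is a minimal AdaBoost over depth-1 stumps on a toy signal-vs-background problem; it illustrates the algorithm family, not the GlueX analysis code or its actual event features:

```python
import numpy as np

class Stump:
    """Depth-1 decision tree: a single feature threshold with a sign."""
    def fit(self, X, y, w):
        best_err = np.inf
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = np.where(X[:, j] < thr, -sign, sign)
                    err = np.sum(w[pred != y])
                    if err < best_err:
                        best_err, self.j, self.thr, self.sign = err, j, thr, sign
        return best_err

    def predict(self, X):
        return np.where(X[:, self.j] < self.thr, -self.sign, self.sign)

def adaboost_fit(X, y, n_rounds=20):
    """AdaBoost.M1 over decision stumps; labels must be in {-1, +1}."""
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []
    for _ in range(n_rounds):
        stump = Stump()
        err = max(stump.fit(X, y, w), 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * stump.predict(X))   # upweight mistakes
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def adaboost_predict(ensemble, X):
    return np.sign(sum(a * s.predict(X) for a, s in ensemble))

# Toy "signal vs background" separation problem in two feature dimensions.
rng = np.random.default_rng(5)
X = np.vstack([rng.normal(1.0, 1.0, (100, 2)), rng.normal(-1.0, 1.0, (100, 2))])
y = np.array([1] * 100 + [-1] * 100)
model = adaboost_fit(X, y)
accuracy = np.mean(adaboost_predict(model, X) == y)
```

    The weighted vote over many stumps carves out a nonlinear selection boundary, which is the source of the efficiency and purity gains over rectangular cuts.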

  20. Applying Skinner's analysis of verbal behavior to persons with dementia.

    PubMed

    Dixon, Mark; Baker, Jonathan C; Sadowski, Katherine Ann

    2011-03-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may facilitate not only acquisition of language but also the ability to recall items or objects that may have appeared to be "forgotten." The present study examined the utility of having a series of adults in long-term care emit tacts, echoics, or intraverbals upon presentation of various visual stimuli. Compared to a no-verbal response condition, it appears that the incorporation of Skinner's verbal operants can in fact improve recall for this population. Implications for the retraining of lost language are presented.

  1. Geostatistical analysis as applied to two environmental radiometric time series.

    PubMed

    Dowdall, Mark; Lind, Bjørn; Gerland, Sebastian; Rudjord, Anne Liv

    2003-03-01

    This article details the results of an investigation into the application of geostatistical data analysis to two environmental radiometric time series. The data series employed consist of 99Tc values for seaweed (Fucus vesiculosus) and seawater samples taken as part of a marine monitoring program conducted on the coast of northern Norway by the Norwegian Radiation Protection Authority. Geostatistical methods were selected in order to provide information on values of the variables at unsampled times and to investigate the temporal correlation exhibited by the data sets. This information is of use in the optimisation of future sampling schemes and for providing information on the temporal behaviour of the variables in question that may not be obtained during a cursory analysis. The results indicate a high degree of temporal correlation within the data sets, the correlation for the seawater and seaweed data being modelled with an exponential and linear function, respectively. The semi-variogram for the seawater data indicates a temporal range of correlation of approximately 395 days with no apparent random component to the overall variance structure and was described best by an exponential function. The temporal structure of the seaweed data was best modelled by a linear function with a small nugget component. Evidence of drift was present in both semi-variograms. Interpolation of the data sets using the fitted models and a simple kriging procedure were compared, using a cross-validation procedure, with simple linear interpolation. Results of this exercise indicate that, for the seawater data, the kriging procedure outperformed the simple interpolation with respect to error distribution and correlation of estimates with actual values. Using the unbounded linear model with the seaweed data produced estimates that were only marginally better than those produced by the simple interpolation.
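    The central object of such an analysis, the empirical semivariogram, is simple to compute for a regularly sampled series. The sketch below uses a synthetic autocorrelated series standing in for the monitoring data (the real values would come from the sampling program), and shows the rise-then-plateau shape to which the exponential and linear models were fitted:

```python
import numpy as np

def semivariogram(z, max_lag):
    """Empirical semivariogram of a regularly sampled series:
    gamma(h) = mean((z[t+h] - z[t])^2) / 2."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

# Temporally correlated synthetic series (AR(1)) as illustrative data.
rng = np.random.default_rng(11)
z = np.zeros(2000)
for t in range(1, len(z)):
    z[t] = 0.95 * z[t - 1] + rng.standard_normal()

gamma = semivariogram(z, max_lag=50)
# Semivariance rises with lag while correlation persists, then levels off
# near the series variance (the "sill"); the lag where it levels off is
# the temporal range of correlation (about 395 days for the seawater data).
```

    Fitting a model (exponential, linear plus nugget, etc.) to `gamma` is what supplies the weights for the kriging interpolation compared in the article.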

  2. Technical Risk Analysis - Exploiting the Power of MBSE

    DTIC Science & Technology

    2012-11-01

    UNCLASSIFIED DSTO-GD-0734 18. Technical Risk Analysis – Exploiting the Power of MBSE – Despina Tramoundanis1, Wayne Power1 and Daniel Spencer2...Functional Risk Analysis (FRA) conducted within a Model Based Systems Engineering (MBSE) environment. FRA is a rigorous technique used to explore potential...

  3. Risk factor detection for heart disease by applying text analytics in electronic medical records.

    PubMed

    Torii, Manabu; Fan, Jung-Wei; Yang, Wei-Li; Lee, Theodore; Wiley, Matthew T; Zisook, Daniel S; Huang, Yang

    2015-12-01

    In the United States, about 600,000 people die of heart disease every year. The annual cost of care services, medications, and lost productivity reportedly exceeds 108.9 billion dollars. Effective disease risk assessment is critical to prevention, care, and treatment planning. Recent advancements in text analytics have opened up new possibilities of using the rich information in electronic medical records (EMRs) to identify relevant risk factors. The 2014 i2b2/UTHealth Challenge brought together researchers and practitioners of clinical natural language processing (NLP) to tackle the identification of heart disease risk factors reported in EMRs. We participated in this track and developed an NLP system by leveraging existing tools and resources, both public and proprietary. Our system was a hybrid of several machine-learning and rule-based components. The system achieved an overall F1 score of 0.9185, with a recall of 0.9409 and a precision of 0.8972.
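    The overall F1 score reported above is just the harmonic mean of the system's precision and recall, which can be checked directly:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Values reported for the system above.
f1 = f1_score(precision=0.8972, recall=0.9409)  # -> about 0.9185
```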

  4. Applying predictive analytics to develop an intelligent risk detection application for healthcare contexts.

    PubMed

    Moghimi, Fatemeh Hoda; Cheung, Michael; Wickramasinghe, Nilmini

    2013-01-01

    Healthcare is an information-rich industry where successful outcomes require the processing of multi-spectral data and sound decision making. The exponential growth of data and big-data issues, coupled with rapidly increasing service demands in healthcare contexts, requires a robust framework enabled by IT (information technology) solutions as well as real-time service handling in order to ensure superior decision making and successful healthcare outcomes. Such a context is appropriate for the application of real-time intelligent risk detection decision support systems using predictive analytic techniques such as data mining. To illustrate the power and potential of data science technologies in healthcare decision making scenarios, the use of an intelligent risk detection (IRD) model is proffered for the context of Congenital Heart Disease (CHD) in children, an area which requires complex high-risk decisions that need to be made expeditiously and accurately in order to ensure successful healthcare outcomes.

  5. Applying data mining for the analysis of breast cancer data.

    PubMed

    Liou, Der-Ming; Chang, Wei-Pin

    2015-01-01

    Data mining, also known as Knowledge-Discovery in Databases (KDD), is the process of automatically searching large volumes of data for patterns. For instance, a clinical pattern might indicate that a female patient with diabetes or hypertension is at higher risk of stroke within the next five years. A physician can then learn valuable knowledge from the data mining process. Here, we present a study focused on the investigation of the application of artificial intelligence and data mining techniques to the prediction models of breast cancer. The artificial neural network, decision tree, logistic regression, and genetic algorithm were used for the comparative studies and the accuracy and positive predictive value of each algorithm were used as the evaluation indicators. 699 records acquired from the breast cancer patients at the University of Wisconsin, nine predictor variables, and one outcome variable were incorporated for the data analysis followed by the tenfold cross-validation. The results revealed that the accuracies of the logistic regression model were 0.9434 (sensitivity 0.9716 and specificity 0.9482), the decision tree model 0.9434 (sensitivity 0.9615, specificity 0.9105), the neural network model 0.9502 (sensitivity 0.9628, specificity 0.9273), and the genetic algorithm model 0.9878 (sensitivity 1, specificity 0.9802). The accuracy of the genetic algorithm was significantly higher than the average predicted accuracy of 0.9612. The predicted outcome of the logistic regression model was higher than that of the neural network model but no significant difference was observed. The average predicted accuracy of the decision tree model was 0.9435 which was the lowest of all four predictive models. The standard deviation of the tenfold cross-validation was rather unreliable. This study indicated that the genetic algorithm model yielded better results than other data mining models for the analysis of the data of breast cancer patients in terms of the overall accuracy of
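    The tenfold cross-validation procedure used to obtain those accuracies follows a standard pattern: shuffle, split into ten folds, train on nine and test on one, then average. The sketch below shows the mechanics with a deliberately simple nearest-centroid classifier and synthetic data; the study's actual models (neural network, decision tree, logistic regression, genetic algorithm) would slot into `fit`/`predict`:

```python
import numpy as np

def k_fold_indices(n, k, seed=0):
    """Shuffle indices and split them into k near-equal folds."""
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n), k)

def cross_validate(X, y, fit, predict, k=10):
    """Average accuracy over k train/test splits."""
    accs = []
    for test_idx in k_fold_indices(len(y), k):
        train_idx = np.setdiff1d(np.arange(len(y)), test_idx)
        model = fit(X[train_idx], y[train_idx])
        accs.append(np.mean(predict(model, X[test_idx]) == y[test_idx]))
    return float(np.mean(accs))

# Nearest-centroid classifier as a stand-in for the models in the study.
def fit_centroids(X, y):
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict_centroids(model, X):
    classes = list(model)
    d = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    return np.array(classes)[np.argmin(d, axis=0)]

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(2, 1, (100, 5))])
y = np.array([0] * 100 + [1] * 100)
acc = cross_validate(X, y, fit_centroids, predict_centroids)
```

    Reporting the mean (and, more carefully than the study's, the standard deviation) of the per-fold accuracies is what makes the model comparison fair.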

  6. Applied Climate-Change Analysis: The Climate Wizard Tool

    PubMed Central

    Girvetz, Evan H.; Zganjar, Chris; Raber, George T.; Maurer, Edwin P.; Kareiva, Peter; Lawler, Joshua J.

    2009-01-01

    Background Although the message of “global climate change” is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has and is projected to change within specific geographic areas throughout the world. Methodology/Principal Findings To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951–2002 occurred in northern hemisphere countries (especially during January–April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50°N during February-March to 10°N during August-September. Precipitation decreases occurred most commonly in countries between 0–20°N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070–2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. Conclusions/Significance The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally-specific analyses of climate

  7. Applied Drama and the Higher Education Learning Spaces: A Reflective Analysis

    ERIC Educational Resources Information Center

    Moyo, Cletus

    2015-01-01

    This paper explores Applied Drama as a teaching approach in Higher Education learning spaces. The exploration takes a reflective analysis approach by first examining the impact that Applied Drama has had on my career as a Lecturer/Educator/Teacher working in Higher Education environments. My engagement with Applied Drama practice and theory is…

  8. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    ERIC Educational Resources Information Center

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  9. Applying Risk Science and Stakeholder Engagement to Overcome Environmental Barriers to Marine and Hydrokinetic Energy Projects

    SciTech Connect

    Copping, Andrea E.; Anderson, Richard M.; Van Cleve, Frances B.

    2010-09-20

    The production of electricity from the moving waters of the ocean has the potential to be a viable addition to the portfolio of renewable energy sources worldwide. The marine and hydrokinetic (MHK) industry faces many hurdles, including technology development, challenges of offshore deployments, and financing; however, the barrier most commonly identified by industry, regulators, and stakeholders is the uncertainty surrounding potential environmental effects of devices placed in the water and the permitting processes associated with real or potential impacts. Regulatory processes are not well positioned to judge the severity of harm due to turbines or wave generators. Risks from MHK devices to endangered or protected animals in coastal waters and rivers, as well as the habitats that support them, are poorly understood. This uncertainty raises concerns about catastrophic interactions between spinning turbine blades or slack mooring lines and marine mammals, birds and fish. In order to accelerate the deployment of tidal and wave devices, there is a need to sort through the extensive list of potential interactions that may cause harm to marine organisms and ecosystems, to set priorities for regulatory triggers, and to direct future research. Identifying the risk of MHK technology components on specific marine organisms and ecosystem components can separate perceived from real risk-relevant interactions. Scientists from Pacific Northwest National Laboratory (PNNL) are developing an Environmental Risk Evaluation System (ERES) to assess environmental effects associated with MHK technologies and projects through a systematic analytical process, with specific input from key stakeholder groups. The array of stakeholders interested in the development of MHK is broad, segmenting into those whose involvement is essential for the success of the MHK project, those that are influential, and those that are interested. PNNL and their partners have engaged these groups, gaining

  10. Ion Beam Analysis applied to laser-generated plasmas

    NASA Astrophysics Data System (ADS)

    Cutroneo, M.; Macková, A.; Havranek, V.; Malinsky, P.; Torrisi, L.; Kormunda, M.; Barchuk, M.; Ullschmied, J.; Dudzak, R.

    2016-04-01

    This paper presents the research activity on Ion Beam Analysis methods performed at the Tandetron Laboratory (LT) of the Institute of Nuclear Physics AS CR, Rez, Czech Republic. Recently, many groups have been paying attention to implantation by laser-generated plasma. This process makes it possible to insert a controllable amount of energetic ions into the surface layers of different materials, modifying the physical and chemical properties of the surface material. Different substrates are implanted with ions accelerated from plasma generated by a terawatt iodine laser, at a nominal intensity of 10¹⁵ W/cm², at the PALS Research Infrastructure AS CR, in the Czech Republic. This regime of laser-matter interaction generates multi-MeV proton beams and multi-charged ions that are tightly confined in time (hundreds of ps) and space (source radius of a few microns). These ion beams have a much lower transverse temperature, a much shorter duration, and a much higher current than those obtainable from conventional accelerators. Proton and ion acceleration driven by ultra-short high-intensity lasers is demonstrated by adopting suitable irradiation conditions as well as tailored targets. An overview of the implanted targets and their morphological and structural characterizations is presented and discussed.

  11. Applied and computational harmonic analysis on graphs and networks

    NASA Astrophysics Data System (ADS)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
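The frequency interpretation reviewed in this abstract can be made concrete in a few lines of NumPy: for an unweighted path graph, the combinatorial Laplacian's eigenvalues match the analytic DCT-type frequencies. This is a minimal illustrative sketch, not code from the article:

```python
import numpy as np

def graph_laplacian(W):
    """Combinatorial graph Laplacian L = D - W for a symmetric weight matrix W."""
    return np.diag(W.sum(axis=1)) - W

# Unweighted path graph on n vertices: one of the two settings in which
# the Laplacian eigenpairs genuinely behave like Fourier frequencies/modes.
n = 8
W = np.zeros((n, n))
idx = np.arange(n - 1)
W[idx, idx + 1] = W[idx + 1, idx] = 1.0

eigvals, eigvecs = np.linalg.eigh(graph_laplacian(W))  # ascending order

# For a path, the k-th eigenvalue is 4 * sin^2(k * pi / (2n)).
analytic = 4.0 * np.sin(np.arange(n) * np.pi / (2 * n)) ** 2
assert np.allclose(eigvals, analytic)
```

On a weighted or irregular graph the same `eigh` call still runs, but, as the article stresses, the resulting eigenpairs no longer carry this clean frequency interpretation.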

  12. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    NASA Astrophysics Data System (ADS)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On the one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well-distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In such cases, no systematic data are available to improve the understanding of the spatial and temporal occurrence of the process. Since historical documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt changes in the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  13. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, there are three common types of methods used for the development of vulnerability functions of different elements at risk: empirical, analytical, and expert estimation. The paper addresses the empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as the statistical data on building behavior during strong earthquakes presented in the different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to the seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate expected damage states of buildings and constructions in the case of earthquakes according to OSR-97B (return period T=1,000 years), big cities and towns were divided into unit sites whose coordinates were represented as dots located in the centers of the unit sites. Then the indexes obtained for each unit site were summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability for cities and towns with more than 1,000 inhabitants. A hypsometric scale is used to represent both elements on the maps. Taking into account the size of the oil pipeline systems located in the highly active seismic zones in

  14. Applying Social Norms Theory within Affiliation Groups: Promising Interventions for High-Risk Drinking

    ERIC Educational Resources Information Center

    Bruce, Susan; Keller, Adrienne E.

    2007-01-01

    On college campuses across the country, high-risk drinking and the associated negative consequences have become a national concern. As colleges strive to find appropriate and effective approaches to deal with this issue, social norms theory provides a coherent framework for interventions that are relevant and positive. Small Group Social Norms…

  15. Invitational Theory and Practice Applied to Resiliency Development in At-Risk Youth

    ERIC Educational Resources Information Center

    Lee, R. Scott

    2012-01-01

    Resilience development is a growing field of study within the scholarly literature regarding social emotional achievement of at-risk students. Developing resiliency is based on the assumption that positive, pro-social, and/or strength-based values inherent in children and youth should be actively and intentionally developed. The core values of…

  16. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach to risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. Risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D ≥ d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand); 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical methods: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Only two landslide properties are required: the area-extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit-state performance (fragility functions) assessed via damage surveys and the compilation of damage cards. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). 
The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
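The hazard-fragility convolution this abstract describes can be sketched numerically over a discretized severity scale. All numbers below are illustrative assumptions, not values from the study:

```python
import numpy as np

# Illustrative discretization: annual probability mass over landslide
# severity classes (hazard) and the probability that damage exceeds the
# chosen limit-state threshold given each class (fragility).
hazard = np.array([0.90, 0.07, 0.02, 0.01])     # P(S = s), sums to 1
fragility = np.array([0.00, 0.10, 0.45, 0.90])  # P(D >= d | S = s, V)

# Risk = P(D >= d | S, V) = sum over s of P(S = s) * P(D >= d | S = s, V)
risk = float(hazard @ fragility)
print(round(risk, 4))  # 0.025
```

Separate fragility curves for the aesthetic, functional, and structural limit states would simply swap in different `fragility` vectors against the same hazard.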

  17. Beyond Time out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2012-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  18. Beyond Time Out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2010-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  19. Developing scenarios to assess future landslide risks: a model-based approach applied to mountainous regions

    NASA Astrophysics Data System (ADS)

    Vacquie, Laure; Houet, Thomas

    2016-04-01

    In the last century, European mountain landscapes have experienced significant transformations. Natural and anthropogenic changes, climate change, touristic and industrial development, socio-economic interactions, and their implications in terms of LUCC (land use and land cover change) have directly influenced the spatial organization and vulnerability of mountain landscapes. This study is conducted as part of the SAMCO project funded by the French National Research Agency (ANR). It aims at developing a methodological approach, combining various tools, modelling platforms and methods, to identify regions vulnerable to landslide hazards while accounting for future LUCC. It presents an integrated approach combining participatory scenarios and an LUCC simulation model to assess the combined effects of LUCC and climate change on landslide risks in the Cauterets valley (French Pyrenees) up to 2100. Through vulnerability and risk mapping, the objective is to gather information to support landscape planning and to implement land use strategies with local stakeholders for risk management. Four scenarios are developed, exhibiting contrasting trajectories of socio-economic development. The prospective scenarios are based on national and international socio-economic contexts relying on existing assessment reports. The methodological approach integrates knowledge from local stakeholders to refine each scenario during its construction and to reinforce its plausibility and relevance by accounting for local specificities, e.g. logging and pastoral activities, touristic development, urban planning, etc. A process-based model, the Forecasting Scenarios for Mountains (ForeSceM) model, developed on the Dinamica Ego modelling platform, is used to spatially allocate future LUCC for each prospective scenario. Concurrently, a spatial decision support tool, the SYLVACCESS model, is used to identify accessible areas for forestry in scenarios projecting logging

  20. Risk analysis for Arctic offshore operations

    SciTech Connect

    Slomski, S.; Vivatrat, V.

    1986-04-01

    Offshore exploration for hydrocarbons is being conducted in the near-shore regions of the Beaufort Sea. This activity is expected to be intensified and expanded into the deeper portions of the Beaufort, as well as into the Chukchi Sea. The ice conditions in the Beaufort Sea are highly variable, particularly in the deeper-water regions. This variability greatly influences the probability of success or failure of an offshore operation. For example, a summer exploratory program conducted from a floating drilling unit may require a period of 60 to 100 days on station. The success of such a program depends on: (a) the time when the winter ice conditions deteriorate sufficiently for the drilling unit to move on station; (b) the number of summer invasions by the arctic ice pack, forcing the drilling unit to abandon station; (c) the rate at which first-year ice grows to the ice thickness limit of the supporting icebreakers; and (d) the extent of arctic pack expansion during the fall and early winter. In general, the ice conditions are so variable that, even with good planning, the chance of failure of an offshore operation will not be negligible. Contingency planning for such events is therefore necessary. This paper presents a risk analysis procedure which can greatly benefit the planning of an offshore operation. A floating drilling program and a towing and installation operation for a fixed structure are considered to illustrate the procedure.

  1. Probabilistic risk analysis of groundwater remediation strategies

    NASA Astrophysics Data System (ADS)

    Bolster, D.; Barahona, M.; Dentz, M.; Fernandez-Garcia, D.; Sanchez-Vila, X.; Trinchero, P.; Valhondo, C.; Tartakovsky, D. M.

    2009-06-01

    Heterogeneity of subsurface environments and insufficient site characterization are some of the reasons why decisions about groundwater exploitation and remediation have to be made under uncertainty. A typical decision maker chooses between several alternative remediation strategies by balancing their respective costs with the probability of their success or failure. We conduct a probabilistic risk assessment (PRA) to determine the likelihood of the success of a permeable reactive barrier, one of the leading approaches to groundwater remediation. While PRA is used extensively in many engineering fields, its applications in hydrogeology are scarce. This is because rigorous PRA requires one to quantify structural and parametric uncertainties inherent in predictions of subsurface flow and transport. We demonstrate how PRA can facilitate a comprehensive uncertainty quantification for complex subsurface phenomena by identifying key transport processes contributing to a barrier's failure, each of which is amenable to uncertainty analysis. Probability of failure of a remediation strategy is computed by combining independent and conditional probabilities of failure of each process. Individual probabilities can be evaluated either analytically or numerically or, barring both, can be inferred from expert opinion.
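The final step the abstract describes, combining independent and conditional per-process failure probabilities into an overall probability of failure, can be sketched as follows. The mechanisms and numbers are hypothetical, invented for illustration:

```python
# Illustrative per-process failure probabilities for a permeable
# reactive barrier (hypothetical values, not from the study).
p_bypass = 0.05    # P(plume bypasses the barrier entirely)
p_kinetics = 0.10  # P(reaction kinetics too slow | plume intercepted)
p_clogging = 0.02  # P(barrier clogs | plume intercepted)

# Given interception, the barrier fails if either mechanism fails
# (assumed conditionally independent); overall failure combines the
# bypass branch with the intercepted branch.
p_fail_given_intercepted = 1 - (1 - p_kinetics) * (1 - p_clogging)
p_failure = p_bypass + (1 - p_bypass) * p_fail_given_intercepted
print(round(p_failure, 4))  # 0.1621
```

Each input probability would in practice come from an uncertainty analysis of the corresponding transport process, or, barring that, from expert elicitation as the abstract notes.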

  2. Comparing multiple competing interventions in the absence of randomized trials using clinical risk-benefit analysis

    PubMed Central

    2012-01-01

    Background To demonstrate the use of risk-benefit analysis for comparing multiple competing interventions in the absence of randomized trials, we applied this approach to the evaluation of five anticoagulants to prevent thrombosis in patients undergoing orthopedic surgery. Methods Using a cost-effectiveness approach from a clinical perspective (i.e., risk-benefit analysis), we compared thromboprophylaxis with warfarin, low molecular weight heparin, unfractionated heparin, fondaparinux or ximelagatran in patients undergoing major orthopedic surgery, with sub-analyses according to surgery type. Proportions and variances of the events defining risk (major bleeding) and benefit (thrombosis averted) were obtained through a meta-analysis and used to define beta distributions. Monte Carlo simulations were conducted and used to calculate incremental risks, benefits, and risk-benefit ratios. Finally, net clinical benefit was calculated for all replications across a range of risk-benefit acceptability thresholds, with a reference range obtained by estimating the case fatality rate, the ratio of thrombosis to bleeding. Results The analysis showed that, compared to placebo, ximelagatran was superior to the other options, but the final results were influenced by the type of surgery, since ximelagatran was superior in total knee replacement but not in total hip replacement. Conclusions Using simulation and economic techniques, we demonstrate a method that allows the comparison of multiple competing interventions in the absence of randomized trials with multiple arms by determining the option with the best risk-benefit profile. It can be helpful in clinical decision making since it incorporates risk, benefit, and personal risk acceptance. PMID:22233221
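The Monte Carlo machinery described in the Methods can be sketched in NumPy for one pairwise comparison. The event counts and the acceptability threshold below are invented for illustration and are not taken from the meta-analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical pooled counts (events + 1, non-events + 1) parameterizing
# beta distributions for the benefit-defining and risk-defining event rates.
thromb_ctrl = rng.beta(40 + 1, 960 + 1, n)  # thrombosis rate, comparator
thromb_drug = rng.beta(15 + 1, 985 + 1, n)  # thrombosis rate, new drug
bleed_ctrl = rng.beta(5 + 1, 995 + 1, n)    # major bleeding, comparator
bleed_drug = rng.beta(12 + 1, 988 + 1, n)   # major bleeding, new drug

incremental_benefit = thromb_ctrl - thromb_drug  # thromboses averted
incremental_risk = bleed_drug - bleed_ctrl       # extra major bleeds

# Net clinical benefit at an acceptability threshold mu: how many
# thromboses averted one extra bleed is judged to be "worth".
mu = 2.0
net = incremental_benefit - mu * incremental_risk
p_net_positive = float((net > 0).mean())
```

Sweeping `mu` over the reference range derived from the case fatality rate reproduces the acceptability-threshold analysis the paper reports.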

  3. Locating and applying sociological theories of risk-taking to develop public health interventions for adolescents

    PubMed Central

    Pound, Pandora; Campbell, Rona

    2015-01-01

    Sociological theories seldom inform public health interventions at the community level. The reasons for this are unclear but may include difficulties in finding, understanding or operationalising theories. We conducted a study to explore the feasibility of locating sociological theories within a specific field of public health, adolescent risk-taking, and to consider their potential for practical application. We identified a range of sociological theories. These explained risk-taking: (i) as being due to lack of social integration; (ii) as a consequence of isolation from mainstream society; (iii) as a rite of passage; (iv) as a response to social constraints; (v) as resistance; (vi) as an aspect of adolescent development; (vii) by the theory of the ‘habitus’; (viii) by situated rationality and social action theories; and (ix) as social practice. We consider these theories in terms of their potential to inform public health interventions for young people. PMID:25999784

  4. Risk analysis of colorectal cancer incidence by gene expression analysis

    PubMed Central

    Shangkuan, Wei-Chuan; Lin, Hung-Che; Chang, Yu-Tien; Jian, Chen-En; Fan, Hueng-Chuen; Chen, Kang-Hua; Liu, Ya-Fang; Hsu, Huan-Ming; Chou, Hsiu-Ling; Yao, Chung-Tay

    2017-01-01

    Background Colorectal cancer (CRC) is one of the leading cancers worldwide. Several studies have performed microarray data analyses for cancer classification and prognostic analyses. Microarray assays also enable the identification of gene signatures for molecular characterization and treatment prediction. Objective Microarray gene expression data from the online Gene Expression Omnibus (GEO) database were used to distinguish colorectal cancer from normal colon tissue samples. Methods We collected microarray data from the GEO database to establish colorectal cancer microarray gene expression datasets for a combined analysis. Using the Prediction Analysis for Microarrays (PAM) method and the GSEA MSigDB resource, we analyzed the 14,698 genes that were identified through an examination of their expression values between normal and tumor tissues. Results Ten genes (ABCG2, AQP8, SPIB, CA7, CLDN8, SCNN1B, SLC30A10, CD177, PADI2, and TGFBI) were found to be good candidate genes that correlate with CRC. From these selected genes, an average of six significant genes were obtained using the PAM method, with an accuracy rate of 95%. The results demonstrate the potential of utilizing a model with the PAM method for data mining. After a detailed review of the published reports, the results confirmed that the screened candidate genes are good indicators for cancer risk analysis using the PAM method. Conclusions Six genes were selected with 95% accuracy to effectively classify normal and colorectal cancer tissues. We hope that these results will provide the basis for new research projects in clinical practice that aim to rapidly assess colorectal cancer risk using microarray gene expression analysis. PMID:28229027
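The nearest-shrunken-centroid idea behind PAM can be sketched on synthetic data: shrinking class centroids toward the overall centroid zeroes out uninformative genes, leaving a small signature used for classification. The data, shrinkage amount, and omitted per-gene standardization make this a simplified illustration, not the published pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for expression data: 120 samples x 40 genes, where
# only the first 4 genes actually separate "tumor" (1) from "normal" (0).
n, p = 120, 40
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, p))
X[:, :4] += 2.0 * y[:, None]

# Shrink each class centroid toward the overall centroid; genes whose
# shrunken centroids collapse onto the overall mean drop out.
overall = X.mean(axis=0)
delta = 0.8  # shrinkage amount (illustrative)
centroids = {}
for k in (0, 1):
    diff = X[y == k].mean(axis=0) - overall
    centroids[k] = overall + np.sign(diff) * np.maximum(np.abs(diff) - delta, 0.0)

# Genes surviving shrinkage in either class form the gene signature.
signature = np.flatnonzero((centroids[0] != overall) | (centroids[1] != overall))

# Classify each sample by its nearest shrunken centroid.
pred = np.array([min((0, 1), key=lambda k: np.sum((x - centroids[k]) ** 2))
                 for x in X])
accuracy = float((pred == y).mean())
```

With this setup the surviving signature is concentrated in the informative genes, mirroring how the study's six-gene panel emerges from 14,698 candidates.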

  5. Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids (Final Report)

    EPA Science Inventory

    Millions of tons of treated sewage sludges or “biosolids” are applied annually to farms, forests, rangelands, mine lands and other types of land in the United States. Biosolids are defined by the U.S. Environmental Protection Agency (EPA) as “the primarily organic solid product ...

  6. The Significance of Consequence Assessment Applied to the Risk-Based Approach of Homeland Security

    DTIC Science & Technology

    2008-03-01

    ...Office of Comparative Studies in DHS' Science and Technology Directorate argues that risk, no matter how well founded, is in reality a mental and... cinema complexes, office buildings, airport arrival halls, and train stations. There are a plethora of these venues across the country, which, given

  7. Definition and GIS-based characterization of an integral risk index applied to a chemical/petrochemical area.

    PubMed

    Nadal, Martí; Kumar, Vikas; Schuhmacher, Marta; Domingo, José L

    2006-08-01

    A risk map of the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain) was designed following a two-stage procedure. The first step was the creation of a ranking system (Hazard Index) for a number of different inorganic and organic pollutants: heavy metals, polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs), polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs), by applying self-organizing maps (SOM) to the persistence, bioaccumulation and toxicity properties of the chemicals. PCBs seemed to be the most hazardous compounds, while the light PAHs showed the minimum values. Subsequently, an Integral Risk Index was developed taking into account the Hazard Index and the concentrations of all pollutants in soil samples collected in the assessed area of Tarragona. Finally, a risk map was elaborated by representing the spatial distribution of the Integral Risk Index with a geographic information system (GIS). The results of the present study seem to indicate that the development of an integral risk map might be useful to support decision-making processes concerning environmental pollutants.
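The two-stage index can be sketched as a hazard-weighted, normalized sum over pollutants at each sampling site. All hazard values, reference levels, and concentrations below are invented for illustration; in the paper the Hazard Index comes from the SOM analysis:

```python
# Per-pollutant Hazard Index (illustrative 0-1 values standing in for
# the SOM-derived ranking) and arbitrary reference soil concentrations
# used to put the pollutants on a common scale. All numbers invented.
hazard_index = {"PCBs": 0.9, "PCDD/Fs": 0.8, "heavy metals": 0.6,
                "light PAHs": 0.2}
reference = {"PCBs": 10.0, "PCDD/Fs": 5.0, "heavy metals": 100.0,
             "light PAHs": 50.0}

def integral_risk(concentrations):
    """Sum of hazard-weighted, reference-normalized soil concentrations."""
    return sum(hazard_index[p] * concentrations[p] / reference[p]
               for p in concentrations)

# One hypothetical sampling site; mapping this value over a grid of
# sites gives the spatial layer represented in the GIS.
site = {"PCBs": 2.0, "PCDD/Fs": 1.0, "heavy metals": 80.0,
        "light PAHs": 40.0}
print(round(integral_risk(site), 3))  # 0.98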

  8. Loss Exposure and Risk Analysis Methodology (LERAM) Project Database Design.

    DTIC Science & Technology

    1996-06-01

    MISREPS) to more capably support system safety engineering concepts such as hazard analysis and risk management. As part of the Loss Exposure and Risk ... Analysis Methodology (LERAM) project, the research into the methods which we employ to report, track, and analyze hazards has resulted in a series of low

  9. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not... 49 Transportation 4 2014-10-01 2014-10-01 false Credit risk premium analysis. 260.17 Section 260..., based on Applicant's: (A) Industry outlook; (B) Market position; (C) Management and financial...

  10. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not... 49 Transportation 4 2013-10-01 2013-10-01 false Credit risk premium analysis. 260.17 Section 260..., based on Applicant's: (A) Industry outlook; (B) Market position; (C) Management and financial...

  11. Virtues and Limitations of Risk Analysis

    ERIC Educational Resources Information Center

    Weatherwax, Robert K.

    1975-01-01

    After summarizing the Rasmussen Report, the author reviews the probabilistic portion of the report from the perspectives of engineering utility and risk assessment uncertainty. The author shows that the report may represent both a significant step forward in the assurance of reactor safety and an imperfect measure of actual reactor risk. (BT)

  12. Risk-Based Explosive Safety Analysis

    DTIC Science & Technology

    2016-11-30

    safety siting of energetic liquids and propellants can be greatly aided by the use of risk-based methodologies. The low probability of exposed personnel and the

  13. Risk analysis for worker exposure to benzene

    NASA Astrophysics Data System (ADS)

    Hallenbeck, William H.; Flowers, Roxanne E.

    1992-05-01

    Cancer risk factors (characterized by route, dose, dose rate per kilogram, fraction of lifetime exposed, species, and sex) were derived for workers exposed to benzene via inhalation or ingestion. Exposure at the current Occupational Safety and Health Administration (OSHA) permissible exposure limit (PEL) and at leaking underground storage tank (LUST) sites was evaluated. At the current PEL of 1 ppm, the theoretical lifetime excess risk of cancer from benzene inhalation is ten per 1000. The theoretical lifetime excess risk for worker inhalation exposure at LUST sites ranged from 10 to 40 per 1000. These results indicate that personal protection should be required. The theoretical lifetime excess risk due to soil ingestion is five to seven orders of magnitude less than the inhalation risks.
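The scaling implicit in the abstract's numbers can be sketched with a linear risk factor. The unit-risk constant below is back-calculated so that 1 ppm over a working lifetime yields 10 excess cancers per 1000 workers; it is illustrative, not a value taken from the paper:

```python
# Illustrative linear scaling: excess lifetime risk proportional to air
# concentration and to the fraction of the working lifetime exposed.
RISK_PER_PPM = 0.010  # excess lifetime risk per ppm inhaled (assumed)

def excess_risk(ppm, exposure_fraction=1.0):
    """Theoretical lifetime excess cancer risk for a given benzene
    concentration and fraction of the working lifetime exposed."""
    return RISK_PER_PPM * ppm * exposure_fraction

print(excess_risk(1.0))  # PEL scenario: 0.01, i.e. 10 per 1000
print(excess_risk(4.0))  # upper LUST scenario: 0.04, i.e. 40 per 1000
```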

  14. Risk Management in Coastal Engineering - Applied Research Projects for the German Wadden Sea

    NASA Astrophysics Data System (ADS)

    Woeffler, T.; Grimm, C.; Bachmann, D.; Jensen, J.; Mudersbach, C.; Froehle, P.; Thorenz, F.; Schuettrumpf, H.

    2012-04-01

    Several islands in the North Frisian part of the UNESCO World Natural Heritage Wadden Sea are exposed to extreme storm surges due to climate change and sea level rise. Existing coastal protection measures in this area do not consider the future sea state and are mainly based on tradition and expert knowledge. The two projects HoRisK and ZukunftHallig (supported by the German Coastal Engineering Research Council) focus on this area and implement the requirements defined in Directive 2007/60/EC on the assessment and management of flood risk. The main objectives of the projects are the design and evaluation of new coastal protection techniques for the investigation area. Hydrological parameters are investigated with numerical simulations in order to design new coastal protection and management strategies. The decision support system PROMAIDES (Protection Measure against Inundation Decision Support), developed at the Institute of Hydraulic Engineering and Water Resources Management of RWTH Aachen University, analyzes the effects and reliability of new coastal protection techniques and evaluates inundation areas and economic damages for different hydrological boundary conditions. As a result, flood risk and hazard maps are shown in this work. Furthermore, sensitivity analyses expose possible variations in future storm surges and illustrate the differences in significant wave heights for varying wind climates. The risk-based approach of both projects is a suitable way to ensure life for future generations on these islands under sustainable ecological and economic conditions. Acknowledgments This work was supported by the KFKI (German Coastal Engineering Research Council) and the German Federal Ministry of Education and Research (BMBF) (Project No. 03KIS094 and 03KIS078)

  15. Viral metagenomics applied to blood donors and recipients at high risk for blood-borne infections

    PubMed Central

    Sauvage, Virginie; Laperche, Syria; Cheval, Justine; Muth, Erika; Dubois, Myriam; Boizeau, Laure; Hébert, Charles; Lionnet, François; Lefrère, Jean-Jacques; Eloit, Marc

    2016-01-01

    Background Characterisation of human-associated viral communities is essential for epidemiological surveillance and to anticipate new potential threats to blood transfusion safety. In high-resource countries, the risk of blood-borne transmission of well-known viruses (HBV, HCV, HIV and HTLV) is currently considered to be under control. However, other unknown or unsuspected viruses may be transmitted to recipients by blood-derived products. To investigate this, the virome of plasma from individuals at high risk for parenterally and sexually transmitted infections was analysed by high-throughput sequencing (HTS). Materials and methods Purified nucleic acids from two pools of 50 samples from recipients of multiple transfusions, and three pools containing seven plasma samples from either HBV-, HCV- or HIV-infected blood donors, were submitted to HTS. Results Sequences from resident anelloviruses and HPgV were evidenced in all pools. HBV and HCV sequences were detected in pools containing 3.8×10³ IU/mL of HBV-DNA and 1.7×10⁵ IU/mL of HCV-RNA, respectively, whereas no HIV sequence was found in a pool with 150 copies/mL of HIV-RNA. This suggests a lack of sensitivity of HTS in detecting low levels of virus. In addition, this study identified other issues, including laboratory contaminants and the uncertainty of taxonomic assignment of short sequences. No sequence suggestive of a new viral species was identified. Discussion This study did not identify any new blood-borne virus in high-risk individuals. However, rare viruses and/or viruses present at very low titre could have escaped our protocol. Our results demonstrate the positive contribution of HTS to the detection of viral sequences in blood donations. PMID:27136432

  16. Inclusive Elementary Classroom Teacher Knowledge of and Attitudes toward Applied Behavior Analysis and Autism Spectrum Disorder and Their Use of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    McCormick, Jennifer A.

    2011-01-01

    The purpose of this study was to examine inclusive elementary teacher knowledge and attitude toward Autism Spectrum Disorder (ASD) and applied behavior analysis (ABA) and their use of ABA. Furthermore, this study examined if knowledge and attitude predicted use of ABA. A survey was developed and administered through a web-based program. Of the…

  17. Pitfalls in Pathways: Some Perspectives on Competing Risks Event History Analysis in Education Research

    ERIC Educational Resources Information Center

    Scott, Marc A.; Kennedy, Benjamin B.

    2005-01-01

    A set of discrete-time methods for competing risks event history analysis is presented. The approach used is accessible to the practitioner and the article describes the strengths, weaknesses, and interpretation of both exploratory and model-based tools. These techniques are applied to the impact of "nontraditional" enrollment features (working,…

  18. Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.

    PubMed

    Xin, Cao; Chongshi, Gu

    2016-01-01

Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. A stability failure risk ratio described jointly by probability and possibility has deficiencies in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. Stability of a gravity dam is viewed as a hybrid event that accounts for both the fuzziness and the randomness of the failure criterion, design parameters, and measured data. The credibility distribution function is introduced as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to risk calculation for both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing the stability failure risk of gravity dams. The resulting risk assessment reflects the influence of both sorts of uncertainty and is suitable as an index value.
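As a rough illustration of the Monte Carlo layer of such an analysis, the sketch below estimates a sliding-failure risk for a simplified dam section. All distributions, load values, and the safety-factor formula are invented for illustration; the credibility-theoretic treatment of fuzzy variables described in the abstract is not reproduced here.

```python
import random

def safety_factor(friction, cohesion_mpa, contact_area_m2, h_load, v_load):
    """Sliding safety factor: resisting forces over horizontal driving load."""
    resisting = friction * v_load + cohesion_mpa * contact_area_m2
    return resisting / h_load

def estimate_failure_risk(n_trials=100_000, seed=1):
    """Crude Monte Carlo estimate of P(safety factor < 1)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        # Illustrative (hypothetical) distributions for the random inputs
        friction = rng.gauss(0.70, 0.07)   # base friction coefficient
        cohesion = rng.gauss(0.50, 0.10)   # MPa
        h_load = rng.gauss(60.0, 6.0)      # MN, horizontal water load
        v_load = rng.gauss(110.0, 5.0)     # MN, effective self-weight
        if safety_factor(friction, cohesion, 30.0, h_load, v_load) < 1.0:
            failures += 1
    return failures / n_trials

print(f"estimated sliding failure risk: {estimate_failure_risk():.4f}")
```

In the full method of the paper, a fuzzy input such as the friction coefficient would be sampled via its credibility distribution rather than a purely probabilistic one.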

  19. A risk analysis model of the relationship between beverage consumption from school vending machines and risk of adolescent overweight.

    PubMed

    Forshee, Richard A; Storey, Maureen L; Ginevan, Michael E

    2005-10-01

    Risk analysis is a widely used tool to understand problems in food safety policy, but it is seldom applied to nutrition policy. We propose that risk analysis be applied more often to inform debates on nutrition policy, and we conduct a risk assessment of the relationship of regular carbonated soft drink (RCSD) consumption in schools and body mass index (BMI) as a case study. Data for RCSD consumption in schools were drawn from three data sets: the Continuing Survey of Food Intake by Individuals 1994-1996, 1998 (CSFII), the National Health and Nutrition Examination Survey 1999-2000 (NHANES), and the National Family Opinion (NFO) WorldGroup Share of Intake Panel (SIP) study. We used the largest relationship between RCSD and BMI that was published by prospective observational studies to characterize the maximum plausible relationship in our study. Consumption of RCSD in schools was low in all three data sets, ranging from 15 g/day in NFO-SIP to 60 g/day in NHANES. There was no relationship between RCSD consumption from all sources and BMI in either the CSFII or the NHANES data. The risk assessment showed no impact on BMI by removing RCSD consumption in school. These findings suggest that focusing adolescent overweight prevention programs on RCSD in schools will not have a significant impact on BMI.

  20. Extended risk-analysis model for activities of the project.

    PubMed

    Kušar, Janez; Rihar, Lidija; Zargi, Urban; Starbek, Marko

    2013-12-01

Project management of product/service orders has become a mode of operation in many companies. Although these are mostly cyclically recurring projects, risk management is very important for them. An extended risk-analysis model for new product/service projects is presented in this paper. Emphasis is on a solution developed at the Faculty of Mechanical Engineering in Ljubljana, Slovenia. The usual risk analysis of project activities is based on evaluating the probability that risk events occur and their consequences. A third parameter has been added in our model: an estimate of the incidence of risk events. On the basis of the calculated activity risk level, a project team prepares preventive and corrective measures to be taken according to the status indicators. An important advantage of the proposed solution is that the project manager and the team members are warned of risk events in time, and can thus activate the envisaged preventive and corrective measures as necessary.
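The three-parameter idea can be sketched as follows. The 1-5 scales, the multiplicative aggregation, and the triage threshold are assumptions made for illustration (analogous to an FMEA risk priority number), not the scoring rules from the paper.

```python
from dataclasses import dataclass

@dataclass
class ActivityRisk:
    name: str
    probability: int   # 1..5, chance the risk event occurs
    consequence: int   # 1..5, severity if it occurs
    incidence: int     # 1..5, how often the event can recur

    @property
    def level(self) -> int:
        # Multiplicative score over the three parameters
        return self.probability * self.consequence * self.incidence

def triage(activities, threshold=27):
    """Flag activities whose risk level calls for preventive/corrective
    measures, highest level first."""
    return sorted((a for a in activities if a.level >= threshold),
                  key=lambda a: a.level, reverse=True)

risks = [ActivityRisk("supplier delay", 4, 3, 3),
         ActivityRisk("design rework", 2, 5, 2),
         ActivityRisk("tooling failure", 3, 4, 3)]
for a in triage(risks):
    print(a.name, a.level)
```

The status-indicator logic of the paper would then attach prepared measures to each flagged activity.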

  1. EC Transmission Line Risk Identification and Analysis

    SciTech Connect

    Bigelow, Tim S

    2012-04-01

    The purpose of this document is to assist in evaluating and planning for the cost, schedule, and technical project risks associated with the delivery and operation of the EC (Electron cyclotron) transmission line system. In general, the major risks that are anticipated to be encountered during the project delivery phase associated with the implementation of the Procurement Arrangement for the EC transmission line system are associated with: (1) Undefined or changing requirements (e.g., functional or regulatory requirements) (2) Underperformance of prototype, first unit, or production components during testing (3) Unavailability of qualified vendors for critical components Technical risks associated with the design and operation of the system are also identified.

  2. A flexible count data regression model for risk analysis.

    PubMed

Guikema, Seth D; Coffelt, Jeremy P

    2008-02-01

In many cases, risk and reliability analyses involve estimating the probabilities of discrete events such as hardware failures and occurrences of disease or death. There is often additional information in the form of explanatory variables that can be used to help estimate the likelihood of different numbers of events in the future through the use of an appropriate regression model, such as a generalized linear model. However, existing generalized linear models (GLMs) are limited in their ability to handle the types of variance structures often encountered in using count data in risk and reliability analysis. In particular, standard models cannot handle both underdispersed data (variance less than the mean) and overdispersed data (variance greater than the mean) in a single coherent modeling framework. This article presents a new GLM based on a reformulation of the Conway-Maxwell Poisson (COM) distribution that is useful for both underdispersed and overdispersed count data and demonstrates this model by applying it to the assessment of electric power system reliability. The results show that the proposed COM GLM can fit data as well as the commonly used existing models for overdispersed data sets, while outperforming these models for underdispersed data sets.
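A minimal, numerically stable implementation of the COM distribution underlying such a model might look like the sketch below (computed in log space to avoid overflow in the normalizing constant). The paper's specific GLM reformulation and link functions are not reproduced; this only shows how the dispersion parameter ν moves the variance relative to the mean.

```python
import math

def com_poisson_pmf(y, lam, nu, max_terms=120):
    """COM-Poisson pmf: P(Y=y) proportional to lam**y / (y!)**nu.
    nu = 1 recovers the Poisson; nu > 1 gives underdispersion
    (variance < mean); nu < 1 gives overdispersion (variance > mean)."""
    # Log-weights of the (truncated) normalizing series, summed stably
    logw = [j * math.log(lam) - nu * math.lgamma(j + 1)
            for j in range(max_terms)]
    peak = max(logw)
    log_z = peak + math.log(sum(math.exp(w - peak) for w in logw))
    return math.exp(y * math.log(lam) - nu * math.lgamma(y + 1) - log_z)

def mean_and_variance(lam, nu, max_terms=120):
    """Moments by direct summation over the truncated support."""
    p = [com_poisson_pmf(y, lam, nu, max_terms) for y in range(max_terms)]
    mean = sum(y * py for y, py in enumerate(p))
    var = sum((y - mean) ** 2 * py for y, py in enumerate(p))
    return mean, var
```

Setting nu above or below 1 lets a single family cover both dispersion regimes, which is the property the article exploits.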

  3. Capability for Integrated Systems Risk-Reduction Analysis

    NASA Technical Reports Server (NTRS)

    Mindock, J.; Lumpkins, S.; Shelhamer, M.

    2016-01-01

    NASA's Human Research Program (HRP) is working to increase the likelihoods of human health and performance success during long-duration missions, and subsequent crew long-term health. To achieve these goals, there is a need to develop an integrated understanding of how the complex human physiological-socio-technical mission system behaves in spaceflight. This understanding will allow HRP to provide cross-disciplinary spaceflight countermeasures while minimizing resources such as mass, power, and volume. This understanding will also allow development of tools to assess the state of and enhance the resilience of individual crewmembers, teams, and the integrated mission system. We will discuss a set of risk-reduction questions that has been identified to guide the systems approach necessary to meet these needs. In addition, a framework of factors influencing human health and performance in space, called the Contributing Factor Map (CFM), is being applied as the backbone for incorporating information addressing these questions from sources throughout HRP. Using the common language of the CFM, information from sources such as the Human System Risk Board summaries, Integrated Research Plan, and HRP-funded publications has been combined and visualized in ways that allow insight into cross-disciplinary interconnections in a systematic, standardized fashion. We will show examples of these visualizations. We will also discuss applications of the resulting analysis capability that can inform science portfolio decisions, such as areas in which cross-disciplinary solicitations or countermeasure development will potentially be fruitful.

  4. An analysis of the new EPA risk management rule

    SciTech Connect

    Loran, B.; Nand, K.; Male, M.

    1997-08-01

    Due to increasing public concern of risks from handling highly hazardous chemicals at various facilities, a number of state and federal regulatory agencies, such as the Occupational Safety and Health Administration (OSHA) and recently the US Environmental Protection Agency (EPA), have enacted regulations requiring these facilities to perform accidental risk analysis and develop process safety and risk management programs. The regulatory requirements to be fulfilled are described; the major components involved are a Process Hazard Analysis, a Consequence Analysis, and a Management Program. The performance of these analyses and the development of a management program for 21 facilities operated by the City of Los Angeles Department of Water and Power, treating drinking water supplies with chlorine, is discussed. The effectiveness of the EPA risk management rule in achieving risk reduction is critically analyzed; it is found that, while the rule increases the worker and public awareness of the inherent risks present, some of the analytical results obtained may have a limited practical application.

  5. Fire behavior and risk analysis in spacecraft

    NASA Technical Reports Server (NTRS)

    Friedman, Robert; Sacksteder, Kurt R.

    1988-01-01

    Practical risk management for present and future spacecraft, including space stations, involves the optimization of residual risks balanced by the spacecraft operational, technological, and economic limitations. Spacecraft fire safety is approached through three strategies, in order of risk: (1) control of fire-causing elements, through exclusion of flammable materials for example; (2) response to incipient fires through detection and alarm; and (3) recovery of normal conditions through extinguishment and cleanup. Present understanding of combustion in low gravity is that, compared to normal gravity behavior, fire hazards may be reduced by the absence of buoyant gas flows yet at the same time increased by ventilation flows and hot particle expulsion. This paper discusses the application of low-gravity combustion knowledge and appropriate aircraft analogies to fire detection, fire fighting, and fire-safety decisions for eventual fire-risk management and optimization in spacecraft.

  6. Risk analysis of an RTG on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Frank, Michael V.

    1991-01-01

As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermoelectric Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties in the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with the calculated launch and deployment accident scenarios is low.

  7. Applying Transactional Analysis and Personality Assessment to Improve Patient Counseling and Communication Skills

    PubMed Central

    Lawrence, Lesa

    2007-01-01

    Objective To teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling to improve communication. Design A lecture series for a required pharmacy communications class was developed to teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling. Students were asked to apply these techniques and to report their experiences. A personality self-assessment was also conducted. Assessment After attending the lecture series, students were able to apply the techniques and demonstrated an understanding of the psychological factors that may affect patient communication, an appreciation for the diversity created by different personality types, the ability to engage patients based on adult-to-adult interaction cues, and the ability to adapt the interactive patient counseling model to different personality traits. Conclusion Students gained a greater awareness of transactional analysis and personality assessment by applying these concepts. This understanding will help students communicate more effectively with patients. PMID:17786269

  8. Environmental risk assessment of replication competent viral vectors applied in clinical trials: potential effects of inserted sequences.

    PubMed

    van den Akker, Eric; van der Vlugt, Cecile J B; Bleijs, Diederik A; Bergmans, Hans E

    2013-12-01

    Risk assessments of clinical applications involving genetically modified viral vectors are carried out according to general principles that are implemented in many national and regional legislations, e.g., in Directive 2001/18/EC of the European Union. Recent developments in vector design have a large impact on the concepts that underpin the risk assessments of viral vectors that are used in clinical trials. The use of (conditionally) replication competent viral vectors (RCVVs) may increase the likelihood of the exposure of the environment around the patient, compared to replication defective viral vectors. Based on this assumption we have developed a methodology for the environmental risk assessment of replication competent viral vectors, which is presented in this review. Furthermore, the increased likelihood of exposure leads to a reevaluation of what would constitute a hazardous gene product in viral vector therapies, and a keen interest in new developments in the inserts used. One of the trends is the use of inserts produced by synthetic biology. In this review the implications of these developments for the environmental risk assessment of RCVVs are highlighted, with examples from current clinical trials. The conclusion is drawn that RCVVs, notwithstanding their replication competency, can be applied in an environmentally safe way, in particular if adequate built-in safeties are incorporated, like conditional replication competency, as mitigating factors to reduce adverse environmental effects that could occur.

  9. Applying a sociolinguistic model to the analysis of informed consent documents.

    PubMed

    Granero-Molina, José; Fernández-Sola, Cayetano; Aguilera-Manrique, Gabriel

    2009-11-01

    Information on the risks and benefits related to surgical procedures is essential for patients in order to obtain their informed consent. Some disciplines, such as sociolinguistics, offer insights that are helpful for patient-professional communication in both written and oral consent. Communication difficulties become more acute when patients make decisions through an informed consent document because they may sign this with a lack of understanding and information, and consequently feel deprived of their freedom to make their choice about different treatments or surgery. This article discusses findings from documentary analysis using the sociolinguistic SPEAKING model, which was applied to the general and specific informed consent documents required for laparoscopic surgery of the bile duct at Torrecárdenas Hospital, Almería, Spain. The objective of this procedure was to identify flaws when information was provided, together with its readability, its voluntary basis, and patients' consent. The results suggest potential linguistic communication difficulties, different languages being used, cultural clashes, asymmetry of communication between professionals and patients, assignment of rights on the part of patients, and overprotection of professionals and institutions.

  10. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Hanson, J. M.; Beard, B. B.

    2010-01-01

    This Technical Publication (TP) is meant to address a number of topics related to the application of Monte Carlo simulation to launch vehicle design and requirements analysis. Although the focus is on a launch vehicle application, the methods may be applied to other complex systems as well. The TP is organized so that all the important topics are covered in the main text, and detailed derivations are in the appendices. The TP first introduces Monte Carlo simulation and the major topics to be discussed, including discussion of the input distributions for Monte Carlo runs, testing the simulation, how many runs are necessary for verification of requirements, what to do if results are desired for events that happen only rarely, and postprocessing, including analyzing any failed runs, examples of useful output products, and statistical information for generating desired results from the output data. Topics in the appendices include some tables for requirements verification, derivation of the number of runs required and generation of output probabilistic data with consumer risk included, derivation of launch vehicle models to include possible variations of assembled vehicles, minimization of a consumable to achieve a two-dimensional statistical result, recontact probability during staging, ensuring duplicated Monte Carlo random variations, and importance sampling.
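One of the topics listed, the number of runs required to verify a requirement with consumer risk included, can be sketched with a standard binomial success-run calculation. This is a generic textbook rule under stated assumptions (demonstrating a success probability at a given confidence from pass/fail runs), not necessarily the TP's exact derivation.

```python
import math

def runs_required(p_success, confidence, failures_allowed=0):
    """Smallest n such that observing <= failures_allowed failures in n
    Monte Carlo runs demonstrates P(success) >= p_success at the given
    confidence level (one-sided binomial tail test)."""
    beta = 1.0 - confidence  # consumer risk
    if failures_allowed == 0:
        # Zero-failure case has the closed form n >= ln(beta)/ln(p)
        return math.ceil(math.log(beta) / math.log(p_success))
    n = failures_allowed + 1
    while True:
        # P(X <= failures_allowed) if the true success prob is p_success
        tail = sum(math.comb(n, k) * (1 - p_success) ** k
                   * p_success ** (n - k)
                   for k in range(failures_allowed + 1))
        if tail <= beta:
            return n
        n += 1

# e.g. demonstrating a 3-sigma (99.73%) requirement at 90% confidence
print(runs_required(0.9973, 0.90))
```

Allowing one or more failures raises the required run count, which is why failed-run analysis (also covered in the TP) matters.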

  11. Applying Ecodevelopmental Theory and the Theory of Reasoned Action to Understand HIV Risk Behaviors Among Hispanic Adolescents.

    PubMed

    Ortega, Johis; Huang, Shi; Prado, Guillermo

    2012-01-03

HIV/AIDS is listed as one of the top 10 causes of death for Hispanics between the ages of 15 and 54 in the United States. This cross-sectional, descriptive secondary study proposed that using both the systemic (ecodevelopmental) and the individually focused (theory of reasoned action) theories together would lead to an increased understanding of the risk and protective factors that influence HIV risk behaviors in this population. The sample consisted of 493 Hispanic adolescent 7th and 8th graders and their immigrant parents living in Miami, Florida. Structural Equation Modeling (SEM) was used for the data analysis. Family functioning emerged as the heart of the model, embedded within a web of direct and mediated relationships. The data support the idea that family can play a central role in the prevention of Hispanic adolescents' risk behaviors.

  12. Applying Ecodevelopmental Theory and the Theory of Reasoned Action to Understand HIV Risk Behaviors Among Hispanic Adolescents

    PubMed Central

    Ortega, Johis; Huang, Shi; Prado, Guillermo

    2012-01-01

HIV/AIDS is listed as one of the top 10 causes of death for Hispanics between the ages of 15 and 54 in the United States. This cross-sectional, descriptive secondary study proposed that using both the systemic (ecodevelopmental) and the individually focused (theory of reasoned action) theories together would lead to an increased understanding of the risk and protective factors that influence HIV risk behaviors in this population. The sample consisted of 493 Hispanic adolescent 7th and 8th graders and their immigrant parents living in Miami, Florida. Structural Equation Modeling (SEM) was used for the data analysis. Family functioning emerged as the heart of the model, embedded within a web of direct and mediated relationships. The data support the idea that family can play a central role in the prevention of Hispanic adolescents’ risk behaviors. PMID:23152718

  13. Evaluation of residue risk and toxicity of different treatments with diazinon insecticide applied to mushroom crops.

    PubMed

    Navarro, María J; Merino, Llanos; Gea, Francisco J

    2017-03-04

This work describes the phytotoxic effect of different doses of diazinon and different application times on Agaricus bisporus mycelium, and determines the residue levels in mushrooms from the first three flushes. Mushroom cultivation is a widespread commercial activity throughout the world. The application of the insecticide diazinon to the compost or casing layer is a common practice to control two mushroom pests, the phorid Megaselia halterata and the sciarid Lycoriella auripila. Application to the compost did not result in any appreciable fall in yield or quality, and did not delay the harvest time. In contrast, application to the casing led to a slight fall (6.2%) in production and a smaller number of mushrooms, although they were larger in size. Residue levels of more than 0.01 ppm were detected in many of the samples analyzed, raising the question of whether the product should continue to be used in mushroom cultivation under the conditions in which it is currently applied.

  14. Comparing risk in conventional and organic dairy farming in the Netherlands: an empirical analysis.

    PubMed

    Berentsen, P B M; Kovacs, K; van Asseldonk, M A P M

    2012-07-01

    This study was undertaken to contribute to the understanding of why most dairy farmers do not convert to organic farming. Therefore, the objective of this research was to assess and compare risks for conventional and organic farming in the Netherlands with respect to gross margin and the underlying price and production variables. To investigate the risk factors a farm accountancy database was used containing panel data from both conventional and organic representative Dutch dairy farms (2001-2007). Variables with regard to price and production risk were identified using a gross margin analysis scheme. Price risk variables were milk price and concentrate price. The main production risk variables were milk yield per cow, roughage yield per hectare, and veterinary costs per cow. To assess risk, an error component implicit detrending method was applied and the resulting detrended standard deviations were compared between conventional and organic farms. Results indicate that the risk included in the gross margin per cow is significantly higher in organic farming. This is caused by both higher price and production risks. Price risks are significantly higher in organic farming for both milk price and concentrate price. With regard to production risk, only milk yield per cow poses a significantly higher risk in organic farming.
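The core comparison can be sketched as below, using plain linear detrending as a stand-in for the paper's error component implicit detrending method; the milk-price series are invented for illustration. The idea is that a systematic trend should not be counted as risk, so risk is measured as the standard deviation of what remains after the trend is removed.

```python
def detrended_std(series):
    """Std. dev. of the residuals after removing a linear trend
    (ordinary least squares fit against the time index)."""
    n = len(series)
    xbar = (n - 1) / 2
    ybar = sum(series) / n
    sxx = sum((i - xbar) ** 2 for i in range(n))
    slope = sum((i - xbar) * (y - ybar)
                for i, y in enumerate(series)) / sxx
    resid = [y - (ybar + slope * (i - xbar))
             for i, y in enumerate(series)]
    # n - 2 degrees of freedom: two parameters (intercept, slope) fitted
    return (sum(r * r for r in resid) / (n - 2)) ** 0.5

# Hypothetical 2001-2007 milk prices, EUR/100 kg, for the two systems
conventional = [36.1, 36.8, 35.9, 37.0, 36.5, 37.2, 36.9]
organic      = [39.5, 41.8, 38.9, 42.5, 40.1, 43.0, 41.2]
print(detrended_std(conventional), detrended_std(organic))
```

With these illustrative numbers, the detrended standard deviation is larger for the organic series, mirroring the direction of the paper's finding.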

  15. Analysis of Affordance, Time, and Adaptation in the Assessment of Industrial Control System Cybersecurity Risk.

    PubMed

    Busby, J S; Green, B; Hutchison, D

    2017-01-17

Industrial control systems increasingly use standard communication protocols and are increasingly connected to public networks, creating substantial cybersecurity risks, especially when used in critical infrastructures such as electricity and water distribution systems. Methods of assessing risk in such systems have recognized for some time the way in which the strategies of potential adversaries and risk managers interact in defining the risk to which such systems are exposed. But it is also important to consider the adaptations of the systems' operators and other legitimate users to risk controls, adaptations that often appear to undermine these controls, or shift the risk from one part of a system to another. Unlike the case with adversarial risk analysis, the adaptations of system users are typically orthogonal to the objective of minimizing or maximizing risk in the system. We argue that this need to analyze potential adaptations to risk controls is true for risk problems more generally, and we develop a framework for incorporating such adaptations into an assessment process. The method is based on the principle of affordances, and we show how this can be incorporated in an iterative procedure based on raising the minimum period of risk materialization above some threshold. We apply the method in a case study of a small European utility provider and discuss the observations arising from this.

  16. The application of risk analysis in aquatic animal health management.

    PubMed

    Peeler, E J; Murray, A G; Thebault, A; Brun, E; Giovaninni, A; Thrush, M A

    2007-09-14

Risk analysis has only been regularly used in the management of aquatic animal health in recent years. The Agreement on the Application of Sanitary and Phytosanitary Measures (SPS) stimulated the application of risk analysis to investigate disease risks associated with international trade (import risk analysis, IRA). A majority (9 of 17) of the risk analyses reviewed were IRAs. The other major focus has been the Atlantic salmon parasite Gyrodactylus salaris. Six studies investigated the spread of this parasite between countries, between rivers, and from farmed to wild stocks, and clearly demonstrated that risk analysis can support aquatic animal health policy development, from international trade and biosecurity to disease interaction between wild and farmed stocks. Other applications of risk analysis included the spread of vertically transmitted pathogens and disease emergence in aquaculture. The Covello-Merkhofer risk analysis model was most commonly used and appears to be a flexible tool, not only for IRA but also for investigating disease spread in other contexts. The limitations of the identified risk assessments were discussed. A majority were qualitative, partly due to the lack of data for quantitative analysis, and this, it can be argued, constrained their usefulness for trade purposes (i.e. setting appropriate sanitary measures); in other instances, a qualitative result was found to be adequate for decision making. A lack of information about the disease hazards of the large number of fish species traded is likely to constrain quantitative analysis for a number of years. The consequence assessment element of a risk analysis was most likely to be omitted, or limited in scope and depth, rarely extending beyond examining the evidence of susceptibility of farmed and wild species to the identified hazard. The reasons for this are discussed and recommendations made to develop guidelines for a consistent, systematic and multi-disciplinary approach to consequence assessment.

  17. A Comparison of Disease Risk Analysis Tools for Conservation Translocations.

    PubMed

    Dalziel, Antonia Eleanor; Sainsbury, Anthony W; McInnes, Kate; Jakob-Hoff, Richard; Ewen, John G

    2017-03-01

Conservation translocations are increasingly used to manage threatened species and restore ecosystems. Translocations increase the risk of disease outbreaks in the translocated and recipient populations. Qualitative disease risk analyses have been used as a means of assessing the magnitude of any effect of disease, and the probability of the disease occurring, associated with a translocation. Currently, multiple alternative qualitative disease risk analysis packages are available to practitioners. Here we compare the ease of use, expertise required, transparency, and results from three different qualitative disease risk analyses, using a translocation of the endangered New Zealand passerine, the hihi (Notiomystis cincta), as a model. We show that the three methods use fundamentally different approaches to define hazards. Different methods are used to produce estimations of the risk from disease, and the estimations differ for the same hazards. Transparency of the process varies between methods, from no referencing or explanation of the evidence used to justify decisions, through to full documentation of resources, decisions, and assumptions made. Evidence to support decisions on the estimation of risk from disease is important, so that knowledge acquired in the future, for example from the translocation outcome, can be used to improve the risk estimation for future translocations. The information documenting each disease risk analysis differs, along with variation in the emphasis of the questions asked within each package. The expertise required to commence a disease risk analysis varies; an action flow chart tailored to the non-wildlife health specialist is included in one method, but completion of the disease risk analysis requires wildlife health specialists with epidemiological and pathological knowledge in all three methods. We show that the choice of disease risk analysis package may itself substantially influence the overall estimation of the risk from disease to animal populations.

  18. Research in progress in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  19. Risk assessment methodology applied to counter IED research & development portfolio prioritization

    SciTech Connect

Shevitz, Daniel W; O'Brien, David A; Zerkle, David K; Key, Brian P; Chavez, Gregory M

    2009-01-01

In an effort to protect the United States from the ever-increasing threat of domestic terrorism, the Department of Homeland Security, Science and Technology Directorate (DHS S&T) has significantly increased research activities to counter the terrorist use of explosives. Moreover, DHS S&T has established a robust Counter-Improvised Explosive Device (C-IED) Program to Deter, Predict, Detect, Defeat, and Mitigate this imminent threat to the Homeland. The DHS S&T portfolio is complicated and changing. In order to provide the "best answer" for the available resources, DHS S&T would like a "risk-based" process for making funding decisions. There is a definite need for a methodology to compare very different types of technologies on a common basis. A methodology was developed that allows users to evaluate a new "quad chart" and rank it against all other quad charts across S&T divisions. It couples a logic model with an evidential reasoning model, using an Excel spreadsheet containing weights for the subjective merits of the different technologies. The methodology produces an Excel spreadsheet containing the aggregate rankings of the different technologies. It uses Extensible Logic Modeling (ELM) for the logic models, combined with LANL software called INFTree for evidential reasoning.
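A much-simplified stand-in for this kind of weighted aggregate ranking could look like the sketch below. The ELM/INFTree evidential-reasoning machinery is not reproduced; the criteria, weights, and project names are all hypothetical, and only the weighted-sum aggregation step is illustrated.

```python
def rank_portfolio(projects, weights):
    """Aggregate weighted criterion scores and rank projects,
    highest aggregate score first."""
    def score(crit):
        return sum(weights[c] * crit[c] for c in weights)
    return sorted(projects, key=lambda p: score(p["scores"]), reverse=True)

# Hypothetical criteria weights (must sum to 1) and quad-chart scores (0-10)
weights = {"detection": 0.4, "mitigation": 0.3, "maturity": 0.2, "cost": 0.1}
projects = [
    {"name": "standoff detector",
     "scores": {"detection": 9, "mitigation": 2, "maturity": 5, "cost": 4}},
    {"name": "blast barrier",
     "scores": {"detection": 1, "mitigation": 9, "maturity": 8, "cost": 6}},
    {"name": "jammer upgrade",
     "scores": {"detection": 3, "mitigation": 7, "maturity": 6, "cost": 7}},
]
for p in rank_portfolio(projects, weights):
    print(p["name"])
```

The evidential-reasoning step in the actual methodology replaces these point scores with belief structures before aggregation.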

  20. Applied behavior analysis: understanding and changing behavior in the community-a representative review.

    PubMed

    Luyben, Paul D

    2009-01-01

    Applied behavior analysis, a psychological discipline, has been characterized as the science of behavior change (Chance, 2006). Research in applied behavior analysis has been published for approximately 40 years since the initial publication of the Journal of Applied Behavior Analysis in 1968. The field now encompasses a wide range of human behavior. Although much of the published research centers on problem behaviors that occur in schools and among people with disabilities, a substantial body of knowledge has emerged in community settings. This article provides a review of the behavioral community research published in the Journal of Applied Behavior Analysis as representative of this work, including research in the areas of home and family, health, safety, community involvement and the environment, recreation and sports, crime and delinquency, and organizations. In the interest of space, research in schools and with people with disabilities has been excluded from this review.

  1. Risk analysis and its link with standards of the World Organisation for Animal Health.

    PubMed

    Sugiura, K; Murray, N

    2011-04-01

    Among the agreements included in the treaty that created the World Trade Organization (WTO) in January 1995 is the Agreement on the Application of Sanitary and Phytosanitary Measures (SPS Agreement) that sets out the basic rules for food safety and animal and plant health standards. The SPS Agreement designates the World Organisation for Animal Health (OIE) as the organisation responsible for developing international standards for animal health and zoonoses. The SPS Agreement requires that the sanitary measures that WTO members apply should be based on science and encourages them to either apply measures based on the OIE standards or, if they choose to adopt a higher level of protection than that provided by these standards, apply measures based on a science-based risk assessment. The OIE also provides a procedural framework for risk analysis for its Member Countries to use. Despite the inevitable challenges that arise in carrying out a risk analysis of the international trade in animals and animal products, the OIE risk analysis framework provides a structured approach that facilitates the identification, assessment, management and communication of these risks.

  2. Maternal migration and autism risk: systematic analysis.

    PubMed

    Crafa, Daina; Warfa, Nasir

    2015-02-01

    Autism (AUT) is one of the most prevalent developmental disorders emerging during childhood, and can be amongst the most incapacitating mental disorders. Some individuals with AUT require a lifetime of supervised care. Autism Speaks reported estimated costs for 2012 at £34 billion in the UK; and $3.2 million-$126 billion in the US, Australia and Canada. Ethnicity and migration experiences appear to increase risks of AUT and relate to underlying biological risk factors. Sociobiological stress factors can affect the uterine environment, or relate to stress-induced epigenetic changes during pregnancy and delivery. Epigenetic risk factors associated with AUT also include poor pregnancy conditions, low birth weight, and congenital malformation. Recent studies report that children from migrant communities are at higher risk of AUT than children born to non-migrant mothers, with the exception of Hispanic children. This paper provides the first systematic review into prevalence and predictors of AUT with a particular focus on maternal migration stressors and epigenetic risk factors. AUT rates appear higher in certain migrant communities, potentially relating to epigenetic changes after stressful experiences. Although AUT remains a rare disorder, failures to recognize its public health urgency and local community needs continue to leave certain cultural groups at a disadvantage.

  3. Successful risk assessment may not always lead to successful risk control: A systematic literature review of risk control after root cause analysis.

    PubMed

    Card, Alan J; Ward, James; Clarkson, P John

    2012-01-01

    Root cause analysis is perhaps the most widely used tool in healthcare risk management, but does it actually lead to successful risk control? Are there categories of risk control that are more likely to be effective? And do healthcare risk managers have the tools they need to support the risk control process? This systematic review examines how the healthcare sector translates risk analysis to risk control action plans and examines how to do better. It suggests that the hierarchy of risk controls should inform risk control action planning and that new tools should be developed to improve the risk control process.

  4. Association between cholesterol intake and pancreatic cancer risk: evidence from a meta-analysis.

    PubMed

    Chen, Hongqiang; Qin, Shiyong; Wang, Minghai; Zhang, Tao; Zhang, Shuguang

    2015-02-04

Evidence quantifying the association between cholesterol intake and the risk of pancreatic cancer is still conflicting. We therefore conducted a meta-analysis to summarize the evidence from epidemiological studies of cholesterol intake and the risk of pancreatic cancer. Pertinent studies were identified by searching PubMed and Web of Knowledge through April 2014. A random effects model was used to pool the data. Sensitivity analysis and publication bias assessment were conducted. The dose-response relationship was assessed by restricted cubic spline and variance-weighted least squares regression analysis. Sixteen articles, covering 4513 pancreatic cancer cases, were included in the meta-analysis. Pooled results suggest that cholesterol intake was significantly associated with the risk of pancreatic cancer [summary relative risk (RR) = 1.371, 95% CI = 1.155-1.627, I(2) = 58.2%], especially in America [summary RR = 1.302, 95% CI = 1.090-1.556]. A linear dose-response relation was found whereby the risk of pancreatic cancer rises by 8% for each 100 mg/day of cholesterol intake [summary RR = 1.08, 95% CI = 1.04-1.13]. In conclusion, our analysis suggests that a high intake of cholesterol might increase the risk of pancreatic cancer, especially in America.
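The reported linear dose-response (pooled RR = 1.08 per 100 mg/day) can be extrapolated to other intake levels under the customary log-linear assumption; a minimal sketch, with example intakes chosen arbitrarily:

```python
# Extrapolate the pooled dose-response estimate (RR = 1.08 per 100 mg/day of
# cholesterol intake) assuming log-linearity in dose.
def relative_risk(intake_mg_per_day, rr_per_100mg=1.08):
    """Relative risk versus zero intake under a log-linear dose-response."""
    return rr_per_100mg ** (intake_mg_per_day / 100.0)

for intake in (100, 300, 500):
    print(f"{intake} mg/day -> RR {relative_risk(intake):.2f}")
```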

  5. Risk Analysis for Environmental Health Triage

    SciTech Connect

    Bogen, K T

    2005-11-18

The Homeland Security Act mandates development of a national, risk-based system to support planning for, response to and recovery from emergency situations involving large-scale toxic exposures. To prepare for and manage consequences effectively, planners and responders need not only to identify zones of potentially elevated individual risk, but also to predict expected casualties. Emergency response support systems now define "consequences" by mapping areas in which toxic chemical concentrations do or may exceed Acute Exposure Guideline Levels (AEGLs) or similar guidelines. However, because AEGLs do not estimate expected risks, current unqualified claims that such maps support consequence management are misleading. Intentionally protective, AEGLs incorporate various safety/uncertainty factors depending on scope and quality of chemical-specific toxicity data. Some of these factors are irrelevant, and others need to be modified, whenever resource constraints or exposure-scenario complexities require responders to make critical trade-off (triage) decisions in order to minimize expected casualties. AEGL-exceedance zones cannot consistently be aggregated, compared, or used to calculate expected casualties, and so may seriously misguide emergency response triage decisions. Although well-established, readily available methods and tools support environmental health protection, comparable tools have yet to be developed for chemically related environmental health triage. Effective triage decisions involving chemical risks require a new assessment approach that focuses on best estimates of likely casualties, rather than on upper plausible bounds of individual risk. If risk-based consequence management is to become a reality, federal agencies tasked with supporting emergency response must actively coordinate to foster new methods that can support effective environmental health triage.
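The shift from exceedance mapping to expected-casualty estimation can be illustrated with a toy calculation. The probit constants (a, b, n) and the exposure zones below are placeholders, not values for any real chemical or guideline:

```python
import math
from statistics import NormalDist

# Toy "expected casualties" calculation: rather than counting everyone inside
# an AEGL-exceedance contour, sum per-zone fatality probabilities from a
# probit dose-response model. All constants and zones are invented.
def p_fatality(conc_ppm, minutes, a=-18.0, b=2.0, n=2.0):
    """Probit lethality model: Pr = Phi(a + b * ln(C**n * t))."""
    return NormalDist().cdf(a + b * math.log(conc_ppm ** n * minutes))

def expected_casualties(zones):
    """zones: iterable of (population, concentration in ppm, exposure minutes)."""
    return sum(pop * p_fatality(c, t) for pop, c, t in zones)

zones = [(500, 20.0, 30), (2000, 5.0, 30), (10000, 1.0, 30)]
print(round(expected_casualties(zones), 1))
```

Unlike a binary exceedance map, this best-estimate sum can be aggregated and compared across scenarios, which is what triage decisions require.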

  6. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation on B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.

  7. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    SciTech Connect

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.; Helms, Jovana; Imbro, Dennis Raymond; Sumner, Matthew C.

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  8. Asbestos Workshop: Sampling, Analysis, and Risk Assessment

    DTIC Science & Technology

    2012-03-01

Only fragments of the workshop slides survive in this record: a note on pleural fibrosis (fibrosis of the lining of the cavity holding the lungs); a chest x-ray figure showing areas of scarring related to asbestosis; the observation that if the expected number of asbestos structures in a sample is λ, the probability of observing exactly x fibers follows a Poisson distribution; and the risk equation Risk = Exposure × Toxicity = [Air] × ET × EF × IUR (units: f/cm³ × hour/hour × day/day × (f/cm³)⁻¹).
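The Poisson counting model referenced in the slides can be made concrete with a short sketch (the λ values here are arbitrary):

```python
import math

# Poisson counting model from the slides: if the expected number of asbestos
# structures in a sample is lam, then P(X = x) = exp(-lam) * lam**x / x!.
def p_exact_count(lam, x):
    return math.exp(-lam) * lam ** x / math.factorial(x)

def p_at_least_one(lam):
    """Chance that the sample contains any fiber at all: 1 - P(X = 0)."""
    return 1.0 - p_exact_count(lam, 0)

print(round(p_exact_count(2.0, 0), 4))  # sample shows no fibers despite lam = 2
print(round(p_at_least_one(0.5), 4))
```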

  9. Integrated Hybrid System Architecture for Risk Analysis

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.; Fonseca, Daniel J.; Ray, Paul S.

    2010-01-01

A conceptual design has been announced of an expert-system computer program, and the development of a prototype of the program, intended for use as a project-management tool. The program integrates schedule and risk data for the purpose of determining the schedule implications of safety risks and, conversely, the effects of schedule changes on safety. It is noted that the design has been delivered to a NASA client and that it is planned to disclose the design in a conference presentation.

  10. Approaches for derivation of environmental quality criteria for substances applied in risk assessment of discharges from offshore drilling operations.

    PubMed

    Altin, Dag; Frost, Tone Karin; Nilssen, Ingunn

    2008-04-01

In order to achieve the offshore petroleum industry's "zero harm" goal for the environment, the environmental impact factor for drilling discharges was developed as a tool to identify and quantify the environmental risks associated with disposal of drilling discharges to the marine environment. As an initial step in this work, the main categories of substances associated with drilling discharges and assumed to contribute to toxic or nontoxic stress were identified and evaluated for inclusion in the risk assessment. The selection was based on the known toxicological properties of the substances, or on the total amount discharged together with their potential for accumulation in the water column or sediments to levels that could be expected to cause toxic or nontoxic stress to the biota. Based on these criteria, 3 categories of chemicals were identified for risk assessment in the water column and sediments: natural organic substances, metals, and drilling fluid chemicals. Several approaches for deriving environmentally safe threshold concentrations, as predicted no effect concentrations, were evaluated in the process. For the water column, consensus was reached on using the species sensitivity distribution approach for metals and the assessment factor approach for natural organic substances and added drilling chemicals. For the sediments, the equilibrium partitioning approach was selected for all three categories of chemicals. The theoretically derived sediment quality criteria were compared to field-derived threshold effect values based on statistical approaches applied to sediment monitoring data from the Norwegian Continental Shelf. The basis for derivation of predicted no effect concentration values for drilling discharges should be consistent with the principles of environmental risk assessment as described in the Technical Guidance Document on Risk Assessment issued by the European Union.
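A minimal sketch of the species sensitivity distribution approach chosen for metals in the water column, assuming hypothetical per-species NOEC values and a log-normal fit (the real derivation involves further assessment steps beyond the bare HC5):

```python
import math
from statistics import NormalDist, mean, stdev

# Species sensitivity distribution (SSD) sketch: fit a log-normal distribution
# to per-species toxicity endpoints and take the 5th percentile (HC5) as the
# basis for a predicted no effect concentration. Endpoint values are invented.
def hc5(endpoints_ug_per_l):
    logs = [math.log10(v) for v in endpoints_ug_per_l]
    fitted = NormalDist(mu=mean(logs), sigma=stdev(logs))
    return 10 ** fitted.inv_cdf(0.05)

species_noecs = [12.0, 35.0, 48.0, 150.0, 310.0, 980.0]  # hypothetical NOECs, ug/L
print(f"HC5 = {hc5(species_noecs):.1f} ug/L")
```

By construction, the HC5 lies below the most sensitive tested species here, which is why it serves as a protective threshold for the community as a whole.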

  11. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    SciTech Connect

    Kolotilina, L.; Nikishin, A.; Yeremin, A.

    1994-12-31

The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial-class finite element analysis of structures. The construction and application of high quality preconditioners constitutes a high percentage of the total solution time. Parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are the implicit preconditioners and the explicit preconditioners. The implicit preconditioners (e.g. incomplete factorizations of several types) are generally of high quality but require the solution of lower and upper triangular systems of equations per iteration, which are difficult to parallelize without deteriorating the convergence rate. Explicit preconditioners (e.g. polynomial or Jacobi-like preconditioners) require sparse matrix-vector multiplications and can be parallelized, but their preconditioning qualities are less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction without increasing the serial complexity.
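For contrast with the implicit family, the simplest explicit preconditioner, a Jacobi (diagonal) preconditioner, can be dropped into conjugate gradients in a few lines. This is a toy dense-matrix sketch, not FSAI, which instead constructs sparse approximate factors of the inverse:

```python
# Jacobi-preconditioned conjugate gradients for a small SPD system. Explicit
# preconditioners like this apply M^-1 by simple (parallelizable) products,
# with no triangular solves on the critical path.
def jacobi_pcg(A, b, tol=1e-10, max_iter=200):
    """Solve A x = b for SPD A (dense, list-of-lists)."""
    n = len(b)
    minv = [1.0 / A[i][i] for i in range(n)]   # M^-1 = diag(A)^-1
    x = [0.0] * n
    r = b[:]                                   # residual b - A x0 with x0 = 0
    z = [minv[i] * r[i] for i in range(n)]     # preconditioned residual
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [minv[i] * r[i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x

A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]  # small SPD test matrix
print(jacobi_pcg(A, [1.0, 2.0, 3.0]))
```

FSAI keeps the same explicit structure (apply a precomputed approximate inverse by matrix-vector products) while recovering much of the quality of an incomplete factorization.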

  12. A review of risk analysis and helicopter air ambulance accidents.

    PubMed

    Nix, Sam; Buckner, Steven; Cercone, Richard

    2014-01-01

The Federal Aviation Administration announced a final rule in February 2014 that includes a requirement for helicopter air ambulance operators to institute preflight risk analysis programs. This qualitative study examined risk factors that were described in 22 preliminary, factual, and probable cause helicopter air ambulance accident and incident reports that were initiated by the National Transportation Safety Board between January 1, 2011, and December 31, 2013. Insights into the effectiveness of existing preflight risk analysis strategies were gained by comparing these risk factors with the preflight risk analysis guidance that is published by the Federal Aviation Administration in the Flight Standards Information Management System. When appropriate, a deeper understanding of the human factors that may have contributed to occurrences was gained through methodologies that are described in the Human Factors Analysis and Classification System. The results of this study suggest that there are some vulnerabilities in existing preflight risk analysis guidelines that may affect safety in the helicopter air ambulance industry. The findings also indicate that human factors likely contributed to most of the helicopter air ambulance accidents and incidents that occurred during the study period. The results further suggest that effective risk analysis programs should provide pilots with both preflight and in-flight resources.

  13. System Analysis Applied to Autonomy: Application to High-Altitude Long-Endurance Remotely Operated Aircraft

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.

    2006-01-01

    Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.

  14. Cardiometabolic risk in Canada: a detailed analysis and position paper by the cardiometabolic risk working group.

    PubMed

    Leiter, Lawrence A; Fitchett, David H; Gilbert, Richard E; Gupta, Milan; Mancini, G B John; McFarlane, Philip A; Ross, Robert; Teoh, Hwee; Verma, Subodh; Anand, Sonia; Camelon, Kathryn; Chow, Chi-Ming; Cox, Jafna L; Després, Jean-Pierre; Genest, Jacques; Harris, Stewart B; Lau, David C W; Lewanczuk, Richard; Liu, Peter P; Lonn, Eva M; McPherson, Ruth; Poirier, Paul; Qaadri, Shafiq; Rabasa-Lhoret, Rémi; Rabkin, Simon W; Sharma, Arya M; Steele, Andrew W; Stone, James A; Tardif, Jean-Claude; Tobe, Sheldon; Ur, Ehud

    2011-01-01

    The concepts of "cardiometabolic risk," "metabolic syndrome," and "risk stratification" overlap and relate to the atherogenic process and development of type 2 diabetes. There is confusion about what these terms mean and how they can best be used to improve our understanding of cardiovascular disease treatment and prevention. With the objectives of clarifying these concepts and presenting practical strategies to identify and reduce cardiovascular risk in multiethnic patient populations, the Cardiometabolic Working Group reviewed the evidence related to emerging cardiovascular risk factors and Canadian guideline recommendations in order to present a detailed analysis and consolidated approach to the identification and management of cardiometabolic risk. The concepts related to cardiometabolic risk, pathophysiology, and strategies for identification and management (including health behaviours, pharmacotherapy, and surgery) in the multiethnic Canadian population are presented. "Global cardiometabolic risk" is proposed as an umbrella term for a comprehensive list of existing and emerging factors that predict cardiovascular disease and/or type 2 diabetes. Health behaviour interventions (weight loss, physical activity, diet, smoking cessation) in people identified at high cardiometabolic risk are of critical importance given the emerging crisis of obesity and the consequent epidemic of type 2 diabetes. Vascular protective measures (health behaviours for all patients and pharmacotherapy in appropriate patients) are essential to reduce cardiometabolic risk, and there is growing consensus that a multidisciplinary approach is needed to adequately address cardiometabolic risk factors. Health care professionals must also consider risk factors related to ethnicity in order to appropriately evaluate everyone in their diverse patient populations.

  15. Dynamic taxonomies applied to a web-based relational database for geo-hydrological risk mitigation

    NASA Astrophysics Data System (ADS)

    Sacco, G. M.; Nigrelli, G.; Bosio, A.; Chiarle, M.; Luino, F.

    2012-02-01

    In its 40 years of activity, the Research Institute for Geo-hydrological Protection of the Italian National Research Council has amassed a vast and varied collection of historical documentation on landslides, muddy-debris flows, and floods in northern Italy from 1600 to the present. Since 2008, the archive resources have been maintained through a relational database management system. The database is used for routine study and research purposes as well as for providing support during geo-hydrological emergencies, when data need to be quickly and accurately retrieved. Retrieval speed and accuracy are the main objectives of an implementation based on a dynamic taxonomies model. Dynamic taxonomies are a general knowledge management model for configuring complex, heterogeneous information bases that support exploratory searching. At each stage of the process, the user can explore or browse the database in a guided yet unconstrained way by selecting the alternatives suggested for further refining the search. Dynamic taxonomies have been successfully applied to such diverse and apparently unrelated domains as e-commerce and medical diagnosis. Here, we describe the application of dynamic taxonomies to our database and compare it to traditional relational database query methods. The dynamic taxonomy interface, essentially a point-and-click interface, is considerably faster and less error-prone than traditional form-based query interfaces that require the user to remember and type in the "right" search keywords. Finally, dynamic taxonomy users have confirmed that one of the principal benefits of this approach is the confidence of having considered all the relevant information. Dynamic taxonomies and relational databases work in synergy to provide fast and precise searching: one of the most important factors in timely response to emergencies.
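The zoom-and-refine interaction of dynamic taxonomies can be sketched with a toy record set (the events below are invented, not entries from the Institute's archive):

```python
# Faceted (dynamic-taxonomy) browsing: after each selection, only facet values
# still present in the matching records are offered for further refinement, so
# the user never has to remember and type the "right" keyword.
EVENTS = [
    {"type": "flood", "region": "Piedmont", "century": "20th"},
    {"type": "debris flow", "region": "Piedmont", "century": "19th"},
    {"type": "landslide", "region": "Liguria", "century": "20th"},
]

def refine(records, **selected):
    """Keep only records matching every selected facet value."""
    return [r for r in records if all(r[k] == v for k, v in selected.items())]

def remaining_facets(records):
    """Facet -> {value: count} summaries offered for the next zoom step."""
    out = {}
    for r in records:
        for facet, value in r.items():
            out.setdefault(facet, {}).setdefault(value, 0)
            out[facet][value] += 1
    return out

subset = refine(EVENTS, region="Piedmont")
print(remaining_facets(subset)["type"])
```

Because every offered value is guaranteed to match at least one record, an empty result set is impossible by construction, one reason users report confidence that all relevant information was considered.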

  16. An Error Analysis for the Finite Element Method Applied to Convection Diffusion Problems.

    DTIC Science & Technology

    1981-03-01

Technical Note BN-962, "An Error Analysis for the Finite Element Method Applied to Convection Diffusion Problems," by I. Babuška and W. G. Szymczak, University of Maryland College Park, Institute for Physical Science, March 1981. (Only garbled OCR fragments of the report cover page survive in this record.)

  17. Men having sex with men donor deferral risk assessment: an analysis using risk management principles.

    PubMed

    Leiss, William; Tyshenko, Michael; Krewski, Daniel

    2008-01-01

This article discusses issues associated with the lifetime deferral from donating blood of men having sex with men (MSM), in the context of well-established risk management principles, including ethical considerations associated with the risk-based approach to social policy matters. Specifically, it deals with the questions about the rationale for the existing policy in Canada of lifetime deferral for MSM, a rationale applied in practice by blood collection agencies and supported by the regulatory authority of Health Canada. We identify several alternative time frames for MSM deferral: sexual abstinence over either a 10-, 5-, or 1-year period or no deferral. Two options are selected for more complete discussion, namely, abstinence for a period of either 1 or 5 years before donation. The available evidence about estimated residual risk (RR), that is, the risk remaining after various safeguards for blood are applied, strongly suggests that choosing a 1-year deferral period for MSM would almost certainly give rise to an incremental risk of transfusion-transmitted infection (TTI), over existing levels of risk, for blood recipients. The report argues that, under these circumstances, such a policy change would represent an unethical type of risk transfer, from one social group to another, and therefore would be unacceptable. The evidence is less clear when it comes to a change to either a 10- or 5-year deferral period. This is the case in part because the current level of RR is so low that there are, inevitably, substantial ranges of uncertainties associated with the risk estimation. There is no firm evidence that such a change in the deferral period for MSM would result in an incremental level of risk, although the possibility of a very small increase in risk cannot be entirely ruled out. Under these circumstances, other social policy issues, relevant to the idea of changing the deferral period for MSM, become worthy of additional consideration.

  18. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its calculation complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions and concentration fluctuations, which is quite different from the real situations of a chemical process plant. All these models usually over-predict the hazardous regions in order to maintain their conservativeness, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot resolve the complexity of risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed via the combination of the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but can also be extended into aerial, submarine, or space risk analyses in the near future.
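The post-processing step, aggregating per-scenario consequence fields into a 3D individual risk field that can then be contoured into iso-surfaces, might look like the following sketch; the scenario frequencies and per-cell fatality probabilities stand in for real CFD outputs and are invented:

```python
# Combine per-scenario consequence fields into one individual risk field,
# risk(cell) = sum over scenarios of frequency * P(fatality at cell), which
# can then be contoured into iso-risk surfaces. All numbers are invented.
def individual_risk_field(scenarios):
    """scenarios: list of (annual frequency, {3D grid cell: P(fatality)})."""
    risk = {}
    for freq, field in scenarios:
        for cell, p_fatal in field.items():
            risk[cell] = risk.get(cell, 0.0) + freq * p_fatal
    return risk

scenarios = [
    (1e-4, {(0, 0, 0): 0.9, (10, 0, 0): 0.2}),   # e.g. a large release
    (1e-3, {(0, 0, 0): 0.1, (10, 0, 2): 0.05}),  # e.g. a small release
]
risk = individual_risk_field(scenarios)
print(risk[(0, 0, 0)])
```

Because the cells carry a third coordinate, the same aggregation yields elevated as well as ground-level risk, which is what distinguishes the 3D iso-surface presentation from a conventional 2D risk contour.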

  19. Risk-benefit analysis: from a logical point of view.

    PubMed

    Spielthenner, Georg

    2012-06-01

In this paper I am concerned with risk-benefit analysis; that is, the comparison of the risks of a situation to its related benefits. We all face such situations in our daily lives and they are very common in medicine too, where risk-benefit analysis has become an important tool for rational decision-making. This paper explores risk-benefit analysis from a logical point of view. In particular, it seeks a better understanding of the common view that decisions should be made by weighing risks against benefits and that an option should be chosen if its benefits outweigh its risks. I devote a good deal of this paper to scrutinizing this popular view. Specifically, I demonstrate that this mode of reasoning is logically faulty if "risk" and "benefit" are taken in their absolute sense. But I also show that arguing in favour of an action because its benefits outweigh its risks can be valid if we refer to incremental risks and benefits.
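The paper's logical point can be reproduced numerically. Assuming benefits and risks are expressed on a common utility scale (the values below are invented), an option whose absolute benefit exceeds its absolute risk can still be the wrong choice once increments against the alternative are considered:

```python
# An option's absolute benefit outweighing its absolute risk does not settle
# the choice; what matters is the increment relative to the alternative.
def prefer_treatment(benefit_treat, risk_treat, benefit_alt, risk_alt):
    """Choose treatment iff its incremental benefit outweighs its incremental risk."""
    return (benefit_treat - benefit_alt) > (risk_treat - risk_alt)

# Treatment: benefit 5, risk 4 (benefits outweigh risks in the absolute sense).
# Doing nothing: benefit 3, risk 1. Incremental benefit +2 < incremental risk
# +3, so the treatment should be declined despite 5 > 4.
print(prefer_treatment(5, 4, 3, 1))
```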

  20. Leakage risk assessment of the In Salah CO2 storage project: Applying the Certification Framework in a dynamic context.

    SciTech Connect

    Oldenburg, C.M.; Jordan, P.D.; Nicot, J.-P.; Mazzoldi, A.; Gupta, A.K.; Bryant, S.L.

    2010-08-01

    The Certification Framework (CF) is a simple risk assessment approach for evaluating CO{sub 2} and brine leakage risk at geologic carbon sequestration (GCS) sites. In the In Salah CO{sub 2} storage project assessed here, five wells at Krechba produce natural gas from the Carboniferous C10.2 reservoir with 1.7-2% CO{sub 2} that is delivered to the Krechba gas processing plant, which also receives high-CO{sub 2} natural gas ({approx}10% by mole fraction) from additional deeper gas reservoirs and fields to the south. The gas processing plant strips CO{sub 2} from the natural gas that is then injected through three long horizontal wells into the water leg of the Carboniferous gas reservoir at a depth of approximately 1,800 m. This injection process has been going on successfully since 2004. The stored CO{sub 2} has been monitored over the last five years by a Joint Industry Project (JIP) - a collaboration of BP, Sonatrach, and Statoil with co-funding from US DOE and EU DG Research. Over the years the JIP has carried out extensive analyses of the Krechba system including two risk assessment efforts, one before injection started, and one carried out by URS Corporation in September 2008. The long history of injection at Krechba, and the accompanying characterization, modeling, and performance data provide a unique opportunity to test and evaluate risk assessment approaches. We apply the CF to the In Salah CO{sub 2} storage project at two different stages in the state of knowledge of the project: (1) at the pre-injection stage, using data available just prior to injection around mid-2004; and (2) after four years of injection (September 2008) to be comparable to the other risk assessments. The main risk drivers for the project are CO{sub 2} leakage into potable groundwater and into the natural gas cap. Both well leakage and fault/fracture leakage are likely under some conditions, but overall the risk is low due to ongoing mitigation and monitoring activities. Results of

  1. Ergonomic analysis of working postures using OWAS in semi-trailer assembly, applying an individual sampling strategy.

    PubMed

    Brandl, Christopher; Mertens, Alexander; Schlick, Christopher M

    2017-03-01

In semi-trailer assembly, workers are exposed to several physical risk factors. Awkward working postures have not yet been investigated in semi-trailer assembly, although they are known to be a major risk factor for musculoskeletal disorders. We therefore conducted a comprehensive ergonomic analysis of working postures using the Ovako working posture analysing system (OWAS), with an individual sampling strategy. The postural load in semi-trailer assembly was assessed on the basis of 20,601 observations of 63 workers executing a representative set of nine work tasks. According to the OWAS, the postural load of various working postures and body part positions may have a harmful effect on the musculoskeletal system. We therefore give examples of corrective measures that could improve awkward working postures. Applying an individual sampling strategy was shown to have advantages over a collective strategy, so it is recommended for future ergonomic analyses.
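The mechanics of an OWAS-style assessment can be sketched as a lookup from four-part posture codes to action categories; the table entries below are a tiny illustrative excerpt, not the full published OWAS table:

```python
# OWAS-style scoring: each observation is a (back, arms, legs, load) code
# mapped to an action category, 1 (no measures needed) through 4 (corrective
# measures required immediately). Only a few illustrative entries are shown.
ACTION_TABLE = {
    (1, 1, 1, 1): 1,  # back straight, arms below shoulders, standing, light load
    (2, 1, 1, 1): 2,  # back bent
    (4, 1, 4, 2): 4,  # back bent and twisted, kneeling, medium load
}

def action_category(observation):
    return ACTION_TABLE.get(observation)  # None if the code is not in this excerpt

def share_needing_action(observations):
    """Fraction of observed postures rated category 2 or worse."""
    rated = [action_category(o) for o in observations]
    return sum(1 for c in rated if c and c >= 2) / len(rated)

obs = [(1, 1, 1, 1), (2, 1, 1, 1), (4, 1, 4, 2), (1, 1, 1, 1)]
print(share_needing_action(obs))
```

Aggregating such categories over thousands of observations per worker is what the individual sampling strategy in the study enables.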

  2. Quantitative landslide risk analysis: Examples from Bíldudalur, NW-Iceland

    NASA Astrophysics Data System (ADS)

    Bell, R.; Glade, T.

    2003-04-01

    Risk analysis, risk evaluation, and risk management are integrated in the holistic concept of risk assessment. Internationally, various quantitative, semi-quantitative, and qualitative approaches exist to analyse the risk to life and/or the economic risk caused by landslides. In Iceland, a method to carry out snow avalanche risk analysis was developed in 1999, followed in 2002 by rough guidelines on how to integrate results from landslide hazard assessments into a comprehensive landslide and snow avalanche risk assessment. The Icelandic regulation on hazard zoning due to snow avalanches and landslides, issued by the Icelandic Ministry of the Environment in the year 2000, aims to prevent people from living or working within the areas most at risk until 2010. The regulation requires landslide and snow avalanche risk analyses to be carried out; however, an approach to calculate landslide risk in detail is still missing. Therefore, the ultimate goal of this study is to develop such a method and apply it in Bíldudalur, NW-Iceland. Within this presentation, the risk analysis focuses on the risk to life. To calculate landslide risk, the spatial and temporal probability of occurrence of potentially damaging events, as well as the distribution of the elements at risk in space and time, must be determined, taking changing vulnerabilities into consideration. Based on existing debris flow and rock fall run-out maps, hazard maps are derived and the respective risks are calculated. Previously digitized elements at risk (people in houses) are verified and updated. The damage potential (the number of all people living or working at a specific location), derived from official statistics and our own investigations, is attributed to each house. The vulnerability of the elements at risk is mainly based on literature studies. The probability of spatial impact (i.e., of the hazardous event impacting a building) is estimated using benchmarks given in literature, results from field
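The risk-to-life calculation described above (event probability, spatial and temporal probability of impact, vulnerability, damage potential) can be sketched in a few lines; all parameter values below are hypothetical, not data from the Bíldudalur study:

```python
# Generic landslide risk-to-life equation for a single house:
#   R = P_event * P_spatial * P_temporal * V * E
# All parameter values are illustrative assumptions.
def annual_risk_to_life(p_event, p_spatial, p_temporal, vulnerability, occupants):
    """Expected annual fatalities for one building.

    p_event:       annual probability of the damaging event (debris flow, rock fall)
    p_spatial:     probability the event actually impacts the building
    p_temporal:    fraction of time occupants are present
    vulnerability: probability of death given impact
    occupants:     damage potential (people living or working there)
    """
    return p_event * p_spatial * p_temporal * vulnerability * occupants

r = annual_risk_to_life(0.01, 0.3, 0.6, 0.2, 4)
```

Multiplying the factors in this order makes each data source in the abstract (run-out maps, census data, literature vulnerabilities) supply exactly one term.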

  3. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    NASA Technical Reports Server (NTRS)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  4. Economic Risk Analysis: Using Analytical and Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    O'Donnell, Brendan R.; Hickner, Michael A.; Barna, Bruce A.

    2002-01-01

    Describes the development and instructional use of a Microsoft Excel spreadsheet template that facilitates analytical and Monte Carlo risk analysis of investment decisions. Discusses a variety of risk assessment methods followed by applications of the analytical and Monte Carlo methods. Uses a case study to illustrate use of the spreadsheet tool…

  5. QRA model-based risk impact analysis of traffic flow in urban road tunnels.

    PubMed

    Meng, Qiang; Qu, Xiaobo; Yong, Kum Thong; Wong, Yoke Heng

    2011-12-01

    Road tunnels are vital infrastructures providing underground vehicular passageways for commuters and motorists. Various quantitative risk assessment (QRA) models have recently been developed and employed to evaluate the safety levels of road tunnels in terms of societal risk (as measured by the F/N curve). For a particular road tunnel, traffic volume and proportion of heavy goods vehicles (HGVs) are two adjustable parameters that may significantly affect the societal risk, and are thus very useful in implementing risk reduction solutions. To evaluate the impact the two contributing factors have on the risk, this article first presents an approach that employs a QRA model to generate societal risk for a series of possible combinations of the two factors. Some combinations may result in F/N curves that do not fulfill a predetermined safety target. This article thus proposes an "excess risk index" in order to quantify the road tunnel risk magnitudes that do not pass the safety target. The two-factor impact analysis can be illustrated by a contour chart based on the excess risk. Finally, the methodology has been applied to Singapore's KPE road tunnel and the results show that in terms of meeting the test safety target for societal risk, the traffic capacity of the tunnel should be no more than 1,200 vehs/h/lane, with a maximum proportion of 18% HGVs.
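The F/N-curve logic and the proposed "excess risk index" can be illustrated with a minimal sketch; the scenario frequencies and the safety-target line F_target(N) = c/N^alpha below are assumptions for illustration, not output of the KPE tunnel QRA model:

```python
# Scenario set: (annual frequency, fatalities). Values are illustrative.
scenarios = [(10.0e-3, 1), (2.0e-3, 5), (4.0e-4, 20), (5.0e-5, 100)]

def fn_curve(scenarios):
    """F(N): cumulative frequency of events causing at least N fatalities."""
    ns = sorted({n for _, n in scenarios})
    return [(n, sum(f for f, m in scenarios if m >= n)) for n in ns]

def excess_risk(scenarios, c=1e-2, alpha=1.0):
    """Sum of F(N) exceedances over a target line F_target(N) = c / N**alpha,
    a simple stand-in for the article's excess risk index."""
    return sum(max(0.0, f - c / n ** alpha) for n, f in fn_curve(scenarios))
```

Evaluating `excess_risk` over a grid of (traffic volume, HGV proportion) pairs is what produces the contour chart described in the abstract.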

  6. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    PubMed

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2016-11-28

    Recent studies have shown that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks.
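The first two decision models compared in the abstract can be sketched in miniature; the CARA utility, the Tversky-Kahneman weighting and value parameters, and all household numbers below are illustrative assumptions, not the article's calibration:

```python
import math

# Hypothetical household choice: pay cost C for a measure that prevents
# damage D occurring with annual probability p (all values assumed).
p, D = 0.01, 100_000.0

def eu_invests(p, D, C, wealth=200_000.0, r=1e-5):
    """(1) Expected utility theory with CARA utility u(x) = -exp(-r*x)."""
    u = lambda x: -math.exp(-r * x)
    eu_no = p * u(wealth - D) + (1 - p) * u(wealth)
    return u(wealth - C) > eu_no          # invest iff the sure cost beats the gamble

def pt_invests(p, D, C, gamma=0.69, lam=2.25, beta=0.88):
    """(2) Prospect theory: probability weighting plus loss aversion."""
    w = lambda q: q ** gamma / (q ** gamma + (1 - q) ** gamma) ** (1 / gamma)
    v = lambda loss: -lam * loss ** beta  # value of losing `loss` (> 0)
    return w(p) * v(D) < v(C)             # invest iff the weighted damage prospect is worse

# Overweighting of the small probability p makes the prospect-theory agent
# invest at mitigation costs where the expected-utility agent declines:
diverge = (not eu_invests(p, D, 2000.0)) and pt_invests(p, D, 2000.0)
```

Embedding one such rule per household agent, and letting flood experience update `p`, is the essence of the agent-based approach.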

  7. Applying under-sampling techniques and cost-sensitive learning methods on risk assessment of breast cancer.

    PubMed

    Hsu, Jia-Lien; Hung, Ping-Cheng; Lin, Hung-Yen; Hsieh, Chung-Ho

    2015-04-01

    Breast cancer is one of the most common causes of cancer mortality. Early detection through mammography screening could significantly reduce mortality from breast cancer. However, most screening methods may consume a large amount of resources. We propose a computational model, based solely on personal health information, for breast cancer risk assessment. Our model can serve as a pre-screening program in a low-cost setting. In our study, the data set, consisting of 3976 records, was collected from Taipei City Hospital from January 1 to December 31, 2008. Based on the dataset, we first apply sampling techniques and a dimension reduction method to preprocess the data. Then, we construct various kinds of classifiers (including basic classifiers, ensemble methods, and cost-sensitive methods) to predict the risk. The cost-sensitive method with a random forest classifier is able to achieve a recall (or sensitivity) of 100%. At a recall of 100%, the precision (positive predictive value, PPV) and specificity of the cost-sensitive method with a random forest classifier were 2.9% and 14.87%, respectively. In our study, we build a breast cancer risk assessment model using data mining techniques. Our model has the potential to serve as an assisting tool in breast cancer screening.
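The mechanics of cost-sensitive classification, the ingredient that pushes recall toward 100%, can be shown with a minimal decision rule rather than the paper's random forest; the cost values below are illustrative assumptions:

```python
# Predict "high risk" when the expected cost of a missed cancer (C_FN)
# outweighs the cost of a false alarm (C_FP); cost values are illustrative.
def cost_sensitive_predict(p_cancer, c_fn=50.0, c_fp=1.0):
    """Positive iff p * C_FN >= (1 - p) * C_FP, i.e. p >= C_FP / (C_FP + C_FN).

    Raising C_FN lowers the decision threshold, which is how cost-sensitive
    learning trades precision for recall (toward 100% sensitivity).
    """
    threshold = c_fp / (c_fp + c_fn)
    return p_cancer >= threshold

# Default costs give a threshold of 1/51 (about 0.0196), so even a 3%
# predicted probability is flagged for follow-up screening.
```

The very low precision (2.9%) reported at 100% recall is exactly what such an aggressive threshold implies for a rare condition.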

  8. Probabilistic risk analysis toward cost-effective 3S (safety, safeguards, security) implementation

    NASA Astrophysics Data System (ADS)

    Suzuki, Mitsutoshi; Mochiji, Toshiro

    2014-09-01

    Probabilistic Risk Analysis (PRA) was introduced in the safety field several decades ago, and advanced nuclear countries have already adopted this methodology in their own regulatory systems. However, PRA has not yet been developed for safeguards and security because of the inherent difficulty of modeling intentional and malicious acts. In this paper, probabilistic proliferation and risk analysis based on a random process is applied to a hypothetical reprocessing process and a physical protection system in a nuclear reactor, using the Markov model originally developed by the Proliferation Resistance and Physical Protection Working Group (PRPPWG) of the Generation IV International Forum (GIF). Through the challenge of quantifying security risk with a frequency in this model, an integrated risk notion among the 3S, pursuing cost-effective installation of those countermeasures, is discussed in a holistic manner.

  9. Probabilistic risk analysis toward cost-effective 3S (safety, safeguards, security) implementation

    SciTech Connect

    Suzuki, Mitsutoshi; Mochiji, Toshiro

    2014-09-30

    Probabilistic Risk Analysis (PRA) was introduced in the safety field several decades ago, and advanced nuclear countries have already adopted this methodology in their own regulatory systems. However, PRA has not yet been developed for safeguards and security because of the inherent difficulty of modeling intentional and malicious acts. In this paper, probabilistic proliferation and risk analysis based on a random process is applied to a hypothetical reprocessing process and a physical protection system in a nuclear reactor, using the Markov model originally developed by the Proliferation Resistance and Physical Protection Working Group (PRPPWG) of the Generation IV International Forum (GIF). Through the challenge of quantifying security risk with a frequency in this model, an integrated risk notion among the 3S, pursuing cost-effective installation of those countermeasures, is discussed in a holistic manner.
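The Markov-model idea can be illustrated with a toy absorbing chain; the states and transition probabilities below are hypothetical and far simpler than the PRPPWG model:

```python
# Toy intrusion/diversion sequence. State 0 = attempt in progress,
# state 1 = detected (absorbing), state 2 = adversary success (absorbing).
P = [
    [0.60, 0.30, 0.10],  # from "attempt": continue, detected, success
    [0.00, 1.00, 0.00],  # detected stays detected
    [0.00, 0.00, 1.00],  # success stays success
]

def absorption_prob(P, start=0, target=2, steps=200):
    """Probability mass absorbed in `target` after iterating the chain."""
    dist = [0.0] * len(P)
    dist[start] = 1.0
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return dist[target]

# Analytically: 0.1 / (1 - 0.6) = 0.25. Multiplying by an assumed attempt
# frequency (per year) yields the frequency-based security risk discussed.
```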

  10. DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)

    SciTech Connect

    Young, K. R.; Augustine, C.; Anderson, A.

    2010-02-01

    This presentation summarizes the methodology and results of a probabilistic risk analysis of research, development, and demonstration work, primarily for enhanced geothermal systems (EGS), sponsored by the U.S. Department of Energy Geothermal Technologies Program.

  11. American Airlines Propeller STOL Transport Economic Risk Analysis

    NASA Technical Reports Server (NTRS)

    Ransone, B.

    1972-01-01

    A Monte Carlo risk analysis on the economics of STOL transports in air passenger traffic established the probability of making the expected internal rate of financial return, or better, in a hypothetical regular Washington/New York intercity operation.
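The probability-of-return calculation described above can be sketched with a small Monte Carlo loop; the cost and revenue distributions, the 15-year horizon, and the 10% hurdle rate are all hypothetical stand-ins for the study's inputs:

```python
import random

random.seed(42)  # reproducible sketch

def npv(rate, cashflows):
    """Net present value of cashflows indexed by year (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def prob_meeting_irr(hurdle=0.10, trials=10_000):
    """Monte Carlo estimate of P(IRR >= hurdle) for a hypothetical STOL route.

    For conventional cashflows (one outflow, then inflows), IRR >= hurdle
    iff NPV at the hurdle rate is non-negative. The uniform ranges below
    are illustrative assumptions only.
    """
    hits = 0
    for _ in range(trials):
        investment = random.uniform(80, 120)     # $M fleet and startup cost
        net_revenue = random.uniform(10, 25)     # $M per operating year
        cashflows = [-investment] + [net_revenue] * 15
        if npv(hurdle, cashflows) >= 0:
            hits += 1
    return hits / trials
```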

  12. Economic Risk Analysis of Experimental Cropping Systems Using the SMART Risk Tool

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Recently, a variant of stochastic dominance called stochastic efficiency with respect to a function (SERF) has been developed and applied. Unlike traditional stochastic dominance approaches, SERF uses the concept of certainty equivalents (CEs) to rank a set of risk-efficient alternatives instead of ...
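The certainty-equivalent (CE) ranking at the heart of SERF can be sketched under a common choice of utility function; the negative-exponential utility and the net-return figures below are illustrative assumptions, not the SMART tool's data:

```python
import math

def certainty_equivalent(outcomes, r):
    """CE under negative-exponential utility with absolute risk aversion r.

    CE = -(1/r) * ln(mean(exp(-r * x))); as r -> 0 the CE approaches the mean.
    SERF ranks alternatives by CE across a whole range of r values.
    """
    if r == 0:
        return sum(outcomes) / len(outcomes)
    return -math.log(sum(math.exp(-r * x) for x in outcomes) / len(outcomes)) / r

# Hypothetical net returns ($/acre) for two cropping systems:
conventional = [120, 150, 90, 160, 130]
experimental = [60, 260, 20, 280, 150]   # higher mean, much riskier

# A risk-neutral decision maker (r = 0) prefers the experimental system,
# but the preference flips as risk aversion grows:
prefers_experimental = {
    r: certainty_equivalent(experimental, r) > certainty_equivalent(conventional, r)
    for r in (0.0, 0.01, 0.05)
}
```

Plotting CE against r for each alternative is exactly the SERF chart: wherever one curve lies above another, that alternative is preferred at that risk-aversion level.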

  13. A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Non-Small Cell Lung Cancer.

    PubMed

    Raju, G K; Gurumurthi, K; Domike, R; Kazandjian, D; Blumenthal, G; Pazdur, R; Woodcock, J

    2016-12-01

    Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analyses. There is much interest in quantifying regulatory approaches to benefit and risk. In this work, a quantitative benefit-risk analysis was applied to regulatory decision-making about new drugs to treat advanced non-small cell lung cancer (NSCLC). Benefits and risks associated with 20 US Food and Drug Administration (FDA) decisions on a set of candidate treatments submitted between 2003 and 2015 were analyzed. For the benefit analysis, median overall survival (OS) was used where available. When not available, OS was estimated based on overall response rate (ORR) or progression-free survival (PFS). Risks were analyzed based on the magnitude (or severity) of harm and likelihood of occurrence. Additionally, a sensitivity analysis was explored to demonstrate analysis of systematic uncertainty. The FDA approval decision outcomes considered were found to be consistent with the benefit-risk logic.

  14. Applied Behavior Analysis: Its Impact on the Treatment of Mentally Retarded Emotionally Disturbed People.

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Coe, David A.

    1992-01-01

    This article reviews applications of the applied behavior analysis ideas of B. F. Skinner and others to persons with both mental retardation and emotional disturbance. The review examines implications of behavior analysis for operant conditioning and radical behaviorism, schedules of reinforcement, and emotion and mental illness. (DB)

  15. Sociosexuality Education for Persons with Autism Spectrum Disorders Using Principles of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Wolfe, Pamela S.; Condo, Bethany; Hardaway, Emily

    2009-01-01

    Applied behavior analysis (ABA) has emerged as one of the most effective empirically based strategies for instructing individuals with autism spectrum disorders (ASD). Four ABA-based strategies that have been found effective are video modeling, visual strategies, social script fading, and task analysis. Individuals with ASD often struggle with…

  16. Applied Behaviour Analysis and Intellectual Disability: A Long-Term Relationship?

    ERIC Educational Resources Information Center

    Remington, Bob

    1998-01-01

    This evaluative review describes the history of applied behavior analysis in the area of developmental disability and its strengths and weaknesses. Emphasis is placed on the fact that behavior analysis can continue to provide valuable insights into the education and treatment of people with mental retardation. (Author/CR)

  17. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  18. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    ERIC Educational Resources Information Center

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  19. USAWC Coronary Risk and Fitness Analysis

    DTIC Science & Technology

    1980-06-04


  20. Environmental risk assessment in GMO analysis.

    PubMed

    Pirondini, Andrea; Marmiroli, Nelson

    2010-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture, expressing traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage in the different countries is related to the different positions concerning labelling of GMO products: based on the principle of substantial equivalence, or rather based on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs in the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislation requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm for human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity.

  1. Environmental risk assessment in GMO analysis.

    PubMed

    Pirondini, Andrea; Marmiroli, Nelson

    2008-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture, expressing traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage in the different countries is related to the different positions concerning labelling of GMO products: based on the principle of substantial equivalence, or rather based on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs in the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislation requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm for human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity.

  2. A Cost-Benefit Analysis Applied to Example Proposals for Army Training and Education Research

    DTIC Science & Technology

    2008-02-01

    ARI Research Note 2008-01: A Cost-Benefit Analysis Applied to Example Proposals for Army Training and Education Research, by John E. Morrison, J. Dexter..., 2008 (contracts DASW01-04-C-0003 and W74V8H-05-C...). The elements of the current analysis were 21 proposed R&D efforts derived from concepts discussed in the workshop. Total costs were calculated in two ways: (1

  3. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision-making environment sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
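The simplest of the Bayesian data-assessment methods such a handbook covers is the conjugate Beta-Binomial update of a demand failure probability; the prior and the observed counts below are illustrative, not from the document:

```python
# Conjugate Beta-Binomial update of a failure-on-demand probability.
def beta_update(alpha, beta, failures, trials):
    """Posterior Beta(alpha', beta') after observing `failures` in `trials` demands."""
    return alpha + failures, beta + trials - failures

# Jeffreys prior Beta(0.5, 0.5), then 2 failures observed in 100 demands:
a, b = beta_update(0.5, 0.5, failures=2, trials=100)
posterior_mean = a / (a + b)   # point estimate of the failure probability
```

The same update applied sequentially, as new demands accumulate, is what makes the approach attractive for a living information repository.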

  4. The semantic distinction between "risk" and "danger": a linguistic analysis.

    PubMed

    Boholm, Max

    2012-02-01

    The analysis combines frame semantic and corpus linguistic approaches in analyzing the role of agency and decision making in the semantics of the words "risk" and "danger" (both nominal and verbal uses). In frame semantics, the meanings of "risk" and of related words, such as "danger," are analyzed against the background of a specific cognitive-semantic structure (a frame) comprising frame elements such as Protagonist, Bad Outcome, Decision, Possession, and Source. Empirical data derive from the British National Corpus (100 million words). Results indicate both similarities and differences in use. First, both "risk" and "danger" are commonly used to represent situations having potential negative consequences as the result of agency. Second, "risk" and "danger," especially their verbal uses (to risk, to endanger), differ in agent-victim structure, i.e., "risk" is used to express that a person affected by an action is also the agent of the action, while "endanger" is used to express that the one affected is not the agent. Third, "risk," but not "danger," tends to be used to represent rational and goal-directed action. The results therefore to some extent confirm the analysis of "risk" and "danger" suggested by German sociologist Niklas Luhmann. As a point of discussion, the present findings arguably have implications for risk communication.

  5. Risk Analysis for Unintentional Slide Deployment During Airline Operations.

    PubMed

    Ayra, Eduardo S; Insua, David Ríos; Castellanos, María Eugenia; Larbi, Lydia

    2015-09-01

    We present a risk analysis undertaken to mitigate problems in relation to the unintended deployment of slides under normal operations within a commercial airline. This type of incident entails relevant costs for the airline industry. After assessing the likelihood and severity of its consequences, we conclude that such risks need to be managed. We then evaluate the effectiveness of various countermeasures, describing and justifying the chosen ones. We also discuss several issues faced when implementing and communicating the proposed measures, thus fully illustrating the risk analysis process.

  6. Land Use Adaptation Strategies Analysis in Landslide Risk Region

    NASA Astrophysics Data System (ADS)

    Lin, Yu-Ching; Chang, Chin-Hsin; Chen, Ying-Tung

    2013-04-01

    In order to respond to the impact of climate and environmental change on Taiwanese mountain regions, this study used the GTZ (2004) risk analysis guidelines to assess landslide risk for 178 Taiwanese mountain towns. This study used 7 indicators to assess landslide risk: rainfall distribution, natural environment vulnerability (e.g., rainfall threshold criterion for debris flow, historical disaster frequency, landslide ratio, and road density), physical vulnerability (e.g., population density), and socio-economic vulnerability (e.g., population with higher education, death rate, and income). The landslide risk map can be obtained by multiplying the 7 indicators together and ranking the product. The map had 5 risk ranges; towns within the range of 4 to 5 are high landslide risk regions and have high priority for risk reduction. This study collected the regions with high landslide risk and analyzed the differences after Typhoon Morakot (2009). The spatial distribution showed that after significant environmental damage, high landslide risk regions moved from central to southern Taiwan. The changing pattern of risk regions points out the necessity of updating the risk map periodically. Based on the landslide risk map and the land use investigation data provided by the National Land Surveying and Mapping Center in 2007, this study calculated the size of the land use area with landslide disaster risk. Based on the above results and discussion, this study can be used to suggest appropriate land use adaptation strategies for reducing landslide risk under the impact of climate and environmental change.
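The multiply-then-rank step can be sketched directly; the town names and indicator scores below are hypothetical, and each score is assumed pre-scaled to a common 1-5 range matching the map's 5 risk ranges:

```python
from math import prod

# Hypothetical towns with 7 indicator scores each (rainfall, debris-flow
# threshold, disaster history, landslide ratio, road density, population
# density, socio-economic score) - not the study's data.
towns = {
    "A": [5, 4, 5, 4, 3, 4, 5],
    "B": [1, 2, 1, 2, 1, 1, 2],
    "C": [3, 3, 2, 4, 3, 2, 3],
    "D": [4, 5, 4, 5, 4, 5, 4],
    "E": [2, 1, 2, 1, 2, 2, 1],
}

def risk_ranges(towns, n_ranks=5):
    """Multiply the 7 indicators per town, then rank products into n_ranks bins."""
    products = {name: prod(v) for name, v in towns.items()}
    order = sorted(products, key=products.get)
    return {name: 1 + pos * n_ranks // len(order)
            for pos, name in enumerate(order)}
```

Ranking the product (rather than binning its raw value) keeps the classification stable even though the product scale is highly skewed.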

  7. Environmental risk analysis of hazardous material rail transportation.

    PubMed

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials.
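The structure of the per-route risk estimate can be shown with a minimal sketch; the traffic, accident-rate, release-probability, and cleanup-cost figures below are placeholders, not HMTECM outputs:

```python
# Expected annual environmental cost for one shipment route.
def route_annual_risk(carloads, miles, accident_rate_per_car_mile,
                      p_release, expected_cleanup_cost):
    """Risk = car-miles * accident rate * P(release | accident) * cleanup cost."""
    car_miles = carloads * miles
    expected_releases = car_miles * accident_rate_per_car_mile * p_release
    return expected_releases * expected_cleanup_cost

risk = route_annual_risk(carloads=12_000, miles=800,
                         accident_rate_per_car_mile=1e-7,
                         p_release=0.05, expected_cleanup_cost=1.5e6)
# Dividing by car-miles (or ton-miles) gives the per-unit risk figures
# used to compare chemicals in the abstract.
```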

  8. Spherical harmonic decomposition applied to spatial-temporal analysis of human high-density electroencephalogram

    NASA Astrophysics Data System (ADS)

    Wingeier, B. M.; Nunez, P. L.; Silberstein, R. B.

    2001-11-01

    We demonstrate an application of spherical harmonic decomposition to the analysis of the human electroencephalogram (EEG). We implement two methods and discuss issues specific to the analysis of hemispherical, irregularly sampled data. Spatial sampling requirements and performance of the methods are quantified using simulated data. The analysis is applied to experimental EEG data, confirming earlier reports of an approximate frequency-wave-number relationship in some bands.

  9. Dynamic optimization of ISR sensors using a risk-based reward function applied to ground and space surveillance scenarios

    NASA Astrophysics Data System (ADS)

    DeSena, J. T.; Martin, S. R.; Clarke, J. C.; Dutrow, D. A.; Newman, A. J.

    2012-06-01

    As the number and diversity of sensing assets available for intelligence, surveillance and reconnaissance (ISR) operations continue to expand, the limited ability of human operators to effectively manage, control and exploit the ISR ensemble is exceeded, leading to reduced operational effectiveness. Automated support, both in the processing of voluminous sensor data and in sensor asset control, can relieve the burden on human operators and support operation of larger ISR ensembles. In dynamic environments it is essential to react quickly to current information to avoid stale, sub-optimal plans. Our approach is to apply the principles of feedback control to ISR operations, "closing the loop" from the sensor collections through automated processing to ISR asset control. Previous work by the authors demonstrated non-myopic multiple platform trajectory control using a receding horizon controller in a closed feedback loop with a multiple hypothesis tracker, applied to multi-target search and track simulation scenarios in the ground and space domains. This paper presents extensions in both size and scope of the previous work, demonstrating closed-loop control, involving both platform routing and sensor pointing, of a multi-sensor, multi-platform ISR ensemble tasked with providing situational awareness and performing search, track and classification of multiple moving ground targets in irregular warfare scenarios. The closed-loop ISR system is fully realized using distributed, asynchronous components that communicate over a network. The closed-loop ISR system has been exercised via a networked simulation test bed against a scenario in the Afghanistan theater implemented using high-fidelity terrain and imagery data. In addition, the system has been applied to space surveillance scenarios requiring tracking of space objects, where current deliberative, manually intensive processes for managing sensor assets are insufficiently responsive. Simulation experiment results are presented.

  10. Selenium Exposure and Cancer Risk: an Updated Meta-analysis and Meta-regression

    PubMed Central

    Cai, Xianlei; Wang, Chen; Yu, Wanqi; Fan, Wenjie; Wang, Shan; Shen, Ning; Wu, Pengcheng; Li, Xiuyang; Wang, Fudi

    2016-01-01

    The objective of this study was to investigate the associations between selenium exposure and cancer risk. We identified 69 studies and applied meta-analysis, meta-regression and dose-response analysis to obtain the available evidence. The results indicated that high selenium exposure had a protective effect on cancer risk (pooled OR = 0.78; 95%CI: 0.73–0.83). The results of linear and nonlinear dose-response analyses indicated that high serum/plasma selenium and toenail selenium were effective in cancer prevention. However, we did not find a protective effect of selenium supplementation. High selenium exposure may have different effects on specific types of cancer. It decreased the risk of breast cancer, lung cancer, esophageal cancer, gastric cancer, and prostate cancer, but it was not associated with colorectal cancer, bladder cancer, or skin cancer. PMID:26786590

  11. Selenium Exposure and Cancer Risk: an Updated Meta-analysis and Meta-regression.

    PubMed

    Cai, Xianlei; Wang, Chen; Yu, Wanqi; Fan, Wenjie; Wang, Shan; Shen, Ning; Wu, Pengcheng; Li, Xiuyang; Wang, Fudi

    2016-01-20

    The objective of this study was to investigate the associations between selenium exposure and cancer risk. We identified 69 studies and applied meta-analysis, meta-regression and dose-response analysis to obtain the available evidence. The results indicated that high selenium exposure had a protective effect on cancer risk (pooled OR = 0.78; 95%CI: 0.73-0.83). The results of linear and nonlinear dose-response analyses indicated that high serum/plasma selenium and toenail selenium were effective in cancer prevention. However, we did not find a protective effect of selenium supplementation. High selenium exposure may have different effects on specific types of cancer. It decreased the risk of breast cancer, lung cancer, esophageal cancer, gastric cancer, and prostate cancer, but it was not associated with colorectal cancer, bladder cancer, or skin cancer.
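The pooled odds ratio reported above comes from inverse-variance weighting of per-study log-ORs; the mechanics can be sketched as follows, with three made-up (OR, 95% CI) entries standing in for the 69 real studies:

```python
import math

# Inverse-variance fixed-effect pooling of study odds ratios.
studies = [  # (odds ratio, 95% CI lower, 95% CI upper) - illustrative values
    (0.70, 0.55, 0.89),
    (0.85, 0.60, 1.20),
    (0.75, 0.62, 0.91),
]

def pooled_or(studies):
    """Pool log-ORs weighted by 1/SE^2, recovering each SE from the CI width."""
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of the log-OR
        w = 1.0 / se ** 2
        num += w * math.log(or_)
        den += w
    log_pooled, se_pooled = num / den, math.sqrt(1.0 / den)
    ci = (math.exp(log_pooled - 1.96 * se_pooled),
          math.exp(log_pooled + 1.96 * se_pooled))
    return math.exp(log_pooled), ci
```

A random-effects analysis, as typically used when heterogeneity is present, would add a between-study variance term to each weight.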

  12. Risk Assessment of Infrastructure System of Systems with Precursor Analysis.

    PubMed

    Guo, Zhenyu; Haimes, Yacov Y

    2016-08-01

    Physical infrastructure systems are commonly composed of interconnected and interdependent subsystems, which in their essence constitute a system of systems (S-o-S). System owners and policy researchers need tools to foresee potential emergent forced changes and to understand their impact so that effective risk management strategies can be developed. We develop a systemic framework for precursor analysis to support the design of an effective and efficient precursor monitoring and decision support system with the ability to (i) identify and prioritize indicators of evolving risks of system failure; and (ii) evaluate uncertainties in precursor analysis to support informed and rational decision making. This integrated precursor analysis framework comprises three processes: precursor identification, prioritization, and evaluation. We use an example of a highway bridge S-o-S to demonstrate the theories and methodologies of the framework. Bridge maintenance processes involve many interconnected and interdependent functional subsystems and decision-making entities, and bridge failure can have broad social and economic consequences. The precursor analysis framework, which constitutes an essential part of risk analysis, examines the impact of various bridge inspection and maintenance scenarios. It enables policy researchers and analysts who are seeking a risk perspective on bridge infrastructure in a policy setting to develop more risk-informed policies and to create guidelines to efficiently allocate limited risk management resources and mitigate severe consequences resulting from bridge failures.

  13. Cost-Benefit Analysis for Optimization of Risk Protection Under Budget Constraints.

    PubMed

    Špačková, Olga; Straub, Daniel

    2015-05-01

    Cost-benefit analysis (CBA) is commonly applied as a tool for deciding on risk protection. With CBA, one can identify risk mitigation strategies that lead to an optimal tradeoff between the costs of the mitigation measures and the achieved risk reduction. In practical applications of CBA, the strategies are typically evaluated through efficiency indicators such as the benefit-cost ratio (BCR) and the marginal cost (MC) criterion. In many of these applications, the BCR is not consistently defined, which, as we demonstrate in this article, can lead to the identification of suboptimal solutions. This is of particular relevance when the overall budget for risk reduction measures is limited and an optimal allocation of resources among different subsystems is necessary. We show that this problem can be formulated as a hierarchical decision problem, where the general rules and decisions on the available budget are made at a central level (e.g., central government agency, top management), whereas the decisions on the specific measures are made at the subsystem level (e.g., local communities, company division). It is shown that the MC criterion provides optimal solutions in such hierarchical optimization. Since most practical applications only include a discrete set of possible risk protection measures, the MC criterion is extended to this situation. The findings are illustrated through a hypothetical numerical example. This study was prepared as part of our work on the optimal management of natural hazard risks, but its conclusions also apply to other fields of risk management.

  14. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming

    2008-01-01

    This paper describes a systems engineering approach to resource planning by integrating mathematical modeling and constrained optimization, empirical simulation, and theoretical analysis techniques to generate an optimal task plan in the presence of uncertainties.

  15. Evaluation of bitterness in white wine applying descriptive analysis, time-intensity analysis, and temporal dominance of sensations analysis.

    PubMed

    Sokolowsky, Martina; Fischer, Ulrich

    2012-06-30

    Bitterness in wine, especially in white wine, is a complex and sensitive topic, as it is a persistent sensation with negative connotations for consumers. However, the molecular basis of bitter taste in white wines is still largely unknown. At the same time, studies dealing with bitterness have to cope with the temporal dynamics of bitter perception. The most common method to describe bitter taste is the static measurement among other attributes during a descriptive analysis. A less frequently applied method, time-intensity analysis, evaluates the temporal gustatory changes focusing on bitterness alone. The most recently developed multidimensional approach, the temporal dominance of sensations method, reveals the temporal dominance of bitter taste in relation to other attributes. In order to compare the results obtained with these different sensory methodologies, 13 commercial white wines were evaluated by the same panel. To facilitate a statistical comparison, parameters were extracted from bitterness curves obtained from time-intensity and temporal dominance of sensations analysis and were compared to bitter intensity as well as bitter persistency based on descriptive analysis. Analysis of variance significantly differentiated the wines on all measured bitterness parameters obtained from the three sensory techniques. Comparing the information of all sensory parameters by multiple factor analysis and correlation, each technique provided additional valuable information regarding the complex bitter perception in white wine.

  16. Walking the line: Understanding pedestrian behaviour and risk at rail level crossings with cognitive work analysis.

    PubMed

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Stanton, Neville A

    2016-03-01

    Pedestrian fatalities at rail level crossings (RLXs) are a public safety concern for governments worldwide. There is little literature examining pedestrian behaviour at RLXs and no previous studies have adopted a formative approach to understanding behaviour in this context. In this article, cognitive work analysis is applied to understand the constraints that shape pedestrian behaviour at RLXs in Melbourne, Australia. The five phases of cognitive work analysis were developed using data gathered via document analysis, behavioural observation, walk-throughs and critical decision method interviews. The analysis demonstrates the complex nature of pedestrian decision making at RLXs and the findings are synthesised to provide a model illustrating the influences on pedestrian decision making in this context (i.e. time, effort and social pressures). Further, the CWA outputs are used to inform an analysis of the risks to safety associated with pedestrian behaviour at RLXs and the identification of potential interventions to reduce risk.

  17. Advanced dynamical risk analysis for monitoring anaerobic digestion process.

    PubMed

    Hess, Jonathan; Bernard, Olivier

    2009-01-01

    Methanogenic fermentation involves a natural ecosystem that can be used for wastewater treatment. This anaerobic process can have two locally stable steady states and an unstable one, making the process hard to handle. The aim of this work is to propose analytical criteria to detect hazardous working modes, namely situations where the system evolves towards the acidification of the plant. We first introduce a commonly used simplified model and recall its main properties. To assess the evolution of the system, we study the phase plane and split it into nineteen zones according to some qualitative traits. Then a methodology is introduced to monitor in real time the trajectory of the system across these zones and determine its position in the plane. It leads to a dynamical risk index based on the analysis of the transitions from one zone to another, and generates a classification of the zones according to their dangerousness. Finally, the proposed strategy is applied to a virtual process based on model ADM1. It is worth noting that the proposed approach does not rely on the value of the parameters and is thus very robust.

  18. Risk analysis of tyramine concentration in food production

    NASA Astrophysics Data System (ADS)

    Doudová, L.; Buňka, F.; Michálek, J.; Sedlačík, M.; Buňková, L.

    2013-10-01

    The contribution is focused on risk analysis in food microbiology. This paper evaluates the effect of selected factors on tyramine production in bacterial strains of the Lactococcus genus which were identified as tyramine producers. Tyramine is a biogenic amine synthesized from an amino acid called tyrosine. It can be found in certain foodstuffs (often in cheese), and can cause a pseudo-response in sensitive individuals. The above-mentioned bacteria are commonly used in the biotechnological process of cheese production as starter cultures. The levels of the factors were chosen with respect to the conditions which can occur in this technological process. To describe and compare tyramine production in the chosen microorganisms, generalized regression models were applied. Tyramine production was modelled by Gompertz curves according to the selected factors (lactose concentration of 0-1% w/v, NaCl 0-2% w/v, and aero/anaerobiosis) for 3 different types of bacterial cultivation. Moreover, estimates of the model parameters were calculated and tested; multiple comparisons were discussed as well. The aim of this paper is to find a combination of factors leading to a similar tyramine production level.
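Gompertz modelling of the kind described can be sketched as follows. The reparameterized (Zwietering-style) Gompertz form is a common choice for microbial production curves; the parameter names and all values below are illustrative, not taken from the study:

```python
import math

def gompertz(t, a, mu, lam):
    """Reparameterized Gompertz curve for a production process.

    a: asymptotic tyramine concentration, mu: maximum production rate,
    lam: lag time. Illustrative parameterization, not the paper's fit."""
    return a * math.exp(-math.exp(mu * math.e / a * (lam - t) + 1.0))

# One hypothetical cultivation condition: asymptote 120 mg/L,
# max rate 8 mg/(L*h), lag 10 h; sample the curve every 10 h
a, mu, lam = 120.0, 8.0, 10.0
curve = [gompertz(t, a, mu, lam) for t in range(0, 101, 10)]
```

Fitting such a curve to measured concentrations (and comparing parameter estimates across factor combinations, as the paper does) would typically be done with nonlinear least squares.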

  19. Assessing population exposure for landslide risk analysis using dasymetric cartography

    NASA Astrophysics Data System (ADS)

    Garcia, Ricardo A. C.; Oliveira, Sérgio C.; Zêzere, José L.

    2016-12-01

    Assessing the number and locations of exposed people is a crucial step in landslide risk management and emergency planning. The available population statistical data frequently have insufficient detail for an accurate assessment of the people potentially exposed to hazardous events, particularly when these occur at the local scale, as landslides do. The present study aims to apply dasymetric cartography to improve the spatial resolution of population data and to assess the potentially exposed population. An additional objective is to compare the results with those obtained with a more common approach that uses basic census units as spatial units, these being the most disaggregated and detailed spatial data available for regional studies in Portugal. Considering the Portuguese census data and a layer of residential building footprints, which was used as ancillary information, the number of exposed inhabitants differs significantly according to the approach used. When the census unit approach is used, considering the three highest landslide susceptibility classes, the number of exposed inhabitants is in general overestimated. Despite the uncertainties associated with a general cost-benefit analysis, the presented methodology seems to be a reliable approach for obtaining a first approximation of a more detailed estimate of exposed people. The approach based on dasymetric cartography allows the spatial resolution of population data over large areas to be increased and enables the use of detailed landslide susceptibility maps, which are valuable for improving the assessment of the exposed population.
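A minimal sketch of the dasymetric step: a census unit's population is redistributed over its residential building footprints, here simply in proportion to footprint area, and exposure is summed over footprints inside susceptible zones. The proportional-to-area weighting and the field layout are assumptions for illustration, not the paper's exact scheme:

```python
def dasymetric_population(census_population, buildings):
    """Redistribute one census unit's population over its residential
    building footprints, proportional to footprint area.

    buildings: list of (footprint_area_m2, in_susceptible_zone) tuples.
    Returns (population assigned to each building, total exposed population)."""
    total_area = sum(area for area, _ in buildings)
    per_building = [census_population * area / total_area for area, _ in buildings]
    exposed = sum(pop for pop, (_, in_zone) in zip(per_building, buildings) if in_zone)
    return per_building, exposed

# One census unit with 300 inhabitants and three buildings, of which
# only the first lies in a high-susceptibility class
pops, exposed = dasymetric_population(300, [(500.0, True),
                                            (1000.0, False),
                                            (1500.0, False)])
```

In a real workflow the footprint/zone intersection would come from a GIS overlay; the point here is only the reallocation arithmetic.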

  20. Applying Association Rule of the Data Mining Method for the Network Event Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Wankyung; Soh, Wooyoung

    2007-12-01

    Network event analysis gives useful information on the network status that helps protect it from attacks. It involves finding sets of frequently used packet information, such as IP addresses, and requires real-time processing by its nature. This paper applies association rules to network event analysis. Association rules, originally used for data mining, can be applied to find frequent item sets; if certain item sets occur frequently on the network, the information system can infer a possible threat. However, existing association-rule algorithms such as Apriori are not suitable for analyzing network events in real time because of their high CPU and memory usage and consequently low processing speed. This paper develops a network event audit module by applying association rules to network events using a new algorithm instead of the Apriori algorithm. Test results show that the new algorithm yields drastically lower CPU and memory usage for network event analysis compared with the existing Apriori algorithm.
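A toy illustration of frequent-item-set mining on packet attributes. The abstract does not specify the replacement algorithm, so this sketch uses a single-pass pair counter as a stand-in, which avoids Apriori's repeated passes over the data; the event encoding is invented for illustration:

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(events, min_support):
    """One-pass count of frequently co-occurring packet attributes.

    events: iterable of sets, e.g. {source IP, destination port} per packet.
    Returns the pairs seen in at least min_support events."""
    counts = Counter()
    for event in events:
        for pair in combinations(sorted(event), 2):  # canonical order for counting
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

# Hypothetical packet events: attribute sets of (IP, port) strings
events = [
    {"10.0.0.1", "80"}, {"10.0.0.1", "80"},
    {"10.0.0.1", "443"}, {"10.0.0.2", "80"},
]
hot = frequent_pairs(events, min_support=2)
```

Only the pair that co-occurs twice survives the support threshold; an audit module could then flag item sets whose frequency jumps abnormally.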

  1. Is adaptation or transformation needed? Active nanomaterials and risk analysis

    NASA Astrophysics Data System (ADS)

    Kuzma, Jennifer; Roberts, John Patrick

    2016-07-01

    Nanotechnology has been a key area of funding and policy for the United States and globally for the past two decades. Since nanotechnology research and development became a focus and nanoproducts began to permeate the market, scholars and scientists have been concerned about how to assess the risks that they may pose to human health and the environment. The newest generation of nanomaterials includes biomolecules that can respond to and influence their environments, and there is a need to explore whether and how existing risk-analysis frameworks are challenged by such novelty. To fill this niche, we used a modified approach of upstream oversight assessment (UOA), a subset of anticipatory governance. We first selected case studies of "active nanomaterials" that are early in research and development and designed for use in multiple sectors, and then considered them under several key risk-analysis frameworks. We found two ways in which the cases challenge the frameworks. The first category relates to how to assess risk under a narrow framing of the term (direct health and environmental harm), and the second involves the definition of what constitutes a "risk" worthy of assessment and consideration in decision making. In light of these challenges, we propose some changes for risk analysis in the face of active nanostructures in order to improve risk governance.

  2. Theoretically Motivated Interventions for Reducing Sexual Risk Taking in Adolescence: A Randomized Controlled Experiment Applying Fuzzy-trace Theory

    PubMed Central

    Reyna, Valerie F.; Mills, Britain A.

    2014-01-01

    Fuzzy-trace theory is a theory of memory, judgment, and decision-making, and their development. We applied advances in this theory to increase the efficacy and durability of a multicomponent intervention to promote risk reduction and avoidance of premature pregnancy and STIs. 734 adolescents from high schools and youth programs in three states (Arizona, Texas, and New York) were randomly assigned to one of three curriculum groups: RTR (Reducing the Risk), RTR+ (a modified version of RTR using fuzzy-trace theory), and a control group. We report effects of curriculum on self-reported behaviors and behavioral intentions plus psychosocial mediators of those effects, namely, attitudes and norms, motives to have sex or get pregnant, self-efficacy and behavioral control, and gist/verbatim constructs. Among 26 outcomes, 19 showed an effect of at least one curriculum relative to the control group: RTR+ produced improvements for 17 outcomes and RTR produced improvements for 12 outcomes. For RTR+, two differences (for perceived parental norms and global benefit perception) were confined to age, gender, or racial/ethnic subgroups. Effects of RTR+ on sexual initiation emerged six months after the intervention, when many adolescents became sexually active. Effects of RTR+ were greater than RTR for nine outcomes, and remained significantly greater than controls at one-year follow-up for 12 outcomes. Consistent with fuzzy-trace theory, results suggest that, by emphasizing gist representations, which are preserved over long time periods and are key memories used in decision-making, the enhanced intervention produced larger and more sustained effects on behavioral outcomes and psychosocial mediators of adolescent risk-taking. PMID:24773191

  3. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    2009-01-18

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program (the Program). The analysis was performed by Princeton Energy Resources International, LLC, in support of the National Renewable Energy Laboratory on behalf of the Program. The main challenge in the analysis lies in translating R&D results into a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE).
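The metric named above, levelized cost of energy, has a standard textbook form: discounted lifetime costs divided by discounted lifetime generation. A sketch with purely illustrative numbers, not the report's model:

```python
def lcoe(capital, annual_om, annual_energy_kwh, discount_rate, years):
    """Levelized cost of energy in $/kWh: discounted lifetime costs
    divided by discounted lifetime generation (textbook formulation)."""
    disc_costs = capital + sum(annual_om / (1 + discount_rate) ** t
                               for t in range(1, years + 1))
    disc_energy = sum(annual_energy_kwh / (1 + discount_rate) ** t
                      for t in range(1, years + 1))
    return disc_costs / disc_energy

# Hypothetical plant: $4M capital, $100k/yr O&M, 8 GWh/yr, 7% discount, 30 yr
value = lcoe(capital=4_000_000, annual_om=100_000,
             annual_energy_kwh=8_000_000, discount_rate=0.07, years=30)
```

A probabilistic risk analysis of the kind the report describes would sample the inputs (drilling cost, capacity factor, etc.) from distributions and report the resulting LCOE distribution rather than a single value.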

  4. Germany wide seasonal flood risk analysis for agricultural crops

    NASA Astrophysics Data System (ADS)

    Klaus, Stefan; Kreibich, Heidi; Kuhlmann, Bernd; Merz, Bruno; Schröter, Kai

    2016-04-01

    In recent years, large-scale flood risk analysis and mapping has gained attention. Regional to national risk assessments are needed, for example, for national risk policy development, for large-scale disaster management planning, and in the (re-)insurance industry. Despite increasing requests for comprehensive risk assessments, some sectors have received little scientific attention; one of these is the agricultural sector. In contrast to other sectors, agricultural crop losses depend strongly on the season. Flood probability also shows seasonal variation. Thus, the temporal superposition of high flood susceptibility of crops and high flood probability plays an important role in agricultural flood risk. To investigate this interrelation and provide a large-scale overview of agricultural flood risk in Germany, an agricultural crop loss model is used for crop susceptibility analyses, and Germany-wide seasonal flood-frequency analyses are undertaken to derive seasonal flood patterns. As a result, a Germany-wide map of agricultural flood risk is presented, along with the crop type most at risk in each region. The risk maps may provide guidance for a coordinated designation of retention areas across federal states.

  5. State of the art in benefit-risk analysis: medicines.

    PubMed

    Luteijn, J M; White, B C; Gunnlaugsdóttir, H; Holm, F; Kalogeras, N; Leino, O; Magnússon, S H; Odekerken, G; Pohjola, M V; Tijhuis, M J; Tuomisto, J T; Ueland, Ø; McCarron, P A; Verhagen, H

    2012-01-01

    Benefit-risk assessment in medicine has been a valuable tool in the regulation of medicines since the 1960s. Benefit-risk assessment takes place in multiple stages during a medicine's life-cycle and can be conducted in a variety of ways, using methods ranging from qualitative to quantitative. Each benefit-risk assessment method is subject to its own specific strengths and limitations. Despite its widespread and long-time use, benefit-risk assessment in medicine is subject to debate, suffers from a number of limitations, and is still under development. This state-of-the-art review will discuss the various aspects of and approaches to benefit-risk assessment in medicine in chronological order. The review will discuss all types of benefit-risk assessment a medicinal product undergoes during its life-cycle, from Phase I clinical trials to post-marketing surveillance and health technology assessment for inclusion in public formularies. The benefit-risk profile of a drug is dynamic and differs for different indications and patient groups. At the end of this review, we conclude that benefit-risk analysis in medicine is a developed practice subject to continuous improvement and modernisation. Improvement not only in methodology but also in cooperation between organizations can improve benefit-risk assessment.

  6. Evaluation of dam overtopping risk based on univariate and bivariate flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Goodarzi, E.; Mirzaei, M.; Shui, L. T.; Ziaei, M.

    2011-11-01

    There is a growing tendency to assess the safety levels of existing dams based on risk and uncertainty analysis using mathematical and statistical methods. This research presents the application of risk and uncertainty analysis to dam overtopping based on univariate and bivariate flood frequency analyses, applying the Gumbel logistic distribution, for the Doroudzan earth-fill dam in the south of Iran. The bivariate frequency analysis resulted in six inflow hydrographs with a joint return period of 100 yr. The overtopping risks were computed for all of those hydrographs, considering the quantile of flood peak discharge (in particular 100-yr), the initial depth of water in the reservoir, and the discharge coefficient of the spillway as uncertain variables. The maximum height of the water, as the most important factor in the overtopping analysis, was evaluated using reservoir routing, and the Monte Carlo and Latin hypercube techniques were applied for uncertainty analysis. Finally, the results achieved using both univariate and bivariate frequency analysis were compared to show the significance of bivariate analyses for dam overtopping.
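The Monte Carlo step described above can be sketched crudely: sample the three uncertain variables, compute a maximum water level, and count exceedances of the crest. The distributions, constants, and the one-line routing surrogate below are invented for illustration; they are not the Doroudzan study's reservoir routing model:

```python
import random

def overtopping_probability(crest_height, n=50_000, seed=1):
    """Crude Monte Carlo estimate of overtopping probability.

    Maximum water level = uncertain initial depth + a surcharge that
    grows with flood peak and shrinks with spillway discharge
    coefficient. All distributions and constants are illustrative."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        initial_depth = rng.uniform(40.0, 45.0)   # m, uncertain start level
        peak_flow = rng.lognormvariate(7.0, 0.3)  # m^3/s, 100-yr flood peak
        c_d = rng.uniform(0.55, 0.65)             # spillway discharge coefficient
        surcharge = 0.004 * peak_flow / c_d       # simplistic routing surrogate
        if initial_depth + surcharge > crest_height:
            failures += 1
    return failures / n

p = overtopping_probability(crest_height=55.0)
```

Latin hypercube sampling, also used in the study, would stratify each input's range to reach a stable estimate with far fewer samples than plain Monte Carlo.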

  7. Geotechnical risk analysis by flat dilatometer (DMT)

    NASA Astrophysics Data System (ADS)

    Amoroso, Sara; Monaco, Paola

    2015-04-01

    In the last decades we have witnessed a massive migration from laboratory testing to in situ testing, to the point that, today, in situ testing is often the major part of a geotechnical investigation. The state of the art indicates that direct-push in situ tests, such as the Cone Penetration Test (CPT) and the Flat Dilatometer Test (DMT), are fast and convenient in situ tests for routine site investigation. In most cases the DMT-estimated parameters, in particular the undrained shear strength su and the constrained modulus M, are used with the common design methods of geotechnical engineering for evaluating bearing capacity, settlements, etc. The paper focuses on the prediction of settlements of shallow foundations, which is probably the No. 1 application of the DMT, especially in sands, where undisturbed samples cannot be retrieved, and on the risk associated with their design. A compilation of documented case histories comparing DMT-predicted vs. observed settlements, collected by Monaco et al. (2006), indicates that, in general, the constrained modulus M can be considered a reasonable "operative modulus" (relevant to foundations in "working conditions") for settlement predictions based on the traditional linear elastic approach. Indeed, the use of a site investigation method, such as the DMT, that improves the accuracy of design parameters reduces risk, and the design can then center on the site's true soil variability without parasitic test variability. In this respect, Failmezger et al. (1999, 2015) suggested introducing the Beta probability distribution, which provides a realistic and useful description of variability for geotechnical design problems. The paper estimates the Beta probability distribution at research sites where DMT tests and observed settlements are available. References: Failmezger, R.A., Rom, D., Ziegler, S.R. (1999). "SPT? A better approach of characterizing residual soils using other in-situ tests", Behavioral Characteristics of Residual Soils, B

  8. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
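Of the loss functions named above, the classic (unrevised) Taguchi loss is the simplest to sketch: a quadratic penalty on a process variable's deviation from target, calibrated at one known cost point. The numbers below are illustrative, not from the case studies:

```python
def taguchi_loss(x, target, cost_at_limit, limit_deviation):
    """Quadratic (Taguchi) loss: cost grows with the square of the
    deviation of a process variable from its target. Calibrated so the
    loss equals cost_at_limit when the deviation reaches limit_deviation."""
    k = cost_at_limit / limit_deviation ** 2  # loss coefficient
    return k * (x - target) ** 2

# Hypothetical: reactor temperature 5 degrees off a 100-degree target,
# where a 10-degree deviation is known to cost $40,000
loss = taguchi_loss(105.0, target=100.0,
                    cost_at_limit=40_000.0, limit_deviation=10.0)
```

The paper's revised Taguchi and modified inverted normal forms bound the loss at large deviations, unlike this unbounded quadratic; integrating such per-deviation losses gives the scenario's total economic consequence.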

  9. Approach to proliferation risk assessment based on multiple objective analysis framework

    SciTech Connect

    Andrianov, A.; Kuptsov, I.

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Second, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Third, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  10. International publication trends in the Journal of Applied Behavior Analysis: 2000-2014.

    PubMed

    Martin, Neil T; Nosik, Melissa R; Carr, James E

    2016-06-01

    Dymond, Clarke, Dunlap, and Steiner's (2000) analysis of international publication trends in the Journal of Applied Behavior Analysis (JABA) from 1970 to 1999 revealed low numbers of publications from outside North America, leading the authors to express concern about the lack of international involvement in applied behavior analysis. They suggested that a future review would be necessary to evaluate any changes in international authorship in the journal. As a follow-up, we analyzed non-U.S. publication trends in the most recent 15 years of JABA and found similar results. We discuss potential reasons for the relative paucity of international authors and suggest potential strategies for increasing non-U.S. contributions to the advancement of behavior analysis.

  11. Predictive Validity of Pressure Ulcer Risk Assessment Tools for Elderly: A Meta-Analysis.

    PubMed

    Park, Seong-Hi; Lee, Young-Shin; Kwon, Young-Mi

    2016-04-01

    Preventing pressure ulcers is one of the most challenging goals for today's health care providers. The accuracy with which currently used tools predict the risk of pressure ulcer development is rarely evaluated, especially in older adults. The current study aimed at providing a systematic review and meta-analysis of 29 studies using three pressure ulcer risk assessment tools: the Braden, Norton, and Waterlow Scales. The overall predictive validities for pressure ulcer risk, in terms of pooled sensitivity and specificity, indicated a similar range with a moderate accuracy level for all three scales, while heterogeneity showed more than 80% variability among studies. The studies applying the Braden Scale used five different cut-off points, representing the primary cause of heterogeneity. Results indicate that commonly used screening tools for pressure ulcer risk have limitations regarding validity and accuracy for use with older adults due to heterogeneity among studies.

  12. State of the art in benefit-risk analysis: introduction.

    PubMed

    Verhagen, H; Tijhuis, M J; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken, G; Pohjola, M V; Tuomisto, J T; Ueland, Ø; White, B C; Holm, F

    2012-01-01

    Risk-taking is normal in everyday life if there are associated (perceived) benefits. Benefit-Risk Analysis (BRA) compares the risk of a situation to its related benefits and addresses the acceptability of the risk. Over the past years, BRA in relation to food and food ingredients has gained attention. Food, and even the same food ingredient, may confer both beneficial and adverse effects. Measures directed at food safety may lead to suboptimal or insufficient levels of ingredients from a benefit perspective. In BRA, benefits and risks of food (ingredients) are assessed in one go and may conditionally be expressed in one currency. This allows the comparison of adverse and beneficial effects to be qualitative and quantitative. A BRA should help policy-makers to make more informed and balanced benefit-risk management decisions. Not allowing food benefits to occur in order to guarantee food safety is a risk management decision much the same as accepting some risk in order to achieve more benefits. BRA in food and nutrition is making progress, but difficulties remain. The field may benefit from looking across its borders to learn from other research areas. The BEPRARIBEAN project (Best Practices for Risk-Benefit Analysis: experience from out of food into food; http://en.opasnet.org/w/Bepraribean) aims to do so, by working together with Medicines, Food Microbiology, Environmental Health, Economics & Marketing-Finance and Consumer Perception. All perspectives are reviewed and subsequently integrated to identify opportunities for further development of BRA for food and food ingredients. Interesting issues that emerge are the varying degrees of risk that are deemed acceptable within the areas and the trend towards more open and participatory BRA processes. A set of 6 'state of the art' papers covering the above areas and a paper integrating the separate (re)views are published in this volume.

  13. Credibility analysis of risk classes by generalized linear model

    NASA Astrophysics Data System (ADS)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, the generalized linear model (GLM) and credibility theory, which are frequently used in non-life insurance pricing, are combined for reliability analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the results for credible risk classes are interpreted.
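The limited-fluctuation (full credibility standard) step can be sketched as follows, under the textbook Poisson claim-frequency assumption; all inputs are illustrative, not the Turkish claim data:

```python
import math
from statistics import NormalDist

def credibility_estimate(n_claims, class_mean, portfolio_mean,
                         prob=0.95, tolerance=0.05):
    """Limited-fluctuation credibility for a claim frequency.

    Full-credibility standard (Poisson assumption): n_full = (z / k)^2,
    with z the standard normal quantile for `prob` and k the tolerated
    relative error. Partial credibility Z = sqrt(n / n_full), capped at 1;
    the estimate blends the class mean with the portfolio mean."""
    z = NormalDist().inv_cdf((1.0 + prob) / 2.0)  # two-sided quantile
    n_full = (z / tolerance) ** 2                 # ~1537 claims at 95%/5%
    cred = min(1.0, math.sqrt(n_claims / n_full))
    return cred * class_mean + (1.0 - cred) * portfolio_mean

# A risk class with 385 observed claims, class frequency 0.12 vs portfolio 0.10
est = credibility_estimate(385, class_mean=0.12, portfolio_mean=0.10)
```

With 385 claims the class gets roughly half credibility, so the estimate lands about midway between the class and portfolio means; a GLM would supply the class and portfolio means from rating factors rather than raw averages.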

  14. Advanced uncertainty modelling for container port risk analysis.

    PubMed

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed by incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HE safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to those in traditional port risk analysis, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) on a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real-time risk ranking is required to measure, predict, and improve the associated system safety performance.

  15. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    PubMed

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs on these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low risk, moderate risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data was compared to an expert 'map' of risk-perception, to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases.

  16. Sensitivity and uncertainty analysis of a regulatory risk model

    SciTech Connect

    Kumar, A.; Manocha, A.; Shenoy, T.

    1999-07-01

    Health risk assessments (HRAs) are increasingly being used in the environmental decision-making process, from problem identification through final cleanup activities. A key issue concerning the results of these risk assessments is the uncertainty associated with them. In past studies, this uncertainty has been attributed to highly conservative estimates of risk assessment parameters. The primary purpose of this study was to investigate error propagation through a risk model. A hypothetical glass plant situated in the state of California was studied. Air emissions from this plant were modeled using the ISCST2 model, and the risk was calculated using the ACE2588 model. Building downwash was also considered during the concentration calculations. A sensitivity analysis on the risk computations identified five parameters--mixing depth for human consumption, deposition velocity, weathering constant, interception factor for vine crops, and average leaf vegetable consumption--which had the greatest impact on the calculated risk. A Monte Carlo analysis using these five parameters produced an output distribution whose percentage standard deviation was smaller than those of the input parameters.
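The Monte Carlo step described above can be sketched generically: sample the sensitive parameters from assumed distributions, push each draw through the risk model, and examine the spread of the output. The toy multiplicative model and lognormal input choices below are assumptions for illustration, not the ISCST2/ACE2588 equations:

```python
import random
import statistics

random.seed(42)  # reproducible draws

def toy_risk(mixing_depth: float, dep_velocity: float, weathering: float) -> float:
    """Illustrative multiplicative exposure model (not the regulatory model)."""
    return dep_velocity * mixing_depth / weathering

samples = []
for _ in range(10_000):
    md = random.lognormvariate(0.0, 0.3)    # mixing depth (relative units)
    dv = random.lognormvariate(-1.0, 0.3)   # deposition velocity
    wc = random.lognormvariate(0.5, 0.3)    # weathering constant
    samples.append(toy_risk(md, dv, wc))

mean = statistics.mean(samples)
cv = statistics.stdev(samples) / mean       # output's relative spread
```

Comparing `cv` against the relative spreads of the inputs is exactly the kind of check the abstract reports: correlated or compensating terms can make the output's percentage deviation smaller than that of any single input.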

  17. Risk Analysis and Decision Making FY 2013 Milestone Report

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward; Thompson, J.

    2013-06-01

    Risk analysis and decision making is one of the critical objectives of CCSI, which seeks to use information from science-based models with quantified uncertainty to inform decision makers who are making large capital investments. The goal of this task is to develop tools and capabilities to facilitate the development of risk models tailored for carbon capture technologies, quantify the uncertainty of model predictions, and estimate the technical and financial risks associated with the system. This effort aims to reduce costs by identifying smarter demonstrations, which could accelerate development and deployment of the technology by several years.

  18. Comprehensive analysis of an ecological risk assessment of the Daliao River estuary, China.

    PubMed

    Yu, Ge; Chen, Jing; Zhang, Xueqing; Li, Zhengyan

    2013-08-01

    At present, most estuarine ecological risk studies are based on terrestrial ecosystem models, which ignore spatial heterogeneity. The Daliao River estuary shares characteristics with many estuaries in China, and we used it as the study area to formulate an estuarine ecological risk evaluation model. Targeting the estuary's particular hydrodynamic conditions, the model incorporated variables influenced by human activities and used them as the major factors for partitioning sections of the river according to risk values. It also explored the spatial and temporal distribution patterns of estuarine ecological risk. The results showed that, on the whole, the ecological risk of the Daliao River estuary area was relatively high. At the temporal level, runoff was the main factor driving differences in ecological risk, while at the spatial level, the ecological risk index was affected by pollutants carried by runoff from upstream, as well as downstream pollution emissions and dilution by seawater at the river mouth. The characteristics of this model make it possible to simulate the spatial and temporal risk distribution in different regions and under different rainfall regimes. The model can thus be applied in other estuarine areas and provides technical support for the analysis and control of ecological degradation in estuaries.

  19. Hip fracture risk estimation based on principal component analysis of QCT atlas: a preliminary study

    NASA Astrophysics Data System (ADS)

    Li, Wenjun; Kornak, John; Harris, Tamara; Lu, Ying; Cheng, Xiaoguang; Lang, Thomas

    2009-02-01

    We aim to capture and apply three-dimensional bone fragility features for fracture risk estimation. Using inter-subject image registration, we constructed a hip QCT atlas comprising 37 patients with hip fractures and 38 age-matched controls. In the hip atlas space, we performed principal component analysis to identify the principal components (eigenimages) that showed association with hip fracture. To develop and test a hip fracture risk model based on the principal components, we randomly divided the 75 QCT scans into two groups, one serving as the training set and the other as the test set. We applied this model to estimate a fracture risk index for each test subject, and used the fracture risk indices to discriminate the fracture patients from the controls. To evaluate the fracture discrimination efficacy, we performed ROC analysis and calculated the AUC (area under the curve). When using the first group for training and the second for testing, the AUC was 0.880, compared to conventional fracture risk estimation methods based on bone densitometry, which had AUC values ranging between 0.782 and 0.871. When using the second group for training, the AUC was 0.839, compared to densitometric methods with AUC values ranging between 0.767 and 0.807. Our results demonstrate that principal components derived from a hip QCT atlas are associated with hip fracture. Such features may provide new quantitative measures of interest for osteoporosis assessment.
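The discrimination step relies on ROC analysis. A minimal sketch of computing the AUC from per-subject risk indices, using the Mann-Whitney formulation (AUC equals the probability that a randomly chosen case scores above a randomly chosen control, ties counting half); the scores below are invented, not the paper's data:

```python
def auc(scores_pos, scores_neg) -> float:
    """AUC via pairwise comparison (Mann-Whitney U / (n_pos * n_neg))."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5   # ties contribute half a win
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical fracture-risk indices for cases and controls (illustrative)
fracture = [0.90, 0.80, 0.75, 0.60]
control  = [0.70, 0.50, 0.40, 0.30]

discrimination = auc(fracture, control)   # 1.0 = perfect, 0.5 = chance
```

This O(n*m) form is fine for study-sized samples; rank-based implementations are preferable at scale.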

  20. Dietary Patterns and Pancreatic Cancer Risk: A Meta-Analysis.

    PubMed

    Lu, Pei-Ying; Shu, Long; Shen, Shan-Shan; Chen, Xu-Jiao; Zhang, Xiao-Yan

    2017-01-05

    A number of studies have examined the associations between dietary patterns and pancreatic cancer risk, but the findings have been inconclusive. Herein, we conducted this meta-analysis to assess the associations between dietary patterns and the risk of pancreatic cancer. MEDLINE (provided by the National Library of Medicine) and EBSCO (Elton B. Stephens Company) databases were searched for relevant articles published up to May 2016 that identified common dietary patterns. Thirty-two studies met the inclusion criteria and were finally included in this meta-analysis. A reduced risk of pancreatic cancer was shown for the highest compared with the lowest categories of healthy patterns (odds ratio, OR = 0.86; 95% confidence interval, CI: 0.77-0.95; p = 0.004) and light-moderate drinking patterns (OR = 0.90; 95% CI: 0.83-0.98; p = 0.02). There was evidence of an increased risk for pancreatic cancer in the highest compared with the lowest categories of western-type pattern (OR = 1.24; 95% CI: 1.06-1.45; p = 0.008) and heavy drinking pattern (OR = 1.29; 95% CI: 1.10-1.48; p = 0.002). The results of this meta-analysis demonstrate that healthy and light-moderate drinking patterns may decrease the risk of pancreatic cancer, whereas western-type and heavy drinking patterns may increase the risk of pancreatic cancer. Additional prospective studies are needed to confirm these findings.
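Pooled results like OR = 0.86 (95% CI: 0.77-0.95) typically come from fixed-effect inverse-variance weighting on the log-odds scale, with each study's standard error recovered from its confidence interval. A sketch under that assumption, with made-up study inputs (the abstract does not list per-study numbers):

```python
from math import log, exp

def pooled_or(results) -> float:
    """Fixed-effect inverse-variance pooling of odds ratios.
    results: iterable of (OR, lower95, upper95) tuples."""
    num = den = 0.0
    for or_, lo, hi in results:
        se = (log(hi) - log(lo)) / (2 * 1.96)   # SE of log-OR from the 95% CI
        w = 1.0 / se ** 2                       # inverse-variance weight
        num += w * log(or_)
        den += w
    return exp(num / den)

# Two illustrative "healthy pattern" studies (invented numbers)
pooled = pooled_or([(0.80, 0.65, 0.98), (0.90, 0.75, 1.08)])
```

A real meta-analysis would also test heterogeneity and likely fall back to a random-effects model; this sketch shows only the core weighting arithmetic.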

  1. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated from pilot experience at the time they were incorporated into the standards documents. As a result, some of these standards may be overly conservative, while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is, however, no published methodology for estimating the safety level provided by the existing OLS standards, and the rationale used by ICAO to establish them is not readily available in the standards documents. This study therefore collects actual flight path data from air traffic control radar and constructs a methodology to assess the probability of aircraft deviating from their intended/protected path. The methodology can be extended to estimate OLS dimensions that provide an acceptable safety level, helping to establish safe and efficient standard dimensions and to assess the risk that objects around airports pose to aircraft operations. To assess the existing standards and demonstrate the methodology, three case studies were conducted using aircraft data collected from Ottawa (CYOW), Calgary (CYYC), and Edmonton (CYEG) International Airports.
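The core computation, estimating the probability that an aircraft strays beyond a protected surface from observed track data, can be sketched as follows. The deviation values and the normality assumption are illustrative only; real track deviations are typically heavy-tailed, which is precisely why empirical radar data matter:

```python
from statistics import NormalDist, mean, stdev

# Hypothetical lateral deviations (metres) of aircraft from the intended
# track, as might be extracted from ATC radar returns
deviations = [-40, -25, -10, -5, 0, 5, 8, 12, 20, 35]

# Fit a normal model to the observed deviations (strong simplification)
model = NormalDist(mean(deviations), stdev(deviations))

def exceedance_prob(offset_m: float) -> float:
    """Probability an aircraft deviates beyond +/- offset_m of the track."""
    return model.cdf(-offset_m) + (1.0 - model.cdf(offset_m))

p_75 = exceedance_prob(75.0)   # chance of penetrating a surface 75 m out
```

Sweeping `offset_m` until `exceedance_prob` falls below a target level of safety gives the kind of data-driven surface dimension the study aims at.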

  2. Risk analysis of 222Rn gas received from East Anatolian Fault Zone in Turkey

    NASA Astrophysics Data System (ADS)

    Yilmaz, Mucahit; Kulahci, Fatih

    2016-06-01

    In this study, risk analysis and probability distribution methodologies are applied to 222Rn gas data from the Sürgü (Malatya) station, located on the East Anatolian Fault Zone (EAFZ). The 222Rn concentrations were recorded between 21.02.2007 and 06.06.2010; in total, 1151 measurements were used. Changes in 222Rn concentration are modeled statistically.

  3. A quantitative analysis of fish consumption and stroke risk.

    PubMed

    Bouzan, Colleen; Cohen, Joshua T; Connor, William E; Kris-Etherton, Penny M; Gray, George M; König, Ariane; Lawrence, Robert S; Savitz, David A; Teutsch, Steven M

    2005-11-01

    Although a rich source of n-3 polyunsaturated fatty acids (PUFAs) that may confer multiple health benefits, some fish contain methyl mercury (MeHg), which may harm the developing fetus. U.S. government recommendations for women of childbearing age are to modify consumption of high-MeHg fish to reduce MeHg exposure, while recommendations encourage fish consumption among the general population because of the nutritional benefits. The Harvard Center for Risk Analysis convened an expert panel (see acknowledgements) to quantify the net impact of resulting hypothetical changes in fish consumption across the population. This paper estimates the impact of fish consumption on stroke risk. Other papers quantify coronary heart disease mortality risk and the impacts of both prenatal MeHg exposure and maternal intake of n-3 PUFAs on cognitive development. This analysis identified articles in a recent qualitative literature review that are appropriate for the development of a dose-response relationship between fish consumption and stroke risk. Studies had to satisfy quality criteria, quantify fish intake, and report the precision of the relative risk estimates. The analysis combined the relative risk results, weighting each proportionately to its precision. Six studies were identified as appropriate for inclusion in this analysis, including five prospective cohort studies and one case-control study (total of 24 exposure groups). Our analysis indicates that any fish consumption confers substantial relative risk reduction compared to no fish consumption (12% for the linear model), with the possibility that additional consumption confers incremental benefits (central estimate of 2.0% per serving per week).
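The abstract's central estimates (about 12% relative risk reduction for any fish intake versus none, and roughly 2.0% more per additional weekly serving) can be turned into a toy dose-response curve. The multiplicative form below is an assumption for illustration, not the panel's fitted model:

```python
def stroke_rr(servings_per_week: float) -> float:
    """Relative risk of stroke vs. a non-consumer, sketched from the
    abstract's central estimates: ~12% reduction for any intake, plus
    ~2.0% per additional weekly serving (multiplicative assumption)."""
    if servings_per_week <= 0:
        return 1.0                                   # non-consumer baseline
    return (1 - 0.12) * (1 - 0.02) ** (servings_per_week - 1)

rr_1 = stroke_rr(1)   # first serving captures the bulk of the benefit
rr_3 = stroke_rr(3)   # incremental servings add smaller reductions
```

The shape matches the abstract's qualitative claim: a large step from zero to any consumption, then modest incremental gains per serving.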

  4. Evolution of Applied Behavior Analysis in the Treatment of Individuals With Autism

    ERIC Educational Resources Information Center

    Wolery, Mark; Barton, Erin E.; Hine, Jeffrey F.

    2005-01-01

    Two issues of each volume of the Journal of Applied Behavior Analysis were reviewed to identify research reports focusing on individuals with autism. The identified articles were analyzed to describe the ages of individuals with autism, the settings in which the research occurred, the nature of the behaviors targeted for intervention, and the…

  5. Using Applied Behaviour Analysis as Standard Practice in a UK Special Needs School

    ERIC Educational Resources Information Center

    Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan

    2015-01-01

    This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…

  6. Applied Behavior Analysis in the Treatment of Severe Psychiatric Disorders: A Bibliography.

    ERIC Educational Resources Information Center

    Scotti, Joseph R.; And Others

    Clinical research in the area of severe psychiatric disorders constituted the major focus for the discipline of applied behavior analysis during the early 1960s. Recently, however, there appears to be a notable lack of a behavioral focus within many inpatient psychiatric settings and a relative dearth of published behavioral treatment studies with…

  7. Applied Behavior Analysis in Autism Spectrum Disorders: Recent Developments, Strengths, and Pitfalls

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Rieske, Robert; Tureck, Kimberly; Matson, Michael L.

    2012-01-01

    Autism has become one of the most heavily researched topics in the field of mental health and education. While genetics has been the most studied of all topics, applied behavior analysis (ABA) has also received a great deal of attention, and has arguably yielded the most promising results of any research area to date. The current paper provides a…

  8. Graphical and Numerical Descriptive Analysis: Exploratory Tools Applied to Vietnamese Data

    ERIC Educational Resources Information Center

    Haughton, Dominique; Phong, Nguyen

    2004-01-01

    This case study covers several exploratory data analysis ideas, the histogram and boxplot, kernel density estimates, the recently introduced bagplot--a two-dimensional extension of the boxplot--as well as the violin plot, which combines a boxplot with a density shape plot. We apply these ideas and demonstrate how to interpret the output from these…

  9. Applied Behavior Analysis Programs for Autism: Sibling Psychosocial Adjustment during and Following Intervention Use

    ERIC Educational Resources Information Center

    Cebula, Katie R.

    2012-01-01

    Psychosocial adjustment in siblings of children with autism whose families were using a home-based, applied behavior analysis (ABA) program was compared to that of siblings in families who were not using any intensive autism intervention. Data gathered from parents, siblings and teachers indicated that siblings in ABA families experienced neither…

  10. A Self-Administered Parent Training Program Based upon the Principles of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Maguire, Heather M.

    2012-01-01

    Parents often respond to challenging behavior exhibited by their children in such a way that unintentionally strengthens it. Applied behavior analysis (ABA) is a research-based science that has been proven effective in remediating challenging behavior in children. Although many parents could benefit from using strategies from the field of ABA with…

  11. A National UK Census of Applied Behavior Analysis School Provision for Children with Autism

    ERIC Educational Resources Information Center

    Griffith, G. M.; Fletcher, R.; Hastings, R. P.

    2012-01-01

    Over more than a decade, specialist Applied Behavior Analysis (ABA) schools or classes for children with autism have developed in the UK and Ireland. However, very little is known internationally about how ABA is defined in practice in school settings, the characteristics of children supported in ABA school settings, and the staffing structures…

  12. Applied Behaviour Analysis: Does Intervention Intensity Relate to Family Stressors and Maternal Well-Being?

    ERIC Educational Resources Information Center

    Schwichtenberg, A.; Poehlmann, J.

    2007-01-01

    Background: Interventions based on applied behaviour analysis (ABA) are commonly recommended for children with an autism spectrum disorder (ASD); however, few studies address how this intervention model impacts families. The intense requirements that ABA programmes place on children and families are often cited as a critique of the programme,…

  13. Conversation after Right Hemisphere Brain Damage: Motivations for Applying Conversation Analysis

    ERIC Educational Resources Information Center

    Barnes, Scott; Armstrong, Elizabeth

    2010-01-01

    Despite the well documented pragmatic deficits that can arise subsequent to Right Hemisphere Brain Damage (RHBD), few researchers have directly studied everyday conversations involving people with RHBD. In recent years, researchers have begun applying Conversation Analysis (CA) to the everyday talk of people with aphasia. This research programme…

  14. A Case Study in the Misrepresentation of Applied Behavior Analysis in Autism: The Gernsbacher Lectures

    PubMed Central

    Morris, Edward K

    2009-01-01

    I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life. (Tolstoy, 1894) This article presents a case study in the misrepresentation of applied behavior analysis for autism based on Morton Ann Gernsbacher's presentation of a lecture titled “The Science of Autism: Beyond the Myths and Misconceptions.” Her misrepresentations involve the characterization of applied behavior analysis, descriptions of practice guidelines, reviews of the treatment literature, presentations of the clinical trials research, and conclusions about those trials (e.g., children's improvements are due to development, not applied behavior analysis). The article also reviews applied behavior analysis' professional endorsements and research support, and addresses issues in professional conduct. It ends by noting the deleterious effects that misrepresenting any research on autism (e.g., biological, developmental, behavioral) has on our understanding and treatment of it in a transdisciplinary context. PMID:22478522

  15. Lovaas Model of Applied Behavior Analysis. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2010

    2010-01-01

    The "Lovaas Model of Applied Behavior Analysis" is a type of behavioral therapy that initially focuses on discrete trials: brief periods of one-on-one instruction, during which a teacher cues a behavior, prompts the appropriate response, and provides reinforcement to the child. Children in the program receive an average of 35 to 40 hours…

  16. Says Who?: Students Apply Their Critical-Analysis Skills to Fight Town Hall

    ERIC Educational Resources Information Center

    Trimarchi, Ruth

    2002-01-01

    For some time the author looked for a tool to let students apply what they are learning about critical analysis in the science classroom to a relevant life experience. The opportunity occurred when a proposal to use environmentally friendly cleaning products in town buildings appeared on the local town meeting agenda. Using a copy of the proposal…

  17. Logical Criteria Applied in Writing and in Editing by Text Analysis.

    ERIC Educational Resources Information Center

    Mandersloot, Wim G. B.

    1996-01-01

    Argues that technical communication editing is most effective if it deals with structure first, and that structure deficiencies can be detected by applying a range of logical analysis criteria to each text part. Concludes that lists, headings, classifications, and organograms must comply with the laws of categorization and relevant logical…

  18. Benefit-Risk Analysis for Decision-Making: An Approach.

    PubMed

    Raju, G K; Gurumurthi, K; Domike, R

    2016-12-01

    The analysis of benefit and risk is an important aspect of decision-making throughout the drug lifecycle. In this work, the use of a benefit-risk analysis approach to support decision-making was explored. The proposed approach builds on the qualitative US Food and Drug Administration (FDA) approach to include a more explicit analysis based on international standards and guidance that enables aggregation and comparison of benefit and risk on a common basis and a lifecycle focus. The approach is demonstrated on six decisions over the lifecycle (e.g., accelerated approval, withdrawal, and traditional approval) using two case studies: natalizumab for multiple sclerosis (MS) and bedaquiline for multidrug-resistant tuberculosis (MDR-TB).

  19. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.
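A semi-quantitative scheme of the kind described often reduces to rating each factor on a small ordinal scale and aggregating. The scoring rule, factor names, and all ratings below are invented for illustration and are not the paper's actual methodology:

```python
# Hypothetical semi-quantitative scoring: each factor rated 1 (low) to 5 (high)
risks = {                              # (likelihood, severity)
    "gene flow to wild relatives":   (2, 3),
    "non-target insect harm":        (1, 2),
}
benefits = {                           # (likelihood, magnitude)
    "reduced insecticide use":       (5, 4),
    "higher yield for smallholders": (4, 3),
}

def score(table) -> int:
    """Sum of likelihood x impact over all factors in the table."""
    return sum(l * s for l, s in table.values())

risk_score = score(risks)
benefit_score = score(benefits)
net = benefit_score - risk_score   # > 0 favours release, on this toy scale
```

The appeal for resource-limited regulators is exactly what this sketch shows: the inputs are transparent ordinal judgments, and the arithmetic is simple enough to audit by hand.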

  20. An analysis program for occupational cohort mortality and update cancer risk in copper miners.

    PubMed

    Chen, R

    1996-01-01

    The author developed a computer analysis system to deal with data from occupational follow-up studies, including: (1) input and administration of data; (2) calculation of person-years at risk and follow-up rate; (3) standardised mortality ratios for all cause-of-death categories; (4) differences and trends in cancer risks among subcategories defined by variables such as year and age at death, year and age at start of exposure, duration of exposure, time since first exposure, and job title; and (5) life expectancy analysis. The system is explained and applied to an updated cohort of copper miners. The computed results showed that the SMR for all cancers was elevated at 129 (95% CI 117-142). The SMR increased over calendar periods, and a higher risk of cancer death was found among miners employed in the 1950s. Miners exposed at a younger age had a greater chance of developing cancer. The risk of cancer death increased with time since first exposure, and more strongly with duration of exposure. The SMR for cancer in underground miners reached 137, a statistically significant elevation. The overall analysis suggests that occupational exposure (possibly silica dust) can be considered a risk factor for cancer among these copper miners. The life expectancy analysis indicated that deaths from circulatory system disease accounted for the greatest loss of life expectancy among the miners.
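SMR figures like 129 (95% CI 117-142) follow directly from observed and expected death counts. A sketch using a log-normal approximation for the interval; the observed/expected split below is illustrative, chosen only so the point estimate reproduces the abstract's 129:

```python
from math import sqrt, exp

def smr_with_ci(observed: int, expected: float):
    """Standardised mortality ratio (x100) with an approximate 95% CI.
    Uses the log-normal approximation exp(+/- 1.96 / sqrt(O)), which is
    adequate when the observed count is large."""
    smr = 100.0 * observed / expected
    half_width = exp(1.96 / sqrt(observed))
    return smr, smr / half_width, smr * half_width

# Hypothetical counts: 400 observed cancer deaths vs. 310.1 expected
smr, lo, hi = smr_with_ci(400, 310.1)
```

For small observed counts, exact Poisson limits (or Byar's approximation) should replace the log-normal shortcut.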