Science.gov

Sample records for applying risk analysis

  1. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    PubMed

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares four different risk analysis tools developed for confined spaces by applying them to three hazardous scenarios. The tools were: (1) a checklist without risk estimation (Tool A); (2) a checklist with a risk scale (Tool B); (3) a risk calculation without a formal hazard identification stage (Tool C); and (4) a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of the more analytic tools in less time. Their main limitations were a lack of contextual information for the identified hazards and a greater dependency on the user's expertise and ability to tackle hazards of different natures. Tools C and D utilized more systematic approaches than Tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of (1) its comprehensive structure with respect to the steps suggested in risk management, (2) its dynamic approach to hazard identification, and (3) its use of data resulting from the risk analysis. PMID:26864350
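
    Tool D's last step, a risk matrix, is simple to illustrate. Below is a minimal sketch of a severity-by-likelihood matrix in Python; the 1-4 rating scales and the band cut-offs are hypothetical, not those of the tool reviewed in the paper.

```python
# Minimal severity x likelihood risk matrix in the spirit of Tool D.
# The 1-4 scales and band cut-offs are hypothetical.

RISK_BANDS = ["low", "moderate", "high", "critical"]

def risk_level(severity: int, likelihood: int) -> str:
    """Map 1-4 severity and likelihood ratings to a risk band."""
    if not (1 <= severity <= 4 and 1 <= likelihood <= 4):
        raise ValueError("ratings must be 1-4")
    score = severity * likelihood  # 1..16
    if score <= 3:
        return RISK_BANDS[0]
    if score <= 6:
        return RISK_BANDS[1]
    if score <= 9:
        return RISK_BANDS[2]
    return RISK_BANDS[3]

# Example: an atmospheric hazard rated severity 4, likelihood 2.
print(risk_level(4, 2))  # -> "high"
```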

  2. A Hygrothermal Risk Analysis Applied to Residential Unvented Attics

    SciTech Connect

    Pallin, Simon B; Kehrer, Manfred

    2013-01-01

    A residential building constructed with an unvented attic is a common roof assembly in the United States. The expected hygrothermal performance and service life of the roof are difficult to estimate due to a number of varying parameters. Typical parameters expected to vary are the climate, direction, and slope of the roof, as well as the radiation properties of the surface material. Further influential parameters are indoor moisture excess, air leakage through the attic floor, and leakage from the air-handling unit and ventilation ducts. In addition, the type of building materials, such as the insulation material and closed- or open-cell spray polyurethane foam, will influence the future performance of the roof. Developing a simulation model of the roof assembly enables a risk and sensitivity analysis, in which the varying parameters most important to the hygrothermal performance can be determined. The model is designed to perform probabilistic simulations using mathematical and hygrothermal calculation tools. The varying input parameters can be chosen from existing measurements, simulations, or standards. An analysis is applied to determine the risk of consequences such as mold growth, rot, or increased energy demand of the HVAC unit. Furthermore, the future performance of the roof can be simulated in different climates to facilitate the design of an efficient and reliable roof construction with the most suitable technical solution, and to determine the most appropriate building materials for a given climate.
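
    The probabilistic approach described can be imitated in miniature: sample the varying parameters, push them through a response model, and count exceedances of a damage criterion. The sketch below does this with invented distributions and a toy stand-in for the hygrothermal model; none of the numbers come from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Invented input distributions standing in for the paper's varying parameters.
moisture_excess = rng.normal(3.0, 1.0, n)      # indoor moisture excess, g/m^3
floor_leakage   = rng.lognormal(-1.0, 0.5, n)  # attic-floor air leakage, ach
duct_leakage    = rng.uniform(0.0, 0.1, n)     # duct leakage, fraction of flow

# Toy response index, NOT a real hygrothermal model; a full hygrothermal
# simulation would sit here in the paper's workflow.
mold_index = 0.8 * moisture_excess + 4.0 * floor_leakage + 10.0 * duct_leakage

threshold = 6.0  # hypothetical mold-growth criterion
print(f"P(mold index > {threshold}) = {np.mean(mold_index > threshold):.3f}")
```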

  3. Risk-informed criticality analysis as applied to waste packages subject to a subsurface igneous intrusion

    NASA Astrophysics Data System (ADS)

    Kimball, Darby Suzan

    Practitioners of many branches of nuclear facility safety use probabilistic risk assessment (PRA) methodology, which evaluates the reliability of a system along with the consequences of various failure states. One important exception is nuclear criticality safety, which traditionally produces binary results (critical or subcritical, based upon value of the effective multiplication factor, keff). For complex systems, criticality safety can benefit from application of the more flexible PRA techniques. A new risk-based technique in criticality safety analysis is detailed. In addition to identifying the most reactive configuration(s) and determining subcriticality, it yields more information about the relative reactivity contributions of various factors. By analyzing a more complete system, confidence that the system will remain subcritical is increased and areas where additional safety features would be most effective are indicated. The first step in the method is to create a criticality event tree (a specialized form of event tree where multiple outcomes stemming from a single event are acceptable). The tree lists events that impact reactivity by changing a system parameter. Next, the value of keff is calculated for the end states using traditional methods like the MCNP code. As calculations progress, the criticality event tree is modified; event branches demonstrated to have little effect on reactivity may be collapsed (thus reducing the total number of criticality runs), and branches may be added if more information is needed to characterize the system. When the criticality event tree is mature, critical limits are determined according to traditional validation techniques. Finally, results are evaluated. Criticality for the system is determined by comparing the value of keff for each end state to the critical limit derived for those cases. The relative contributions of various events to criticality are identified by comparing end states resulting from different
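
    A minimal sketch of the criticality event tree idea: enumerate branch combinations as end states, attach a reactivity perturbation to each branch, and flag end states whose keff exceeds the critical limit. The delta-k values below are invented placeholders for what the dissertation computes with MCNP.

```python
from itertools import product

# Branches of a toy criticality event tree; each branch perturbs keff.
# The delta-k values are invented placeholders for MCNP end-state results.
events = {
    "water_intrusion": {"none": 0.00, "partial": 0.04, "flooded": 0.10},
    "geometry_change": {"intact": 0.00, "slumped": 0.03},
    "moderator_loss":  {"no": 0.00, "yes": -0.02},
}
BASE_KEFF = 0.82        # nominal configuration (illustrative)
CRITICAL_LIMIT = 0.94   # upper subcritical limit from validation (illustrative)

for combo in product(*(events[e].items() for e in events)):
    keff = BASE_KEFF + sum(dk for _, dk in combo)
    state = ", ".join(f"{e}={branch}" for e, (branch, _) in zip(events, combo))
    flag = "EXCEEDS LIMIT" if keff > CRITICAL_LIMIT else "ok"
    print(f"{state}: keff = {keff:.3f} ({flag})")
```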

  4. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  5. Risk-based analysis methods applied to nuclear power plant technical specifications

    SciTech Connect

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1989-03-01

    A computer-aided methodology and practical applications of risk-based evaluation of technical specifications are described. The methodology, developed for use by the utility industry, is a part of the overall process of improving nuclear power plant technical specifications. The SOCRATES computer program uses the results of a probabilistic risk assessment or a system-level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at the plant. The methods and computer program are an integral part of a larger decision process aimed at determining benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns and decreasing labor requirements for test and maintenance activities, with no adverse impacts on risk. The methodology and the SOCRATES computer program have been used extensively to evaluate several actual technical specifications in case studies demonstrating the methods. Summaries of these applications demonstrate the types of results achieved and the usefulness of the risk-based evaluation in improving the technical specifications.
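
    The core trade-off SOCRATES quantifies can be shown with the textbook approximation for a periodically tested standby component: unavailability from undetected failures grows with the surveillance test interval, while test-caused downtime shrinks with it. A sketch, with illustrative numbers not taken from the paper:

```python
# Back-of-the-envelope version of the trade-off SOCRATES evaluates: the mean
# unavailability of a periodically tested standby component as a function of
# the surveillance test interval (STI). All numbers are illustrative.

def mean_unavailability(lam, sti_hours, test_downtime_hours):
    """Classic approximation q ~ lam*T/2 + tau/T for a standby component
    with standby failure rate lam (per hour) tested every T hours."""
    return lam * sti_hours / 2 + test_downtime_hours / sti_hours

lam = 1e-5  # hypothetical standby failure rate, per hour
for sti in (730, 2190, 4380):   # roughly monthly, quarterly, semi-annual
    q = mean_unavailability(lam, sti, test_downtime_hours=4.0)
    print(f"STI {sti:5d} h -> mean unavailability {q:.2e}")
```

    Lengthening the interval cuts test downtime but lets latent failures accumulate; SOCRATES performs the analogous computation at the level of full PRA risk measures.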

  6. Large-scale risk prediction applied to Genetic Analysis Workshop 17 mini-exome sequence data

    PubMed Central

    2011-01-01

    We consider the application of Efron’s empirical Bayes classification method to risk prediction in a genome-wide association study using the Genetic Analysis Workshop 17 (GAW17) data. A major advantage of using this method is that the effect size distribution for the set of possible features is empirically estimated and that all subsequent parameter estimation and risk prediction is guided by this distribution. Here, we generalize Efron’s method to allow for some of the peculiarities of the GAW17 data. In particular, we introduce two ways to extend Efron’s model: a weighted empirical Bayes model and a joint covariance model that allows the model to properly incorporate the annotation information of single-nucleotide polymorphisms (SNPs). In the course of our analysis, we examine several aspects of the possible simulation model, including the identity of the most important genes, the differing effects of synonymous and nonsynonymous SNPs, and the relative roles of covariates and genes in conferring disease risk. Finally, we compare the three methods to each other and to other classifiers (random forest and neural network). PMID:22373389

  7. A review of dendrogeomorphological research applied to flood risk analysis in Spain

    NASA Astrophysics Data System (ADS)

    Díez-Herrero, A.; Ballesteros, J. A.; Ruiz-Villanueva, V.; Bodoque, J. M.

    2013-08-01

    Over the last forty years, applying dendrogeomorphology to palaeoflood analysis has improved estimates of the frequency and magnitude of past floods worldwide. This paper reviews the main results obtained by applying dendrogeomorphology to flood research in several case studies in Central Spain. These dendrogeomorphological studies focused on the following topics: (1) anatomical analysis to understand the physiological response of trees to flood damage and improve sampling efficiency; (2) compiling robust flood chronologies in ungauged mountain streams; (3) determining flow depth and estimating flood discharge using two-dimensional hydraulic modelling, and comparing them with other palaeostage indicators; (4) calibrating hydraulic model parameters (i.e. Manning roughness); and (5) implementing stochastic-based, cost-benefit analysis to select optimal mitigation measures. The progress made in these areas is presented with suggestions for further research to improve the applicability of dendrogeochronology to palaeoflood studies. Further developments will include new methods for better identification of the causes of specific types of flood damage to trees (e.g. tilted trees) or stable isotope analysis of tree rings to identify the climatic conditions associated with periods of increasing flood magnitude or frequency.

  8. Applying Data Envelopment Analysis to Preventive Medicine: A Novel Method for Constructing a Personalized Risk Model of Obesity

    PubMed Central

    Narimatsu, Hiroto; Nakata, Yoshinori; Nakamura, Sho; Sato, Hidenori; Sho, Ri; Otani, Katsumi; Kawasaki, Ryo; Kubota, Isao; Ueno, Yoshiyuki; Kato, Takeo; Yamashita, Hidetoshi; Fukao, Akira; Kayama, Takamasa

    2015-01-01

    Data envelopment analysis (DEA) is a method of operations research that has not yet been applied in the field of obesity research. However, DEA might be used to evaluate individuals' susceptibility to obesity, which could help establish effective risk models for the onset of obesity. Therefore, we conducted this study to evaluate the feasibility of applying DEA to predict obesity, by calculating efficiency scores and evaluating the usefulness of risk models. We evaluated data from the Takahata study, a population-based cohort study (with a follow-up study) of Japanese people older than 40 years. For our analysis, we used the input-oriented Charnes-Cooper-Rhodes model of DEA and defined the decision-making units (DMUs) as individual subjects. The inputs were defined as (1) exercise (measured as calories expended) and (2) the inverse of food intake (measured as calories ingested). The output was defined as the inverse of body mass index (BMI). Using the β coefficients for the participants' single nucleotide polymorphisms, we then calculated their genetic predisposition score (GPS). Both efficiency scores and GPS were available for 1,620 participants from the baseline survey and for 708 participants from the follow-up survey. To evaluate the effects of genetic factors and efficiency score on BMI, and to compare the strengths of the associations, we used multiple linear regression analysis, with BMI as the dependent variable, GPS and efficiency scores as the explanatory variables, and several demographic controls, including age and sex. Our results indicated that all factors were statistically significant (p < 0.05), with an adjusted R2 value of 0.66. Therefore, it is possible to use DEA to predict environmentally driven obesity, and thus to establish a well-fitted model for risk of obesity. PMID:25973987
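
    For readers unfamiliar with DEA, the input-oriented Charnes-Cooper-Rhodes (CCR) efficiency score is obtained from a small linear program solved once per decision-making unit. The sketch below mirrors the paper's variable definitions (inputs: exercise and inverse intake; output: inverse BMI) on an invented three-person cohort; it is a generic CCR envelopment model, not the authors' code.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (X: inputs, Y: outputs,
    one row per DMU). Decision variables are [theta, lambda_1..lambda_n]."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                       # minimize theta
    A_ub = np.zeros((m + s, 1 + n))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[o]              # sum_j lambda_j x_j <= theta * x_o
    A_ub[:m, 1:] = X.T
    A_ub[m:, 1:] = -Y.T              # sum_j lambda_j y_j >= y_o
    b_ub[m:] = -Y[o]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.x[0]

# Invented three-person cohort mirroring the paper's variable definitions:
# inputs = (exercise kcal/day, 1/intake kcal/day), output = 1/BMI.
X = np.array([[300.0, 1 / 2200], [150.0, 1 / 2600], [500.0, 1 / 2000]])
Y = np.array([[1 / 22.0], [1 / 29.5], [1 / 24.0]])
for i in range(len(X)):
    print(f"subject {i}: efficiency score {ccr_input_efficiency(X, Y, i):.3f}")
```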

  9. Challenges in paleoflood hydrology applied to risk analysis in mountainous watersheds - A review

    NASA Astrophysics Data System (ADS)

    Bodoque, J. M.; Díez-Herrero, A.; Eguibar, M. A.; Benito, G.; Ruiz-Villanueva, V.; Ballesteros-Cánovas, J. A.

    2015-10-01

    In many regions of the world flood events in mountain basins are one of the greatest risks to the local population, due to the pressure placed on land use by social and economic development. Conventional hydrologic-hydraulic methodological approaches are not usually feasible in mountainous basins because they are not gauged at all or, in the best-case scenario, are poorly gauged. In this context, palaeohydrological research offers a valuable alternative to the above approaches. However, many palaeohydrological data sources and associated methods have been proposed and initially used in large basins with extensive floodplains. As a result, when they are used in mountainous areas they must be adapted to include different techniques, since the problems to be addressed are different and less data is usually available. In this paper, we review classic data sources and different analytical methods and discuss their advantages and shortcomings with particular attention to mountain basins. For this purpose, examples are provided where improvements in the palaeohydrologic methods are proposed by incorporating uncertainties, describing sources of error or putting forward hypotheses for hydraulic calculation to make palaeoflood hydrology more objective and useful in risk assessment.

  10. Applying Multiple Criteria Decision Analysis to Comparative Benefit-Risk Assessment: Choosing among Statins in Primary Prevention.

    PubMed

    Tervonen, Tommi; Naci, Huseyin; van Valkenhoef, Gert; Ades, Anthony E; Angelis, Aris; Hillege, Hans L; Postmus, Douwe

    2015-10-01

    Decision makers in different health care settings need to weigh the benefits and harms of alternative treatment strategies. Such health care decisions include marketing authorization by regulatory agencies, practice guideline formulation by clinical groups, and treatment selection by prescribers and patients in clinical practice. Multiple criteria decision analysis (MCDA) is a family of formal methods that help make explicit the tradeoffs that decision makers accept between the benefit and risk outcomes of different treatment options. Despite the recent interest in MCDA, certain methodological aspects are poorly understood. This paper presents 7 guidelines for applying MCDA in benefit-risk assessment and illustrates their use in the selection of a statin drug for the primary prevention of cardiovascular disease. We provide guidance on the key methodological issues of how to define the decision problem, how to select a set of nonoverlapping evaluation criteria, how to synthesize and summarize the evidence, how to translate relative measures to absolute ones that permit comparisons between the criteria, how to define suitable scale ranges, how to elicit partial preference information from the decision makers, and how to incorporate uncertainty in the analysis. Our example on statins indicates that fluvastatin is likely to be the most preferred drug by our decision maker and that this result is insensitive to the amount of preference information incorporated in the analysis. PMID:25986470
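
    The core of such an MCDA is often a linear additive value model: rescale each criterion to a 0-1 partial value, then weight and sum. The sketch below applies that generic pattern to an invented three-statin benefit-risk table with illustrative swing weights; it does not reproduce the paper's data or full method.

```python
import numpy as np

# Invented benefit-risk table for three statins: one benefit criterion
# (absolute CVD risk reduction, %) and two risk criteria (myopathy and
# new-onset diabetes, events per 1000 patient-years).
alternatives = ["fluvastatin", "atorvastatin", "rosuvastatin"]
raw = np.array([[1.8, 0.2, 1.0],
                [2.6, 0.5, 2.0],
                [2.9, 0.6, 2.4]])
directions = np.array([+1, -1, -1])    # +1: more is better, -1: less is better
weights = np.array([0.2, 0.4, 0.4])    # illustrative risk-averse swing weights

# Linear partial value functions over each criterion's observed range.
lo, hi = raw.min(axis=0), raw.max(axis=0)
values = (raw - lo) / (hi - lo)
values[:, directions < 0] = 1 - values[:, directions < 0]

scores = values @ weights
for name, score in sorted(zip(alternatives, scores), key=lambda t: -t[1]):
    print(f"{name:14s} overall value {score:.3f}")
```

    With this safety-heavy weighting the toy model happens to rank fluvastatin first, as the paper's decision maker did; a benefit-heavy weighting flips the ranking, which is exactly the sensitivity the guidelines tell analysts to probe.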

  11. Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  12. Municipal solid waste management health risk assessment from air emissions for China by applying life cycle analysis.

    PubMed

    Li, Hua; Nitivattananon, Vilas; Li, Peng

    2015-05-01

    This study aims to quantify and objectively evaluate the extent of environmental health risks from three waste treatment options suggested by the national municipal solid waste management enhancing strategy (No [2011] 9 of the State Council, promulgated on 19 April 2011), namely sanitary landfill, waste-to-energy incineration and composting, together with a material recovery facility, through a case study in Zhangqiu City, China. It addresses potential chronic health risks from air emissions to residential receptors in the impacted area. It combines field survey, analogue survey, design documents and life cycle inventory methods in defining the source strength of the chemicals of potential concern. Life cycle inventory and air dispersion modelling are performed with integrated waste management (IWM)-2 and the Screening Air Dispersion Model, Version 3.0 (SCREEN3). The health risk assessment is in accordance with the United States Environmental Protection Agency guidance Risk Assessment Guidance for Superfund (RAGS), Volume I: Human Health Evaluation Manual (Part F, Supplemental Guidance for Inhalation Risk Assessment). The exposure concentration is based on long-term exposure to the maximum ground-level contaminant concentration in air under 'reasonable worst situation' emissions, and is compared directly with the reference concentration and the unit risk factor/cancer slope factor derived from the national air quality standard (for a conventional pollutant) and toxicological studies (for a specific pollutant). Results from this study suggest that the option of composting with a material recovery facility may pose fewer negative health impacts than the other options; the sensitivity analysis shows that the integrated waste management collection rate for landfill has a great influence on the impact results. Further investigation is needed to validate or challenge the findings of this study. PMID:25908094
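
    The RAGS Part F calculation pattern the study follows is compact enough to sketch: derive an exposure concentration (EC) from the modelled air concentration and exposure schedule, then multiply by a unit risk factor for cancer risk or divide by a reference concentration for a hazard quotient. All numbers below are placeholders, not results from the study.

```python
# Sketch of the RAGS Part F inhalation pattern the study follows.
# All numbers are placeholders, not results from the study.

def exposure_concentration(ca, et=24.0, ef=350.0, ed=30.0):
    """EC = CA*ET*EF*ED/AT (ug/m^3). For brevity AT averages over the ED
    years; real cancer assessments average over a 70-year lifetime."""
    at_hours = ed * 365.0 * 24.0
    return ca * et * ef * ed / at_hours

ca  = 0.5     # modeled maximum ground-level air concentration, ug/m^3
iur = 6.0e-6  # illustrative unit risk factor, (ug/m^3)^-1
rfc = 2.0     # illustrative reference concentration, ug/m^3

ec = exposure_concentration(ca)
print(f"EC = {ec:.3f} ug/m^3")
print(f"cancer risk     = {iur * ec:.2e}")   # screen against 1e-6 to 1e-4
print(f"hazard quotient = {ec / rfc:.3f}")   # screen against 1.0
```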

  13. Applied Behavior Analysis in Education.

    ERIC Educational Resources Information Center

    Cooper, John O.

    1982-01-01

    Applied behavior analysis in education is expanding rapidly. This article describes the dimensions of applied behavior analysis and the contributions this technology offers teachers in the areas of systematic application, direct and daily measurement, and experimental methodology. (CJ)

  14. An engineering geology approach applied to "objective risk analysis" and the quantification of the "probability of success" of petroleum "leads" and "prospects"

    SciTech Connect

    Font, R.G.

    1994-12-01

    In engineering geology, people often labor to standardize and quantify the techniques and methods utilized in the solution of critical problems. Applications related to the field of petroleum technology are certainly no exception. For example, risk analysis of petroleum exploratory projects and prospects is often arbitrary and biased. The calculation of prospect risk introduced in this article is designed to remove "subjectivity" by establishing quantitative "standards." The technique allows the same method of analysis to be applied to prospects worldwide. It may be used as introduced in this paper (as employed by the author) or as a model, since it possesses the flexibility to be modified by individual users as long as standards are internally defined and utilized by all members of an exploration team. Risk analysis has been discussed in much detail. However, an original method for establishing quantitative standards for risk assessment is addressed and introduced in this paper. As defined here, prospect risk (described as the "probability of success" or "Ps") is the product of the following: Ps = (Trap) × (Reservoir) × (Source) × (Recovery) × (Timing). Well-defined, quantitative standards must be established if one is to remove subjectivity from risk assessment. In order to establish such standards to be used uniformly by explorationists, each of the above-referenced factors is individually evaluated and numerically defined utilizing the category and ranking system outlined in Table 1. If all five independent parameters had individual values of 1.00, then the prospective venture would have an overall probability of success of 100 percent. Similarly, if all parameters exhibited values of 0.90, the overall chance of success would equal 59 percent. Values of 0.80 equate to 33 percent, values of 0.70 to 17 percent, values of 0.60 to 8 percent, values of 0.50 to 3 percent, and values of 0.30 to 0.2 percent.
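
    Because the article's Ps model is a straight product of five factor scores, a few lines of Python suffice to evaluate a prospect and to reproduce the benchmark percentages for uniform scores quoted above (the individual factor scores below are illustrative):

```python
from math import prod

# The article's model: Ps = Trap x Reservoir x Source x Recovery x Timing.
# Factor scores below are illustrative.
factors = {"trap": 0.9, "reservoir": 0.8, "source": 0.9,
           "recovery": 0.7, "timing": 0.8}
print(f"Ps = {prod(factors.values()):.3f}")

# Uniform scores reproduce the article's benchmark percentages.
for v in (1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.3):
    print(f"all factors at {v:.2f} -> Ps = {v ** 5:.1%}")
```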

  15. Exploring Students at Risk for Reading Comprehension Difficulties in South Korea: The RTI Approach Applying Latent Class Growth Analysis

    ERIC Educational Resources Information Center

    Kim, Dongil; Kim, Woori; Koh, Hyejung; Lee, Jaeho; Shin, Jaehyun; Kim, Heeju

    2014-01-01

    The purpose of this study was to identify students at risk of reading comprehension difficulties by using the responsiveness to intervention (RTI) approach. The participants were 177 students in Grades 1-3 in three elementary schools in South Korea. The students received Tier 1 instruction of RTI from March to May 2011, and their performance was…

  16. Risk Analysis

    NASA Technical Reports Server (NTRS)

    Morring, Frank, Jr.

    2004-01-01

    A National Academies panel says the Hubble Space Telescope is too valuable for gambling on a long-shot robotic mission to extend its service life, and urges NASA to mount a human servicing mission instead. Directly contradicting Administrator Sean O'Keefe, who killed a planned fifth shuttle servicing mission to the telescope on grounds it was too dangerous for a human crew in the post-Columbia environment, the expert committee found that upgrades to shuttle safety actually should make it less hazardous to fly to the telescope than it was before Columbia was lost. Risks of a telescope-servicing mission are only marginally greater than those of the planned missions to the International Space Station (ISS) that O'Keefe has authorized, the panel found. After comparing those risks to the dangers inherent in trying to develop a complex space robot in the 39 months remaining in the Hubble's estimated service life, the panel opted for the human mission to save one of the major achievements of the American space program, in the words of Louis J. Lanzerotti, its chairman.

  17. The Relative Importance of the Vadose Zone in Multimedia Risk Assessment Modeling Applied at a National Scale: An Analysis of Benzene Using 3MRA

    NASA Astrophysics Data System (ADS)

    Babendreier, J. E.

    2002-05-01

    Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidable one, particularly in regulatory settings applied on a national scale. Quantitative assessment of uncertainty and sensitivity within integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a systematic, comparative approach coupled with sufficient computational power. The Multimedia, Multipathway, and Multireceptor Risk Assessment Model (3MRA) is an important code being developed by the United States Environmental Protection Agency for use in site-scale risk assessment (e.g. hazardous waste management facilities). The model currently entails over 700 variables, 185 of which are explicitly stochastic. The 3MRA starts with a chemical concentration in a waste management unit (WMU). It estimates the release and transport of the chemical throughout the environment, and predicts associated exposure and risk. The 3MRA simulates multimedia (air, water, soil, sediment) pollutant fate and transport, multipathway exposure routes (food ingestion, water ingestion, soil ingestion, air inhalation, etc.), multireceptor exposures (resident, gardener, farmer, fisher, ecological habitats and populations), and resulting risk (human cancer and non-cancer effects, ecological population and community effects). The 3MRA collates the output for an overall national risk assessment, offering a probabilistic strategy as a basis for regulatory decisions. To facilitate model execution of 3MRA for purposes of conducting uncertainty and sensitivity analysis, a PC-based supercomputer cluster was constructed. The design of SuperMUSE, a 125 GHz Windows-based supercomputer for Model Uncertainty and Sensitivity Evaluation, is described.
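
    At the scale of 3MRA, a first screening of the 185 stochastic inputs is often done with sampling-based rank correlations between each input and the output. The sketch below shows that generic pattern on a four-input toy model; the input names, distributions and response function are invented, not 3MRA's.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 5_000

# Stand-in for a handful of 3MRA's stochastic inputs (all hypothetical).
inputs = {
    "source_conc": rng.lognormal(0.0, 1.0, n),
    "hydraulic_k": rng.lognormal(-2.0, 0.8, n),
    "ingestion":   rng.normal(1.0, 0.2, n),
    "decay_rate":  rng.uniform(0.01, 0.1, n),
}
# Toy risk response, NOT the 3MRA model.
risk = (inputs["source_conc"] * inputs["ingestion"]
        / (1.0 + 50.0 * inputs["decay_rate"]) * inputs["hydraulic_k"] ** 0.3)

# Rank-correlation screening: which inputs drive output variability?
for name, x in inputs.items():
    rho, _ = spearmanr(x, risk)
    print(f"{name:12s} Spearman rho = {rho:+.2f}")
```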

  18. FOOD RISK ANALYSIS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  19. Simulation Assisted Risk Assessment Applied to Launch Vehicle Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Go, Susie; Gee, Ken; Lawrence, Scott

    2008-01-01

    A simulation-based risk assessment approach is presented and is applied to the analysis of abort during the ascent phase of a space exploration mission. The approach utilizes groupings of launch vehicle failures, referred to as failure bins, which are mapped to corresponding failure environments. Physical models are used to characterize the failure environments in terms of the risk due to blast overpressure, resulting debris field, and the thermal radiation due to a fireball. The resulting risk to the crew is dynamically modeled by combining the likelihood of each failure, the severity of the failure environments as a function of initiator and time of the failure, the robustness of the crew module, and the warning time available due to early detection. The approach is shown to support the launch vehicle design process by characterizing the risk drivers and identifying regions where failure detection would significantly reduce the risk to the crew.

  20. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  1. Arctic Risk Management (ARMNet) Network: Linking Risk Management Practitioners and Researchers Across the Arctic Regions of Canada and Alaska To Improve Risk, Emergency and Disaster Preparedness and Mitigation Through Comparative Analysis and Applied Research

    NASA Astrophysics Data System (ADS)

    Garland, A.

    2015-12-01

    The Arctic Risk Management Network (ARMNet) was conceived as a trans-disciplinary hub to encourage and facilitate greater cooperation, communication and exchange among American and Canadian academics and practitioners actively engaged in the research, management and mitigation of risks, emergencies and disasters in the Arctic regions. Its aim is to assist regional decision-makers through the sharing of applied research and best practices and to support greater inter-operability and bilateral collaboration through improved networking, joint exercises, workshops, teleconferences, radio programs, and virtual communications (e.g., webinars). Most importantly, ARMNet is a clearinghouse for all information related to the management of the frequent hazards of Arctic climate and geography in North America, including new and emerging challenges arising from climate change, increased maritime polar traffic and expanding economic development in the region. ARMNet is an outcome of the Arctic Observing Network (AON) for Long Term Observations, Governance, and Management Discussions, www.arcus.org/search-program. The AON goals continue with CRIOS (www.ariesnonprofit.com/ARIESprojects.php) and coastal erosion research (www.ariesnonprofit.com/webinarCoastalErosion.php) led by the North Slope Borough Risk Management Office with assistance from ARIES (Applied Research in Environmental Sciences Nonprofit, Inc.). The constituency for ARMNet will include all northern academics and researchers, Arctic-based corporations, First Responders (FRs), Emergency Management Offices (EMOs) and Risk Management Offices (RMOs), military, Coast Guard, northern police forces, Search and Rescue (SAR) associations, boroughs, territories and communities throughout the Arctic. This presentation will be of interest to all those engaged in Arctic affairs; it describes the genesis of ARMNet and presents the results of stakeholder meetings and webinars designed to guide the next stages of the project.

  2. Budget Risk & Prioritization Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2010-12-31

    BRPAtool performs the following:
    • Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate to meet constrained budgets, based on multiple risk factors
    • Enables analysis of different budget scenarios
    • Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks
    • Real-time analysis
    • Enables managers to determine the multipliers and where funding is best applied
    • Promotes solid budget defense

  3. Quantitative risk analysis applied to innocuity and potency tests on the oil-adjuvanted vaccine against foot and mouth disease in Argentina.

    PubMed

    Cané, B G; Rodríguez Toledo, J; Falczuk, A; Leanes, L F; Manetti, J C; Maradei, E; Verde, P

    1995-12-01

    The authors describe the method used in Argentina for quantification of risk in controls of the potency and innocuity of foot and mouth disease vaccine. Quantitative risk analysis is a relatively new tool in the animal health field, and is in line with the principles of transparency and equivalency of the Sanitary and Phytosanitary Agreement of the Uruguay Round of the General Agreement on Tariffs and Trade (GATT: now World Trade Organisation [WTO]). The risk assessment is presented through a description of the steps involved in manufacturing the vaccine, and the controls performed by the manufacturer and by the National Health Animal Service (Servicio Nacional de Sanidad Animal: SENASA). The adverse situation is considered as the lack of potency or innocuity of the vaccine, and the risk is estimated using a combination of the Monte Carlo simulation and the application of a Bayesian model. PMID:8639949
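
    The combination the authors describe, a Bayesian model evaluated by Monte Carlo simulation, can be sketched generically: put a prior on the lot's true protection rate, update it with potency test counts, and simulate the posterior probability of the adverse situation. Every number below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy Bayesian model evaluated by Monte Carlo, in the spirit of the paper.
# Numbers are invented: a potency test protects n_protected of n_tested cattle.
n_tested, n_protected = 16, 14
a0, b0 = 1.0, 1.0   # uniform Beta prior on the lot's true protection rate

# Posterior is Beta(a0 + successes, b0 + failures); sample it directly.
post = rng.beta(a0 + n_protected, b0 + (n_tested - n_protected), size=100_000)

# "Adverse situation": true protection below a hypothetical regulatory floor.
floor = 0.75
print(f"P(protection < {floor}) = {np.mean(post < floor):.4f}")
```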

  4. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and to assess and describe the environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  5. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  6. Risk/Stress Analysis.

    ERIC Educational Resources Information Center

    Schwerdtfeger, Don; Howell, Richard E.

    1986-01-01

    Identifies stress as a definite health hazard and risk factor involved in a variety of health situations. Proposes that stress identification efforts be considered in environmental analysis so that a more complete approach to risk assessment and management and health hazard prevention can occur. (ML)

  7. Is risk analysis scientific?

    PubMed

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science serves as a base for decision making on risks. The model covers five elements (evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision) and relates these elements to the domains of experts and decision makers, and to the fact-based and value-based domains. We conclude that risk analysis is a scientific field of study when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). PMID:24919396

  8. Evaluation of Cardiovascular Risk Scores Applied to NASA's Astronaut Corps

    NASA Technical Reports Server (NTRS)

    Jain, I.; Charvat, J. M.; VanBaalen, M.; Lee, L.; Wear, M. L.

    2014-01-01

    In an effort to improve cardiovascular disease (CVD) risk prediction, this analysis evaluates and compares the applicability of multiple CVD risk scores to the NASA Astronaut Corps which is extremely healthy at selection.

  9. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.

  10. Applying a weed risk assessment approach to GM crops.

    PubMed

    Keese, Paul K; Robold, Andrea V; Myers, Ruth C; Weisman, Sarah; Smith, Joe

    2014-12-01

    Current approaches to environmental risk assessment of genetically modified (GM) plants are modelled on chemical risk assessment methods, which have a strong focus on toxicity. There are additional types of harms posed by plants that have been extensively studied by weed scientists and incorporated into weed risk assessment methods. Weed risk assessment uses robust, validated methods that are widely applied to regulatory decision-making about potentially problematic plants. They are designed to encompass a broad variety of plant forms and traits in different environments, and can provide reliable conclusions even with limited data. The knowledge and experience that underpin weed risk assessment can be harnessed for environmental risk assessment of GM plants. A case study illustrates the application of the Australian post-border weed risk assessment approach to a representative GM plant. This approach is a valuable tool to identify potential risks from GM plants. PMID:24046097
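
    Post-border weed risk assessment questionnaires typically sum item scores and map the total to an accept/evaluate/reject outcome. The skeleton below shows that generic structure only; the questions, scores and cut-offs are illustrative, not the Australian WRA's actual ones.

```python
# Skeleton of a summed-score weed risk assessment. Questions, scores and
# cut-offs are illustrative, not the Australian WRA's actual ones.

answers = {
    "naturalized_beyond_native_range": 2,
    "unintentional_human_dispersal":   1,
    "toxic_to_animals":                0,
    "broad_climate_suitability":       1,
    "requires_specialist_pollinators": -1,  # negative answers reduce the score
}

total = sum(answers.values())
if total > 6:
    outcome = "reject"
elif total < 1:
    outcome = "accept"
else:
    outcome = "evaluate further"
print(f"WRA-style score {total}: {outcome}")
```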

  11. Targeted assets risk analysis.

    PubMed

    Bouwsema, Barry

    2013-01-01

    Risk assessments utilising the consolidated risk assessment process as described by Public Safety Canada and the Centre for Security Science utilise the five threat categories of natural, human accidental, technological, human intentional and chemical, biological, radiological, nuclear or explosive (CBRNE). The categories of human intentional and CBRNE indicate intended actions against specific targets. It is therefore necessary to be able to identify which pieces of critical infrastructure represent the likely targets of individuals with malicious intent. Using the consolidated risk assessment process and the target capabilities list, coupled with the CARVER methodology and a security vulnerability analysis, it is possible to identify these targeted assets and their weaknesses. This process can help emergency managers to identify where resources should be allocated and funding spent. Targeted Assets Risk Analysis (TARA) presents a new opportunity to improve how risk is measured, monitored, managed and minimised through the four phases of emergency management, namely, prevention, preparation, response and recovery. To reduce risk throughout Canada, Defence Research and Development Canada is interested in researching the potential benefits of a comprehensive approach to risk assessment and management. The TARA provides a framework against which potential human intentional threats can be measured and quantified, thereby improving safety for all Canadians. PMID:23615063

  12. DWPF risk analysis summary

    SciTech Connect

    Shedrow, C.B.

    1990-10-01

    This document contains selected risk analysis data from Chapter 9 (Safety Analysis) of the Defense Waste Processing Facility Safety Analysis Report (DWPF SAR) and draft Addendum 1 to the Waste Tank Farms SAR. Although these data may be revised prior to finalization of the draft SAR and the draft addendum, they are presently the best available information and were therefore used in preparing the risk analysis portion of the DWPF Environmental Analysis (DWPF EA). This information has been extracted from those draft documents and approved under separate cover so that it can be used as reference material for the DWPF EA when it is placed in the public reading rooms. 9 refs., 4 tabs.

  13. How to ensure that the results of climate risk analysis make a difference? - Experience from applied research addressing the challenges of climate change

    NASA Astrophysics Data System (ADS)

    Schneiderbauer, Stefan; Zebisch, Marc; Becker, Daniel; Pedoth, Lydia; Renner, Kathrin; Kienberger, Stefan

    2016-04-01

    Changing climate conditions may have beneficial or adverse effects on the social-ecological systems we live in. In any case, the possible effects result from complex and interlinked physical and social processes embedded in these systems. Traditional research addresses these bio-physical and societal issues separately. Hence, studies on risks related to climate change are in general still mono-disciplinary in nature, with an increasing amount of work following a multi-disciplinary approach. The quality and usefulness of the results of such research for policy or decision making in practice may further be limited by study designs that do not appropriately acknowledge the significance of integrating, or at least mixing, qualitative and quantitative information and knowledge. Finally, the acceptance of study results - particularly when they contain some kind of assessment - is often endangered by insufficient and/or late involvement of stakeholders and users. The above-mentioned limitations have often been brought up in the recent past. However, although a certain consensus has been reached in recent years recognising the need to tackle these issues, little progress has been made in terms of implementation within the context of (research) studies. This paper elaborates in detail on the reasons that hamper the application of interdisciplinary (i.e. natural and social science), trans-disciplinary (i.e. co-production of knowledge) and integrative (i.e. combining qualitative and quantitative approaches) work. It is based on experience gained through a number of applied climate change vulnerability studies carried out within the context of various GIZ-financed development cooperation projects, a consultancy project for the German Environment Agency, as well as the workshop series INQUIMUS, which tackles particularly the issues of mixing qualitative and quantitative research approaches. Potentials and constraints of possible attempts for

  14. The Andrews’ Principles of Risk, Need, and Responsivity as Applied in Drug Abuse Treatment Programs: Meta-Analysis of Crime and Drug Use Outcomes

    PubMed Central

    Prendergast, Michael L.; Pearson, Frank S.; Podus, Deborah; Hamilton, Zachary K.; Greenwell, Lisa

    2013-01-01

    Objectives The purpose of the present meta-analysis was to answer the question: Can the Andrews principles of risk, needs, and responsivity, originally developed for programs that treat offenders, be extended to programs that treat drug abusers? Methods Drawing from a dataset that included 243 independent comparisons, we conducted random-effects meta-regression and ANOVA-analog meta-analyses to test the Andrews principles by averaging crime and drug use outcomes over a diverse set of programs for drug abuse problems. Results For crime outcomes, in the meta-regressions the point estimates for each of the principles were substantial, consistent with previous studies of the Andrews principles. There was also a substantial point estimate for programs exhibiting a greater number of the principles. However, almost all of the 95% confidence intervals included the zero point. For drug use outcomes, in the meta-regressions the point estimates for each of the principles were approximately zero; however, the point estimate for programs exhibiting a greater number of the principles was somewhat positive. All of the estimates for the drug use principles had confidence intervals that included the zero point. Conclusions This study supports previous findings from primary research studies targeting the Andrews principles that those principles are effective in reducing crime outcomes, here in meta-analytic research focused on drug treatment programs. By contrast, programs that follow the principles appear to have very little effect on drug use outcomes. Primary research studies that experimentally test the Andrews principles in drug treatment programs are recommended. PMID:24058325

  15. The basic importance of applied behavior analysis

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1986-01-01

    We argue that applied behavior analysis is relevant to basic research. Modification studies, and a broad range of investigations that focus on the precipitating and maintaining conditions of socially significant human behavior, have basic importance. Applied behavior analysis may aid basic researchers in the design of externally valid experiments and thereby enhance the theoretical significance of basic research for understanding human behavior. Applied research with humans, directed at culturally-important problems, will help to propagate the science of human behavior. Such a science will also be furthered by analogue experiments that model socially important behavior. Analytical-applied studies and analogue experiments are forms of applied behavior analysis that could suggest new environment-behavior relationships. These relationships could lead to basic research and principles that further the prediction, control, and understanding of behavior. PMID:22478650

  16. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  17. Competing Uses of Underground Systems Related to Energy Supply: Applying Single- and Multiphase Simulations for Site Characterization and Risk-Analysis

    NASA Astrophysics Data System (ADS)

    Kissinger, A.; Walter, L.; Darcis, M.; Flemisch, B.; Class, H.

    2012-04-01

    Global climate change, shortage of resources and the resulting turn towards renewable sources of energy lead to a growing demand for the utilization of subsurface systems. Among these competing uses are Carbon Capture and Storage (CCS), geothermal energy, nuclear waste disposal, "renewable" methane or hydrogen storage as well as the ongoing production of fossil resources like oil, gas, and coal. Besides competing among themselves, these technologies may also create conflicts with essential public interests like water supply. For example, the injection of CO2 into the underground causes an increase in pressure reaching far beyond the actual radius of influence of the CO2 plume, potentially leading to large amounts of displaced salt water. Finding suitable sites is a demanding task for several reasons. Natural systems as opposed to technical systems are always characterized by heterogeneity. Therefore, parameter uncertainty impedes reliable predictions towards capacity and safety of a site. State of the art numerical simulations combined with stochastic approaches need to be used to obtain a more reliable assessment of the involved risks and the radii of influence of the different processes. These simulations may include the modeling of single- and multiphase non-isothermal flow, geo-chemical and geo-mechanical processes in order to describe all relevant physical processes adequately. Stochastic approaches have the aim to estimate a bandwidth of the key output parameters based on uncertain input parameters. Risks for these different underground uses can then be made comparable with each other. Along with the importance and the urgency of the competing processes this may lead to a more profound basis for a decision. Communicating risks to stakeholders and a concerned public is crucial for the success of finding a suitable site for CCS (or other subsurface utilization). We present and discuss first steps towards an approach for addressing the issue of competitive

  18. Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.

    PubMed

    Reyes Santos, Joost; Haimes, Yacov Y

    2004-06-01

    The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz a 1990 Nobel Prize in Economics. A typical approach to measuring a portfolio's expected return is based on the historical returns of the assets included in the portfolio. Portfolio risk, on the other hand, is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility has been a major measure of risk owing to its simplicity and its validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that of October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attack of September 11, 2001, which led to a four-day suspension of trading on the New York Stock Exchange (NYSE), are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm. Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model.
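
    The tail statistic at the heart of the proposal is easy to compute from simulated returns: condition on the lower tail below a partitioning point and take the mean. A sketch, with a hypothetical fat-tailed return distribution and a partition point chosen by us rather than by the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# Hypothetical fat-tailed daily returns: a calm regime plus a 2% crash regime.
crash = rng.random(n) < 0.02
r = np.where(crash, rng.normal(-0.04, 0.03, n), rng.normal(0.0005, 0.01, n))

# PMRM-style lower-tail statistic: the conditional expectation of returns
# below a partitioning point (here the 5th percentile, our choice).
cutoff = np.quantile(r, 0.05)
f4 = r[r <= cutoff].mean()

print(f"volatility       = {r.std():.4f}")
print(f"5% partition pt  = {cutoff:.4f}")
print(f"f4 (tail mean)   = {f4:.4f}")
```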

  19. Quantitative Microbial Risk Assessment Tutorial: Pour Point Analysis of Land-applied Microbial Loadings and Comparison of Simulated and Gaging Station Results

    EPA Science Inventory

    This tutorial demonstrates a pour point analysis that:
    • Initiates execution of the SDMPB.
    • Navigates the SDMPB.
    • Chooses a pour point within a watershed, delineates the sub-area that contributes to that pour point, and collects data for it.
    • Considers land applicat...

  20. Total Risk Approach in Applying PRA to Criticality Safety

    SciTech Connect

    Huang, S T

    2005-03-24

    As the nuclear industry continues marching from expert-based support to more procedure-based support, it is important to revisit the total risk concept in criticality safety. A key objective of criticality safety is to minimize total criticality accident risk. The purpose of this paper is to assess the key constituents of the total risk concept pertaining to criticality safety from an operations support perspective and to suggest a risk-informed means of utilizing criticality safety resources to minimize total risk. A PRA methodology was used to assist this assessment. The criticality accident history was assessed to provide a framework for our evaluation. In supporting operations, the work of criticality safety engineers ranges from knowing the scope and configurations of a proposed operation, performing criticality hazards assessment to derive effective controls, assisting in training operators, responding to floor questions, and conducting surveillance to ensure implementation of criticality controls, to responding to criticality mishaps. In a compliance environment, the resource of criticality safety engineers is increasingly directed towards tedious documentation effort to meet regulatory requirements, to the effect of weakening floor support for criticality safety. By applying a fault tree model to identify the major contributors to criticality accidents, a total risk picture is obtained to address the relative merits of various actions. Overall, human failure is the key culprit in causing criticality accidents. Factors such as failure to follow procedures, lack of training, and lack of expert support at the floor level are main contributors. Other causes may include lack of effective criticality controls, such as an inadequate criticality safety evaluation. Not all of the causes are equally important in contributing to criticality mishaps. Applying the limited resources to strengthen the weak links would reduce risk more than continuing emphasis on the strong links of
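
    The fault tree argument can be sketched with a toy model: human-failure basic events feed OR gates, the top event requires an unsafe act and a missed control layer, and a Birnbaum-style importance ranking identifies the dominant contributors. All probabilities below are invented, not taken from the paper.

```python
# Toy fault tree echoing the paper's observation that human failures dominate
# criticality accidents. Event probabilities per operation are invented.

P = {
    "procedure_not_followed": 1e-3,
    "inadequate_training":    5e-4,
    "no_floor_expert":        2e-3,
    "bad_cs_evaluation":      1e-4,
}

def p_or(*ps):
    """OR gate for independent basic events."""
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

def top_event(q):
    """Mishap requires an unsafe act AND a missed control layer."""
    unsafe = p_or(q["procedure_not_followed"], q["inadequate_training"])
    missed = p_or(q["no_floor_expert"], q["bad_cs_evaluation"])
    return unsafe * missed

print(f"P(mishap per operation) ~ {top_event(P):.2e}")

# Birnbaum importance: how much the top event moves as each basic event
# goes from impossible to certain.
for name in P:
    bi = top_event({**P, name: 1.0}) - top_event({**P, name: 0.0})
    print(f"{name:24s} importance {bi:.2e}")
```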

  1. Defining applied behavior analysis: An historical analogy

    PubMed Central

    Deitz, Samuel M.

    1982-01-01

    This article examines two criteria for a definition of applied behavior analysis. The criteria are derived from a 19th century attempt to establish medicine as a scientific field. The first criterion, experimental determinism, specifies the methodological boundaries of an experimental science. The second criterion, philosophic doubt, clarifies the tentative nature of facts and theories derived from those facts. Practices which will advance the science of behavior are commented upon within each criterion. To conclude, the problems of a 19th century form of empiricism in medicine are related to current practices in applied behavior analysis. PMID:22478557

  2. Risk Analysis Virtual ENvironment

    Energy Science and Technology Software Center (ESTSC)

    2014-02-10

    RAVEN has three major functionalities: 1. It provides a graphical user interface for the pre- and post-processing of RELAP-7 input and output. 2. It provides the capability to model nuclear power plant control logic for the RELAP-7 code and dynamic control of the accident scenario evolution. This capability is based on a software structure that realizes a direct connection between the RELAP-7 solver engine (MOOSE) and a Python environment where the variables describing the plant status are accessible in a scripting environment. RAVEN supports the generation of probabilistic scenario control by supplying a wide range of probability and cumulative distribution functions and their inverse functions. 3. It provides a general environment to perform probabilistic risk analysis for RELAP-7, RELAP-5 and any generic MOOSE-based application. The probabilistic analysis is performed by sampling the input space of the coupled code parameters and is enhanced by modern artificial intelligence algorithms that accelerate the identification of the areas of major risk (in the input parameter space). The environment also provides a graphical visualization capability to analyze the outcomes. Among other approaches, the classical Monte Carlo and Latin hypercube sampling algorithms are available. To accelerate the convergence of the sampling methodologies, support vector machines, Bayesian regression, and stochastic collocation polynomial chaos are implemented. The same methodologies could be used to solve optimization and uncertainty-propagation problems within the RAVEN framework.

  3. Risk Analysis Virtual ENvironment

    SciTech Connect

    2014-02-10

    RAVEN has three major functionalities: 1. It provides a graphical user interface for the pre- and post-processing of RELAP-7 input and output. 2. It provides the capability to model nuclear power plant control logic for the RELAP-7 code and dynamic control of the accident scenario evolution. This capability is based on a software structure that realizes a direct connection between the RELAP-7 solver engine (MOOSE) and a Python environment where the variables describing the plant status are accessible in a scripting environment. RAVEN supports the generation of probabilistic scenario control by supplying a wide range of probability and cumulative distribution functions and their inverse functions. 3. It provides a general environment to perform probabilistic risk analysis for RELAP-7, RELAP-5 and any generic MOOSE-based application. The probabilistic analysis is performed by sampling the input space of the coupled code parameters and is enhanced by modern artificial intelligence algorithms that accelerate the identification of the areas of major risk (in the input parameter space). The environment also provides a graphical visualization capability to analyze the outcomes. Among other approaches, the classical Monte Carlo and Latin hypercube sampling algorithms are available. To accelerate the convergence of the sampling methodologies, support vector machines, Bayesian regression, and stochastic collocation polynomial chaos are implemented. The same methodologies could be used to solve optimization and uncertainty-propagation problems within the RAVEN framework.
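
    The sampling strategies named here are standard and easy to demonstrate. The sketch below draws a Latin hypercube design over a three-parameter input space and estimates the fraction of the space falling in a failure region; the parameter names, ranges and limit surface are invented, and a real study would call the coupled code instead of the toy function.

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube design over a three-parameter input space, the kind of
# sampling RAVEN drives a coupled code with. Names and ranges are invented.
sampler = qmc.LatinHypercube(d=3, seed=11)
unit = sampler.random(n=200)
lo = [280.0, 0.5, 1e-4]   # e.g. coolant temperature (K), valve area, leak rate
hi = [320.0, 1.0, 1e-3]
samples = qmc.scale(unit, lo, hi)

def failed(x):
    """Stand-in for the simulation code: a hypothetical limit surface."""
    return x[0] > 310.0 and x[2] > 5e-4

frac = np.mean([failed(s) for s in samples])
print(f"estimated failure-region fraction: {frac:.3f}")
```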

  4. Applied mathematics analysis of the multibody systems

    NASA Astrophysics Data System (ADS)

    Sahin, H.; Kar, A. K.; Tacgin, E.

    2012-08-01

    A methodology is developed for the analysis of multibody systems and is applied to a vehicle as a case study. A previous study emphasized the derivation of the multibody dynamics equations of motion for a bogie [2]. In this work, we have developed a guide-way for analyzing the dynamical behavior of multibody systems, mainly for the validation and verification of a realistic mathematical model and partly for the design of alternative optimum vehicle parameters.

  5. Applied behavior analysis and statistical process control?

    PubMed Central

    Hopkins, B L

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156

  6. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  7. Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  8. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    ERIC Educational Resources Information Center

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  9. Caldwell University's Department of Applied Behavior Analysis.

    PubMed

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis. PMID:27606194

  10. Security risk assessment: applying the concepts of fuzzy logic.

    PubMed

    Bajpai, Shailendra; Sachdeva, Anish; Gupta, J P

    2010-01-15

    Chemical process industries (CPI) handling hazardous chemicals in bulk can be attractive targets for deliberate adversarial actions by terrorists, criminals and disgruntled employees. It is therefore imperative to have a comprehensive security risk management programme, including effective security risk assessment techniques. In an earlier work, it has been shown that security risk assessment can be done by conducting threat and vulnerability analysis or by developing a Security Risk Factor Table (SRFT). HAZOP-type vulnerability assessment sheets can be developed that are scenario based. In the SRFT model, important security risk bearing factors such as location, ownership, visibility, inventory, etc., have been used. In this paper, the earlier developed SRFT model has been modified using the concepts of fuzzy logic. In the modified SRFT model, two linguistic fuzzy scales (three-point and four-point) are devised based on trapezoidal fuzzy numbers. The human subjectivity of the different experts associated with the previous SRFT model is tackled by mapping their scores to the newly devised fuzzy scale. Finally, the fuzzy score thus obtained is defuzzified to get the results. A test case of a refinery is used to explain the method and compared with the earlier work. PMID:19744788
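    As a rough illustration of the final step described above, the sketch below defuzzifies a trapezoidal fuzzy number by the centroid (center-of-area) method, computed numerically; the linguistic scale values are hypothetical, not the paper's.

```python
import numpy as np

def trapezoid_membership(x, a, b, c, d):
    """Membership function of a trapezoidal fuzzy number (a, b, c, d)."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

def defuzzify_centroid(a, b, c, d, n=2001):
    """Centroid (center-of-area) defuzzification, computed numerically."""
    x = np.linspace(a, d, n)
    mu = trapezoid_membership(x, a, b, c, d)
    return np.trapz(x * mu, x) / np.trapz(mu, x)

# Hypothetical three-point linguistic scale, e.g. low / medium / high risk
scale = {"low": (0, 0.5, 1.5, 2), "medium": (2, 2.5, 3.5, 4), "high": (4, 4.5, 5.5, 6)}
score = defuzzify_centroid(*scale["medium"])
print(f"Defuzzified 'medium' risk score: {score:.2f}")
```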

  11. Multidimensional Risk Analysis: MRISK

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond; Brown, Douglas; O'Shea, Sarah Beth; Reith, William; Rabulan, Jennifer; Melrose, Graeme

    2015-01-01

    Multidimensional Risk (MRISK) calculates the combined multidimensional score using the Mahalanobis distance. MRISK accounts for covariance between consequence dimensions, which de-conflicts the interdependencies of consequence dimensions, providing a clearer depiction of risks. Additionally, in the event the dimensions are not correlated, the Mahalanobis distance reduces to the Euclidean distance normalized by the variance and, therefore, represents the most flexible and optimal method to combine dimensions. MRISK is currently being used in NASA's Environmentally Responsible Aviation (ERA) project to assess risk and prioritize scarce resources.
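    A minimal sketch of the scoring idea, assuming notional consequence scores and covariance values (not NASA's data): the Mahalanobis distance combines the dimensions, and with a diagonal covariance it reduces to the variance-normalized Euclidean norm, as stated above.

```python
import numpy as np

def mrisk_score(scores, cov):
    """Combine consequence-dimension scores into one Mahalanobis distance.
    scores : 1-D array of per-dimension consequence scores (vs. a zero baseline)
    cov    : covariance matrix between the consequence dimensions
    """
    inv_cov = np.linalg.inv(cov)
    return float(np.sqrt(scores @ inv_cov @ scores))

# Hypothetical example: cost and schedule consequences that are correlated
scores = np.array([3.0, 4.0])
cov = np.array([[1.0, 0.6],
                [0.6, 1.0]])
print(f"Correlated dimensions:   {mrisk_score(scores, cov):.2f}")

# With no correlation, the score reduces to a variance-normalized Euclidean norm
cov_diag = np.diag(np.diag(cov))
print(f"Uncorrelated dimensions: {mrisk_score(scores, cov_diag):.2f}")
```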

  12. Tropospheric Delay Raytracing Applied in VLBI Analysis

    NASA Astrophysics Data System (ADS)

    MacMillan, D. S.; Eriksson, D.; Gipson, J. M.

    2013-12-01

    Tropospheric delay modeling error continues to be one of the largest sources of error in VLBI analysis. For standard operational solutions, we use the VMF1 elevation-dependent mapping functions derived from ECMWF data. These mapping functions assume that tropospheric delay at a site is azimuthally symmetric. As this assumption does not reflect reality, we have determined the raytrace delay along the signal path through the troposphere for each VLBI quasar observation. We determined the troposphere refractivity fields from the pressure, temperature, specific humidity and geopotential height fields of the NASA GSFC GEOS-5 numerical weather model. We discuss results from analysis of the CONT11 R&D and the weekly operational R1+R4 experiment sessions. When applied in VLBI analysis, baseline length repeatabilities were better for 66-72% of baselines with raytraced delays than with VMF1 mapping functions. Vertical repeatabilities were better for 65% of sites.

  13. Wavelet analysis applied to the IRAS cirrus

    NASA Technical Reports Server (NTRS)

    Langer, William D.; Wilson, Robert W.; Anderson, Charles H.

    1994-01-01

    The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.
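    The decomposition can be illustrated with a minimal Laplacian pyramid sketch; the random image stands in for an IRAS 100 micrometer map, and the filter and sampling choices are illustrative rather than those of the study.

```python
# Minimal Laplacian pyramid sketch (a non-orthogonal wavelet decomposition):
# each level holds the detail between successive Gaussian smoothings, so all
# spatial scales are treated in an equivalent manner.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def laplacian_pyramid(image, levels=4):
    pyramid, current = [], image.astype(float)
    for _ in range(levels):
        smoothed = gaussian_filter(current, sigma=1.0)
        pyramid.append(current - smoothed)      # detail at this scale
        current = zoom(smoothed, 0.5, order=1)  # downsample for the next level
    pyramid.append(current)                     # residual low-pass image
    return pyramid

image = np.random.default_rng(0).random((64, 64))  # stand-in for a 100 um map
for i, band in enumerate(laplacian_pyramid(image)):
    print(f"level {i}: shape {band.shape}, rms {band.std():.3f}")
```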

  14. Positive Behavior Support and Applied Behavior Analysis

    PubMed Central

    Johnston, J.M; Foxx, Richard M; Jacobson, John W; Green, Gina; Mulick, James A

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We also consider the features of PBS that have facilitated its broad dissemination and how ABA might benefit from emulating certain practices of the PBS movement. PMID:22478452

  15. Sneak analysis applied to process systems

    NASA Astrophysics Data System (ADS)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  16. Multiattribute risk analysis in nuclear emergency management.

    PubMed

    Hämäläinen, R P; Lindstedt, M R; Sinkko, K

    2000-08-01

    Radiation protection authorities have seen a potential for applying multiattribute risk analysis in nuclear emergency management and planning to deal with conflicting objectives, different parties involved, and uncertainties. This type of approach is expected to help in the following areas: to ensure that all relevant attributes are considered in decision making; to enhance communication between the concerned parties, including the public; and to provide a method for explicitly including risk analysis in the process. A multiattribute utility theory analysis was used to select a strategy for protecting the population after a simulated nuclear accident. The value-focused approach and the use of a neutral facilitator were identified as being useful. PMID:11051070

  17. Object Oriented Risk Analysis Workshop

    NASA Astrophysics Data System (ADS)

    Pons, M. Güell I.; Jaboyedoff, M.

    2009-04-01

    In the framework of the RISET Project (Interfaculty Network of Support to Education and Technology), an educational tool for introducing risk analysis has been developed. This workshop carries a group of students (in a role-play game) through a step-by-step process of risk identification and quantification. The aim is to assess risk in a characteristic alpine village regarding natural hazards (rockfall, snow avalanche, flooding…) and is oriented to affected objects such as buildings and infrastructure. The workshop contains the following steps: 1. Planning of the study and definition of stakeholders; 2. Hazard identification; 3. Risk analysis; 4. Risk assessment; 5. Proposition of mitigation measures; 6. Risk management and cost-benefit analysis. During the process, information related to past events and useful concepts is provided in order to bring up discussion and decision making. The Risk Matrix and other graphical tools allow a visual representation of the risk level and help to prioritize countermeasures. At the end of the workshop, participants can compare the results between different groups and print out a summarizing report. This approach provides a rapid and comprehensible risk evaluation. The workshop is accessible from the internet and will be used for educational purposes at bachelor and master level as well as for external persons dealing with risk analysis.

  18. Risk analysis and management

    NASA Technical Reports Server (NTRS)

    Smith, H. E.

    1990-01-01

    Present software development accomplishments are indicative of the emerging interest in and increasing efforts to provide risk assessment backbone tools in the manned spacecraft engineering community. There are indications that similar efforts are underway in the chemical process industry and are probably being planned for other high-risk ground-based environments. It appears that complex flight systems intended for extended manned planetary exploration will drive this technology.

  19. Applied spectrophotometry: analysis of a biochemical mixture.

    PubMed

    Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining the biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is, however, a significant difference in determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) as opposed to determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience and also to fill a hole that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three-week laboratory experience designed so that students learn to: connect laboratory practice with theory, apply the Beer-Lambert-Bouguer Law to biochemical analyses, demonstrate the utility and limitations of example quantitative colorimetric assays, demonstrate the utility and limitations of UV analyses for biomolecules, develop strategies for analysis of a solution of unknown biomolecular composition, use digital micropipettors to make accurate and precise measurements, and apply graphing software. PMID:23625877
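    Because Beer-Lambert-Bouguer absorbances add, a mixture can be resolved by measuring at as many wavelengths as there are components and solving a linear system; the sketch below uses hypothetical molar absorptivities, not values from the exercise.

```python
import numpy as np

# Beer-Lambert-Bouguer law for a mixture: A(lambda) = sum_i eps_i(lambda) * l * c_i
# Measuring absorbance at as many wavelengths as components gives a linear system.
# Molar absorptivities below (M^-1 cm^-1) are hypothetical, for illustration only.

eps = np.array([[6600.0,  1500.0],    # wavelength 1: [protein, DNA]
                [ 800.0, 13200.0]])   # wavelength 2: [protein, DNA]
path_length = 1.0                     # cuvette path length, cm
absorbance = np.array([0.45, 0.80])   # measured A at the two wavelengths

concentrations = np.linalg.solve(eps * path_length, absorbance)
for name, c in zip(["protein", "DNA"], concentrations):
    print(f"{name}: {c * 1e6:.1f} micromolar")
```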

  20. Terrestrial wildlife risk assessment for TCDD in land-applied pulp and paper mill sludge

    SciTech Connect

    Meyn, O.; Zeeman, M.; Wise, M.J.; Keane, S.E.

    1997-09-01

    A risk assessment was performed to evaluate the potential effects of 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) in land-applied pulp and paper mill sludge on terrestrial wildlife. Three representative bird and mammal species were assessed for potential individual risk of adverse effects. A dietary model was used to estimate TCDD exposures of adult birds and mammals, and a pharmacokinetic model was used to estimate exposure for avian embryos. Using the quotient method, modeled exposure levels were compared to published no-observed-adverse-effect levels (NOAELs) for birds and mammals to calculate risk. Monte Carlo analysis was used to consider the variability and uncertainty in the risk estimates. The results suggest that TCDD in land-applied pulp and paper sludge may pose significant individual risks to terrestrial wildlife under certain circumstances. Shrews were found to be most at risk due to their high consumption rate of food items that are expected to bioconcentrate the TCDD from soil at the application sites. Of all possible pathways, only dietary exposure was considered in this investigation. The analysis centered on parameter uncertainty and does not include an assessment of alternative models, although this could be a significant source of uncertainty.
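    The quotient method with Monte Carlo variability analysis can be sketched as follows; all distributions, the dietary model, and the NOAEL below are notional stand-ins, not the study's parameters.

```python
# Minimal sketch of the quotient method with Monte Carlo uncertainty analysis.
# All distributions and parameter values are notional, not the study's data.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Dietary exposure model: dose = C_soil * BAF * intake_rate / body_weight
c_soil = rng.lognormal(mean=np.log(5e-6), sigma=0.5, size=n)  # mg TCDD/kg soil
baf = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n)      # bioaccumulation factor
intake = rng.normal(0.009, 0.001, size=n)                     # kg food/day (shrew)
bw = rng.normal(0.02, 0.003, size=n)                          # body weight, kg

dose = c_soil * baf * intake / bw            # mg/kg body weight/day
noael = 1e-6                                 # notional NOAEL, mg/kg/day

hq = dose / noael                            # hazard quotient
print(f"Median HQ: {np.median(hq):.2f}")
print(f"P(HQ > 1): {np.mean(hq > 1):.1%}")
```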

  1. Reliability analysis applied to structural tests

    NASA Technical Reports Server (NTRS)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.

  2. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  3. Risk and value analysis of SETI.

    PubMed

    Billingham, J

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking. PMID:11538075

  4. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  5. Initial Decision and Risk Analysis

    SciTech Connect

    Engel, David W.

    2012-02-29

    Decision and Risk Analysis capabilities will be developed for industry consideration and possible adoption within Year 1. These tools will provide a methodology for merging qualitative ranking of technology maturity and acknowledged risk contributors with quantitative metrics that drive investment decision processes. Methods and tools will be initially introduced as applications to the A650.1 case study, but modular spreadsheets and analysis routines will be offered to industry collaborators as soon as possible to stimulate user feedback and co-development opportunities.

  6. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  7. GROWING NEED FOR RISK ANALYSIS

    EPA Science Inventory

    Risk analysis has been increasingly receiving attention in making environmental decisions. For example, in its May 18, 1993 Combustion Strategy announcement, EPA required that any issuance of a new hazardous waste combustion permit be preceded by the performance of a complete (dir...

  8. Introduction: Conversation Analysis in Applied Linguistics

    ERIC Educational Resources Information Center

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  9. Applying the lessons of high risk industries to health care.

    PubMed

    Hudson, P

    2003-12-01

    High risk industries such as commercial aviation and the oil and gas industry have achieved exemplary safety performance. This paper reviews how they have managed to do that. The primary reasons are positive attitudes towards safety and the operation of effective formal safety management systems. The safety culture provides an important explanation of why such organisations perform well. An evolutionary model of safety culture is provided in which there is a range of cultures from the pathological through the reactive to the calculative. Later, the proactive culture can evolve towards the generative organisation, an alternative description of the high reliability organisation. The current status of health care is reviewed, arguing that it has a much higher level of accidents and a reactive culture, lagging behind the two high risk industries studied in both attitude and systematic management of patient risks. PMID:14645741

  10. Intelligent adversary risk analysis: a bioterrorism risk management model.

    PubMed

    Parnell, Gregory S; Smith, Christopher M; Moxley, Frederick I

    2010-01-01

    The tragic events of 9/11 and the concerns about the potential for a terrorist or hostile state attack with weapons of mass destruction have led to an increased emphasis on risk analysis for homeland security. Uncertain hazards (natural and engineering) have been successfully analyzed using probabilistic risk analysis (PRA). Unlike uncertain hazards, terrorists and hostile states are intelligent adversaries who can observe our vulnerabilities and dynamically adapt their plans and actions to achieve their objectives. This article compares uncertain hazard risk analysis with intelligent adversary risk analysis, describes the intelligent adversary risk analysis challenges, and presents a probabilistic defender-attacker-defender model to evaluate the baseline risk and the potential risk reduction provided by defender investments. The model includes defender decisions prior to an attack; attacker decisions during the attack; defender actions after an attack; and the uncertainties of attack implementation, detection, and consequences. The risk management model is demonstrated with an illustrative bioterrorism problem with notional data. PMID:20002893
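    The defender-attacker-defender logic can be sketched as a small two-stage optimization; the defenses, attacks, costs, and consequences below are notional, echoing the article's use of notional data.

```python
# Minimal defender-attacker-defender sketch with notional numbers: the
# defender picks an investment minimizing the attacker's best-case expected
# consequence; the attacker, observing the defense, maximizes it.

defenses = {"none": 0.0, "detectors": 4.0, "stockpile": 6.0}       # cost, $M
attacks = ["anthrax", "smallpox"]

# Expected consequence ($M) given (defense, attack), folding in detection
# and response uncertainty; purely notional data.
consequence = {("none", "anthrax"): 100.0, ("none", "smallpox"): 150.0,
               ("detectors", "anthrax"): 40.0, ("detectors", "smallpox"): 120.0,
               ("stockpile", "anthrax"): 70.0, ("stockpile", "smallpox"): 30.0}

def total_risk(defense):
    worst_attack = max(attacks, key=lambda a: consequence[(defense, a)])
    return defenses[defense] + consequence[(defense, worst_attack)], worst_attack

best = min(defenses, key=lambda d: total_risk(d)[0])
for d in defenses:
    cost, atk = total_risk(d)
    print(f"defend '{d}': attacker plays '{atk}', total = {cost:.0f} $M")
print(f"Optimal defender investment: '{best}'")
```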

  11. Risk analysis of spent fuel transportation

    SciTech Connect

    Not Available

    1986-01-01

    This book discusses the kinds of judgments that must go into a technical analysis of risk and moves on to the sociopolitical aspects of risk analysis where the same set of facts can be honestly but differently interpreted. Also outlines options available in risk management and reviews courts' involvement with risk analysis.

  12. Experiences of Uav Surveys Applied to Environmental Risk Management

    NASA Astrophysics Data System (ADS)

    Caprioli, M.; Trizzino, R.; Mazzone, F.; Scarano, M.

    2016-06-01

    In this paper the results of some surveys carried out in an area of Apulian territory affected by serious environmental hazard are presented. Unmanned Aerial Vehicles (UAVs) are emerging as a key engineering tool for future environmental survey tasks. UAVs are increasingly seen as an attractive low-cost alternative or supplement to aerial and terrestrial photogrammetry due to their low cost, flexibility, availability and readiness for duty. In addition, UAVs can be operated in hazardous or temporarily inaccessible locations, which makes them very suitable for the assessment and management of environmental risk conditions. In order to verify the reliability of these technologies, a UAV survey and a LIDAR survey have been carried out along about 1 km of coast in the Salento peninsula, near the towns of San Foca, Torre dell'Orso and Sant'Andrea (Lecce, Southern Italy). This area is affected by serious environmental risks due to the presence of dangerous rocky cliffs named falesie. The UAV platform was equipped with a photogrammetric measurement system that allowed us to obtain a mobile mapping of the fractured fronts of the dangerous rocky cliffs. UAV image data have been processed using dedicated software (Agisoft PhotoScan). The point clouds obtained from both the UAV and LIDAR surveys have been processed using CloudCompare software, with the aim of testing the UAV results with respect to the LIDAR ones. The total error obtained was of centimeter order, which is a very satisfactory result. The environmental information has been arranged in an ArcGIS platform in order to assess the risk levels. The possibility to repeat the survey at time intervals more or less close together, depending on the measured levels of risk, and to compare the output allows following the trend of the dangerous phenomena. In conclusion, for inaccessible locations of dangerous rocky bodies the UAV survey coupled with GIS methodology proved to be a key engineering tool for the management of environmental

  13. Putting problem formulation at the forefront of GMO risk analysis.

    PubMed

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups. PMID:23160540

  14. Applying the Gender Lens to Risk Factors and Outcome after Adult Cardiac Surgery

    PubMed Central

    Eifert, Sandra; Guethoff, Sonja; Kaczmarek, Ingo; Beiras-Fernandez, Andres; Seeland, Ute; Gulbins, Helmut; Seeburger, Jörg; Deutsch, Oliver; Jungwirth, Bettina; Katsari, Elpiniki; Dohmen, Pascal; Pfannmueller, Bettina; Hultgren, Rebecka; Schade, Ina; Kublickiene, Karolina; Mohr, Friedrich W.; Gansera, Brigitte

    2014-01-01

    Summary Background Applying the gender lens to risk factors and outcome after adult cardiac surgery is of major clinical interest, as the inclusion of sex and gender in research design and analysis may guarantee more comprehensive cardiovascular science and may consecutively result in a more effective surgical treatment as well as cost savings in cardiac surgery. Methods We have reviewed classical cardiovascular risk factors (diabetes, arterial hypertension, hyperlipidemia, smoking) according to a gender-based approach. Furthermore, we have examined comorbidities such as depression, renal insufficiency, and hormonal influences in regard to gender. Gender-sensitive economic aspects have been evaluated, surgical outcome has been analyzed, and cardiovascular research has been considered from a gender perspective. Results The influence of typical risk factors and outcome after cardiac surgery has been evaluated from a gender perspective, and the gender-specific distribution of these risk factors is reported on. The named comorbidities are listed. Economic aspects demonstrated a gender gap. Outcome after coronary and valvular surgeries as well as after heart transplantation are displayed in this regard. Results after postoperative use of intra-aortic balloon pump are shown. Gender-related aspects of clinical and biomedical cardiosurgical research are reported. Conclusions Female gender has become an independent risk factor of survival after the majority of cardiosurgical procedures. Severely impaired left ventricular ejection fraction independently predicts survival in men, whereas age does in females. PMID:26288584

  15. Statistical Uncertainty Analysis Applied to Criticality Calculation

    SciTech Connect

    Hartini, Entin; Andiwijayakusuma, Dinan; Susmikanti, Mike; Nursinta, A. W.

    2010-06-22

    In this paper, we present an uncertainty methodology based on a statistical approach for assessing uncertainties in criticality prediction using the Monte Carlo method, due to uncertainties in the isotopic composition of the fuel. The methodology has been applied to criticality calculations with MCNP5 with additional stochastic input of the isotopic fuel composition. The stochastic inputs were generated using the Latin hypercube sampling method based on the probability density function of each nuclide composition. The automatic passing of the stochastic inputs to MCNP and the repeated criticality calculation is made possible by using a Python script to link MCNP and our Latin hypercube sampling code.
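    A minimal sketch of the stochastic input generation, assuming normal PDFs per nuclide and SciPy's Latin hypercube sampler; the nuclide fractions are hypothetical, and the script does not reproduce the authors' MCNP linkage.

```python
# Minimal Latin hypercube sampling sketch for stochastic isotopic inputs,
# of the kind that could be written into successive MCNP input decks.
# Nuclide means and uncertainties are hypothetical.
import numpy as np
from scipy.stats import norm, qmc

nuclides = ["U-235", "U-238", "O-16"]
means = np.array([0.045, 0.955, 2.0])      # nominal atom fractions (notional)
stdevs = np.array([0.001, 0.001, 0.01])    # composition uncertainty (notional)

sampler = qmc.LatinHypercube(d=len(nuclides), seed=1)
u = sampler.random(n=100)                  # 100 stratified samples in [0,1)^3
compositions = norm.ppf(u, loc=means, scale=stdevs)

for i, row in enumerate(compositions[:3]):
    print(f"run {i}: " + ", ".join(f"{n}={v:.4f}" for n, v in zip(nuclides, row)))
```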

  16. Cancer Risk Assessment: Should New Science be Applied? Workgroup summary

    SciTech Connect

    Richard J. Bull; Antone L. Brooks

    2002-12-15

    A symposium discussed the implications of certain phenomena observed in radiation biology for cancer risk assessment in general. In July of 2002 a workshop was convened that explored some of the intercellular phenomena that appear to condition responses to carcinogen exposure. Effects that result from communication between cells that appear to either increase the sphere of damage or to modify the sensitivity of cells to further damage were of particular interest. Much of the discussion focused on the effects of ionizing radiation that were transmitted from cells directly hit to cells not receiving direct exposure to radiation (bystander cells). In cell culture, increased rates of mutation, chromosomal aberration, apoptosis, genomic instability, and decreased clonogenic survival have all been observed in cells that have experienced no direct radiation. In addition, there is evidence that low doses of radiation or certain chemicals give rise to adaptive responses in which the treated cells develop resistance to the effects of high doses given in subsequent exposures. Data were presented at the workshop indicating that low dose exposure of animals to radiation and some chemicals frequently reduces the spontaneous rate of mutation in vitro and tumor responses in vivo. Finally, it was concluded that considerable improvement in understanding of how genetic variation may modify the impact of these phenomena is necessary before the risk implications can be fully appreciated. The workshop participants discussed the substantive challenge that these data present with respect to simple linear methodologies that are currently used in cancer risk assessment and attempted to identify broad strategies by which these phenomena may start to be used to refine cancer risk assessment methods in the future.

  17. Applying Personal Genetic Data to Injury Risk Assessment in Athletes

    PubMed Central

    Goodlin, Gabrielle T.; Roos, Andrew K.; Roos, Thomas R.; Hawkins, Claire; Beache, Sydney; Baur, Stephen; Kim, Stuart K.

    2015-01-01

    Recent studies have identified genetic markers associated with risk for certain sports-related injuries and performance-related conditions, with the hope that these markers could be used by individual athletes to personalize their training and diet regimens. We found that we could greatly expand the knowledge base of sports genetic information by using published data originally found in health and disease studies. For example, the results from large genome-wide association studies for low bone mineral density in elderly women can be re-purposed for low bone mineral density in young endurance athletes. In total, we found 124 single-nucleotide polymorphisms associated with: anterior cruciate ligament tear, Achilles tendon injury, low bone mineral density and stress fracture, osteoarthritis, vitamin/mineral deficiencies, and sickle cell trait. Of these single nucleotide polymorphisms, 91% have not previously been used in sports genetics. We conducted a pilot program on fourteen triathletes using this expanded knowledge base of genetic variants associated with sports injury. These athletes were genotyped and educated about how their individual genetic make-up affected their personal risk profile during an hour-long personal consultation. Overall, participants were favorable of the program, found it informative, and most acted upon their genetic results. This pilot program shows that recent genetic research provides valuable information to help reduce sports injuries and to optimize nutrition. There are many genetic studies for health and disease that can be mined to provide useful information to athletes about their individual risk for relevant injuries. PMID:25919592

  18. Evaluating the Risks of Clinical Research: Direct Comparative Analysis

    PubMed Central

    Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David

    2014-01-01

    Abstract Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed a conceptual and normative analysis, and use of an illustrative example. Results: Different risks are composed of the same basic elements: Type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about

  19. Science, Skepticism, and Applied Behavior Analysis

    PubMed Central

    Normand, Matthew P

    2008-01-01

    Pseudoscientific claims concerning medical and psychological treatments of all varieties are commonplace. As behavior analysts, a sound skeptical approach to our science and practice is essential. The present paper offers an overview of science and skepticism and discusses the relationship of skepticism to behavior analysis, with an emphasis on the types of issues concerning behavior analysts in practice. PMID:22477687

  20. Applied Spectrophotometry: Analysis of a Biochemical Mixture

    ERIC Educational Resources Information Center

    Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Lawis routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the…

  1. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  2. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Risk analysis. 75.115...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... analysis or VA's Office of Inspector General conducts an independent risk analysis of the data breach....

  3. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2011-07-01 2011-07-01 false Risk analysis. 75.115...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... analysis or VA's Office of Inspector General conducts an independent risk analysis of the data breach....

  4. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2014-07-01 2014-07-01 false Risk analysis. 75.115...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... analysis or VA's Office of Inspector General conducts an independent risk analysis of the data breach....

  5. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that being a visco-elastic substrate, skin effectively reduces the resolution of measurement of dental detail. Conclusions indicate caution in individualization statements.

  6. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
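    The knowledge-base filtering idea can be sketched as a tiny rule-based alarm suppressor; the causal rules and alarm names are invented for illustration.

```python
# Tiny rule-based alarm-filtering sketch: a knowledge base of causal rules
# suppresses consequence alarms so the operator sees the likely root cause.
# Rules and alarm names are invented for illustration.
RULES = {"coolant_pump_trip": ["low_coolant_flow", "high_core_temp"]}

def filter_alarms(active):
    suppressed = set()
    for root, consequences in RULES.items():
        if root in active:
            suppressed.update(c for c in consequences if c in active)
    return [a for a in active if a not in suppressed], sorted(suppressed)

active = ["coolant_pump_trip", "low_coolant_flow", "high_core_temp"]
primary, hidden = filter_alarms(active)
print("show operator:", primary, "| suppressed:", hidden)
```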

  7. Applying programmatic risk assessment to nuclear materials stabilization R and D planning

    SciTech Connect

    Kenley, C.R.; Brown-van Hoozer, S.A.

    1997-10-01

    A systems engineering approach to programmatic risk assessment, derived from the aerospace industry, was applied to various stabilization technologies to assess their relative maturity and availability for use in stabilizing nuclear materials. The assessment provided valuable information for trading off available technologies and identified the at-risk technologies that will require close tracking by the Department of Energy (DOE) to mitigate programmatic risks.

  8. Toward applied behavior analysis of life aloft.

    PubMed

    Brady, J V

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  9. Toward applied behavior analysis of life aloft

    NASA Technical Reports Server (NTRS)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  10. Tsunamis: Global Exposure and Local Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.

    2014-12-01

    The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic methods (PTHA) are used. The resulting hazard levels for both methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed people live in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damages and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.

  11. North energy system risk analysis features

    NASA Astrophysics Data System (ADS)

    Prokhorov, V. A.; Prokhorov, D. V.

    2015-12-01

    A risk indicator analysis was carried out for a decentralized energy system of the North. Based on an analysis of the damage caused by accidents at energy systems, a structure of risk indicators was selected, and a method for determining North energy system risk was proposed.

  12. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    NASA Technical Reports Server (NTRS)

    Motley, Albert E., III

    2000-01-01

    One of the key elements to successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  13. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
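    Programs of this kind rest on standard recurrence and Poisson exceedance relations, sketched below with illustrative parameter values rather than anything from the program's documentation.

```python
import math

# Standard Poisson seismic-hazard relations of the kind such programs evaluate
# (illustrative a/b values; not taken from the program's documentation).

def gutenberg_richter_rate(m, a=4.0, b=1.0):
    """Annual rate of earthquakes with magnitude >= m (Gutenberg-Richter)."""
    return 10 ** (a - b * m)

def exceedance_probability(annual_rate, years):
    """Probability of at least one exceedance in `years`, Poisson assumption."""
    return 1.0 - math.exp(-annual_rate * years)

rate = gutenberg_richter_rate(6.5)         # events/year with M >= 6.5
p50 = exceedance_probability(rate, 50.0)   # chance of one or more in 50 years
print(f"Annual rate (M>=6.5): {rate:.4f}/yr, 50-year probability: {p50:.1%}")
```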

  14. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. Corrosion damage can cause the HRSG, and hence the power plant, to stop operating; furthermore, it can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 have been used for risk analysis in the HRSG 1. Using this methodology, the risk caused by unexpected failure, as a function of the probability and consequence of failure, can be estimated. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment using the semi-quantitative method of standard API 581 place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
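    A minimal sketch of the semi-quantitative matrix lookup, with assumed band boundaries chosen so that the abstract's 4C and 3C cells land on medium-high and medium risk; API 581's actual banding is more detailed.

```python
# Minimal sketch of a semi-quantitative RBI risk matrix in the API 581 style.
# Band boundaries and the risk-level lookup below are illustrative assumptions.

def risk_cell(probability_category: int, consequence_category: str) -> str:
    """Map a (probability 1-5, consequence A-E) pair to a risk level."""
    cons_rank = "ABCDE".index(consequence_category) + 1
    combined = probability_category + cons_rank  # simple additive banding
    if combined >= 9:
        return "high"
    if combined >= 7:
        return "medium-high"
    if combined >= 5:
        return "medium"
    return "low"

equipment = {"HP superheater": (4, "C"),
             "HP evaporator": (4, "C"),
             "HP economizer": (3, "C")}

for name, (prob, cons) in equipment.items():
    print(f"{name}: {prob}{cons} -> {risk_cell(prob, cons)} risk")
```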

  15. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  16. Robust regression applied to fractal/multifractal analysis.

    NASA Astrophysics Data System (ADS)

    Portilla, F.; Valencia, J. L.; Tarquis, A. M.; Saa-Requejo, A.

    2012-04-01

    Fractals and multifractals are concepts that have grown increasingly popular in recent years in soil analysis, along with the development of fractal models. One of the common steps is to calculate the slope of a linear fit, commonly using the least squares method. This should not be a problem; however, in many situations with experimental data, the researcher has to select the range of scales at which to work, neglecting the remaining points, to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods. With this method we do not have to assume that an outlier is simply an extreme observation drawn from the tail of a normal distribution that does not compromise the validity of the regression results. In this work we have evaluated the capacity of robust regression to select the points of the experimental data to use, trying to avoid subjective choices. Based on this analysis we have developed a new work methodology that involves two basic steps: • Evaluation of the improvement of the linear fit when consecutive points are eliminated, based on the R p-value. In this way we consider the implications of reducing the number of points. • Evaluation of the significance of the slope difference between the fit using the two extreme points and the fit using all available points. We compare the results of applying this methodology and the commonly used least squares method. The data selected for these comparisons come from experimental soil roughness transects and from simulations based on the midpoint displacement method with added trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology. Acknowledgements: Funding provided by CEIGRAM (Research Centre for the Management of Agricultural and Environmental Risks) and by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no
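    For the slope-fitting step, a robust estimator can be compared against ordinary least squares as in the sketch below, which uses the Theil-Sen estimator as one robust option and synthetic data in place of the study's transects.

```python
# Sketch of a robust log-log slope fit for fractal analysis, comparing
# ordinary least squares with the Theil-Sen estimator (one robust option;
# the study's exact estimator and data are not reproduced here).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
log_scale = np.linspace(0, 3, 20)                            # log(box size)
log_count = -1.6 * log_scale + 5 + rng.normal(0, 0.05, 20)   # slope ~ -D
log_count[-2:] += 0.8                                        # outliers at the largest scales

ols = stats.linregress(log_scale, log_count)
ts_slope, ts_intercept, *_ = stats.theilslopes(log_count, log_scale)

print(f"OLS slope:       {ols.slope:.3f}")
print(f"Theil-Sen slope: {ts_slope:.3f}  (less sensitive to the outliers)")
```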

  17. Modeling environmental and human health risks of veterinary medicinal products applied in pond aquaculture.

    PubMed

    Rico, Andreu; Geng, Yue; Focks, Andreas; Van den Brink, Paul J

    2013-04-01

    A model called ERA-AQUA was developed to assess the risks posed by the use of veterinary medicinal products (VMPs) applied in aquaculture ponds for the targeted produce, surrounding aquatic ecosystems, consumers, and trade of the aquaculture produce. The model calculates risks by following a risk quotient approach, calculating predicted exposure concentrations (exposure assessment) and predicted no-effect concentrations (effect assessment) for the endpoint under study. The exposure assessment is performed by combining information on the environmental characteristics of the aquaculture pond, characteristics of the cultured species, aquaculture management practices, and physicochemical properties of the compound under study. The model predicts concentrations of VMPs in the pond water, pond sediment, cultured species, and watercourse receiving pond effluent discharges by mass balance equations. The effect assessment is performed by combining (eco)toxicological information and food safety threshold concentrations for the studied compound. In the present study, the scientific background, strengths, and limitations of the ERA-AQUA model are presented together with a sensitivity analysis and an example showing its potential applications. PMID:23401106
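
    The risk quotient logic described above reduces, at its core, to a ratio of predicted exposure to predicted no-effect concentration. A minimal sketch, with placeholder numbers rather than ERA-AQUA outputs:

        # Risk quotient: RQ = PEC / PNEC; RQ >= 1 flags a potential risk.
        def risk_quotient(pec, pnec):
            return pec / pnec

        pec_water = 3.2e-3       # mg/L, hypothetical predicted concentration in pond water
        pnec_ecosystem = 1.0e-3  # mg/L, hypothetical predicted no-effect concentration
        rq = risk_quotient(pec_water, pnec_ecosystem)
        print(f"RQ = {rq:.1f}: {'potential risk' if rq >= 1 else 'acceptable'}")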

  18. Starlink corn: a risk analysis.

    PubMed Central

    Bucchini, Luca; Goldman, Lynn R

    2002-01-01

    Modern biotechnology has dramatically increased our ability to alter the agronomic traits of plants. Among the novel traits that biotechnology has made available, an important group includes Bacillus thuringiensis-derived insect resistance. This technology has been applied to potatoes, cotton, and corn. Benefits of Bt crops, and biotechnology generally, can be realized only if risks are assessed and managed properly. The case of Starlink corn, a plant modified with a gene that encodes the Bt protein Cry9c, was a severe test of U.S. regulatory agencies. The U.S. Environmental Protection Agency had restricted its use to animal feed due to concern about the potential for allergenicity. However, Starlink corn was later found throughout the human food supply, resulting in food recalls by the Food and Drug Administration and significant disruption of the food supply. Here we examine the regulatory history of Starlink, the assessment framework employed by the U.S. government, assumptions and information gaps, and the key elements of government efforts to manage the product. We explore the impacts on regulations, science, and society and conclude that only significant advances in our understanding of food allergies and improvements in monitoring and enforcement will avoid similar events in the future. Specifically, we need to develop a stronger fundamental basis for predicting allergic sensitization and reactions if novel proteins are to be introduced in this fashion. Mechanisms are needed to assure that worker and community aeroallergen risks are considered. Requirements are needed for the development of valid assays so that enforcement and post-market surveillance activities can be conducted. PMID:11781159

  19. Fuzzy Comprehensive Evaluation Method Applied in the Real Estate Investment Risks Research

    NASA Astrophysics Data System (ADS)

    Zhang, Minli; Yang, Wenpo

    Real estate investment is a high-risk, high-return economic activity; the key to real estate risk analysis is identifying the types of investment risk involved and effectively preventing each type. As the financial crisis sweeps the world, the real estate industry faces enormous risks, and how to evaluate real estate investment risks effectively and correctly has become a concern of many scholars [1]. In this paper, real estate investment risks are summarized and analyzed, comparative analysis methods are discussed, and finally a fuzzy comprehensive evaluation method is presented. The method is not only scientifically grounded in theory but also reliable in application, providing investors with an effective means of real estate investment risk assessment and guidance on risk factors and forecasts.
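
    A minimal sketch of one common form of fuzzy comprehensive evaluation (the weighted-average operator applied to a membership matrix); the paper does not publish its factor set, so the factors, weights, and membership degrees below are invented for illustration.

        import numpy as np

        # Rows: risk factors (market, financing, policy); columns: degree of
        # membership of each factor in the ratings (low, medium, high risk).
        R = np.array([[0.2, 0.5, 0.3],
                      [0.1, 0.3, 0.6],
                      [0.4, 0.4, 0.2]])
        W = np.array([0.5, 0.3, 0.2])  # factor weights, summing to 1

        B = W @ R                      # fuzzy evaluation vector over the ratings
        ratings = ("low", "medium", "high")
        print(dict(zip(ratings, np.round(B, 2))))                  # {'low': 0.21, 'medium': 0.42, 'high': 0.37}
        print("overall risk rating:", ratings[int(np.argmax(B))])  # medium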

  20. Animal Research in the "Journal of Applied Behavior Analysis"

    ERIC Educational Resources Information Center

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  1. B. F. Skinner's contributions to applied behavior analysis

    PubMed Central

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew from his science for application, his descriptions of possible applications, and his own applications to nonhuman and human behavior. Second, we found that he explicitly or implicitly addressed all seven dimensions of applied behavior analysis. These contributions and the dimensions notwithstanding, he neither incorporated the field's scientific (e.g., analytic) and social dimensions (e.g., applied) into any program of published research such that he was its originator, nor did he systematically integrate, advance, and promote the dimensions so as to have been its founder. As the founder of behavior analysis, however, he was the father of applied behavior analysis. PMID:22478444

  2. Some still-current dimensions of applied behavior analysis

    PubMed Central

    Baer, Donald M.; Wolf, Montrose M.; Risley, Todd R.

    1987-01-01

    Twenty years ago, an anthropological note described the current dimensions of applied behavior analysis as it was prescribed and practiced in 1968: It was, or ought to become, applied, behavioral, analytic, technological, conceptual, effective, and capable of appropriately generalized outcomes. A similar anthropological note today finds the same dimensions still prescriptive, and to an increasing extent, descriptive. Several new tactics have become evident, however, some in the realm of conceptual analysis, some in the sociological status of the discipline, and some in its understanding of the necessary systemic nature of any applied discipline that is to operate in the domain of important human behaviors. PMID:16795703

  3. The possibilities of applying a risk-oriented approach to the NPP reliability and safety enhancement problem

    NASA Astrophysics Data System (ADS)

    Komarov, Yu. A.

    2014-10-01

    An analysis and some generalizations of approaches to risk assessments are presented. Interconnection between different interpretations of the "risk" notion is shown, and the possibility of applying the fuzzy set theory to risk assessments is demonstrated. A generalized formulation of the risk assessment notion is proposed in applying risk-oriented approaches to the problem of enhancing reliability and safety in nuclear power engineering. The solution of problems using the developed risk-oriented approaches aimed at achieving more reliable and safe operation of NPPs is described. The results of studies aimed at determining the need (advisability) to modernize/replace NPP elements and systems are presented together with the results obtained from elaborating the methodical principles of introducing the repair concept based on the equipment technical state. The possibility of reducing the scope of tests and altering the NPP systems maintenance strategy is substantiated using the risk-oriented approach. A probabilistic model for estimating the validity of boric acid concentration measurements is developed.

  4. Applying programmatic risk assessment to nuclear materials stabilization R and D planning

    SciTech Connect

    Brown-Van Hoozer, S.A.; Kenley, C.R.

    1997-10-01

    A systems engineering approach to programmatic risk assessment, derived from the aerospace industry, was applied to various stabilization technologies to assess their relative maturity and availability for use in stabilizing nuclear materials. The assessment provided valuable information for trading off available technologies and identified the at-risk technologies that will require close tracking by the Department of Energy (DOE) to mitigate programmatic risks. This paper presents the programmatic risk assessment methodology developed for the 1995 R and D Plan and updated for the 1996 R and D Plan. Results of the 1996 assessment also are presented (DOE/ID-10561, 1996).

  5. Environmental transport in the Oil Shale Risk Analysis.

    PubMed

    Feerer, J L; Gratt, L B

    1983-06-01

    The Oil Shale Risk Analysis differs from similar efforts in coal and nuclear energy in that the industry is not yet developed to a commercial scale. Many assumptions are necessary to predict the future oil shale industry pollutants, the environmental transport of these pollutants, and subsequent human health and environmental effects. The environmental transport analysis in the Oil Shale Risk Analysis is used as an example of applying assumptions to the best available data to predict potential environmental effects of a future commercial industry. The analysis provides information to aid in formulating and managing a program of environmental research focused on reducing uncertainties in critical areas. PMID:6879167

  6. Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.

    PubMed

    Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi

    2015-10-01

    In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. PMID:26010201
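
    The failure probability defined above is a first-passage probability. A crude Monte Carlo sketch, assuming a drifted Brownian risk process and a constant critical level (the article's two dual models and explicit expressions are not reproduced here):

        import numpy as np

        rng = np.random.default_rng(1)
        T, dt, n_paths = 10.0, 0.02, 5000
        steps = int(T / dt)
        level = 5.0  # constant critical risk level

        # Simulate drifted Brownian paths and test whether each crosses the level.
        increments = rng.normal(0.02 * dt, np.sqrt(dt), size=(n_paths, steps))
        paths = np.cumsum(increments, axis=1)
        failed = (paths >= level).any(axis=1)
        print(f"estimated P(failure within [0, {T}]): {failed.mean():.3f}")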

  7. Conceptual issues with risk analysis in Switzerland

    NASA Astrophysics Data System (ADS)

    Nicolet, Pierrick; Jaboyedoff, Michel; Lévy, Sébastien

    2015-04-01

    frequency corresponding to the next return period, to consider each magnitude only once). The consequence of this is that the risk is underestimated between the class bounds, since the 30-year return period then applies from 30 to 100 years, and so on. These examples show that conceptual errors are easily made in risk analysis and affect the results. In addition, even when accounting for the uncertainty of the input variables (e.g., using a Monte-Carlo approach), it is not certain that the fluctuation range assigned to the inputs will be large enough to include the 'correct' output. Furthermore, since calibration data are often not available, and since input variables suffer from deep uncertainties, it is generally difficult to assess the quality of the result, and a conceptual mistake can go unnoticed. In conclusion, the uncertainty assessment needs not only to consider the uncertainty of the inputs but also to carefully review the model structure to ensure a good match with the context.

  8. The Significance of Regional Analysis in Applied Geography.

    ERIC Educational Resources Information Center

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  9. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2003-01-01

    TD64, the Applied Fluid Dynamics Analysis Group, is one of several groups with high-fidelity fluids design and analysis expertise in the Space Transportation Directorate at Marshall Space Flight Center (MSFC). TD64 assists personnel working on other programs. The group participates in projects in the following areas: turbomachinery activities, nozzle activities, combustion devices, and the Columbia accident investigation.

  10. RISK ANALYSIS: CASE HISTORY OF PUCCINIA JACEAE ON YELLOW STARTHISTLE

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Risk analysis has five components: Risk awareness, Risk perception, Risk assessment, Risk management, and Risk communication. Using the case with the foreign plant pathogen, Puccinia jaceae, under evaluation for biological control of yellow starthistle (Centaurea solstitialis, YST), approaches and...

  11. ANALYSIS OF LAMB MORTALITY USING COMPETING RISKS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A competing risks model was used to describe lamb mortality up to four weeks of age in a composite sheep flock with 8,642 lamb records. Discrete survival methods were applied using sire and animal models. The results indicated that substantial variation exists in the risk of lambs dying from diffe...

  12. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  13. Methodology, the matching law, and applied behavior analysis

    PubMed Central

    Vyse, Stuart A.

    1986-01-01

    The practical value of the quantitative analysis of behavior is limited by two methodological characteristics of this area of research: the use of (a) steady-state strategies and (b) relative vs. absolute response rates. Applied behavior analysts are concerned with both transition-state and steady-state behavior, and applied interventions are typically evaluated by their effects on absolute response rates. Quantitative analyses of behavior will have greater practical value when methods are developed for their extension to traditional rate-of-response variables measured across time. Although steady-state and relative-rate-of-response strategies are appropriate to the experimental analysis of many behavioral phenomena, these methods are rarely used by applied behavior analysts and further separate the basic and applied areas. PMID:22478657

  14. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  15. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Chueng, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. The paper demonstrates a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  16. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  17. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2012-07-01 2012-07-01 false Risk analysis. 75.115 Section 75.115 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive personal information that is processed...

  18. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2013-07-01 2013-07-01 false Risk analysis. 75.115 Section 75.115 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive personal information that is processed...

  19. RISK ASSESSMENT FOR BENEFITS ANALYSIS

    EPA Science Inventory

    Among the important types of information considered in decision making at the U.S. Environmental Protection Agency (EPA) are the outputs of risk assessments and benefit-cost analyses. Risk assessments present estimates of the adverse consequences of exposure to environmental poll...

  20. Carbon Fiber Risk Analysis. [conference

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The scope and status of the effort to assess the risks associated with the accidental release of carbon/graphite fibers from civil aircraft is presented. Vulnerability of electrical and electronic equipment to carbon fibers, dispersal of carbon fibers, effectiveness of filtering systems, impact of fiber induced failures, and risk methodology are among the topics covered.

  1. Negative reinforcement in applied behavior analysis: an emerging technology.

    PubMed Central

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas suggests the emergence of an applied technology on negative reinforcement. PMID:3323157

  2. Topographic Avalanche Risk: DEM Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Nazarkulova, Ainura; Strobl, Josef

    2015-04-01

    GIS-based models are frequently used to assess the risk and trigger probabilities of (snow) avalanche releases, based on parameters and geomorphometric derivatives like elevation, exposure, slope, proximity to ridges and local relief energy. Numerous models, and model-based specific applications and project results, have been published based on a variety of approaches and parametrizations as well as calibrations. Digital Elevation Models (DEM) come with many different resolution (scale) and quality (accuracy) properties, some of these resulting from sensor characteristics and DEM generation algorithms, others from different DEM processing workflows and analysis strategies. This paper explores the impact of using different types and characteristics of DEMs for avalanche risk modeling approaches, and aims at establishing a framework for assessing the uncertainty of results. The research question is derived from simply demonstrating the differences in release risk areas and intensities by applying identical models to DEMs with different properties, and then extending this into a broader sensitivity analysis. For the quantification and calibration of uncertainty parameters, different metrics are established, based on simple value ranges, probabilities, as well as fuzzy expressions and fractal metrics. As a specific approach, the work on DEM resolution-dependent 'slope spectra' is being considered and linked with the specific application of geomorphometry-based risk assessment. For the purpose of this study focusing on DEM characteristics, factors like land cover, meteorological recordings and snowpack structure and transformation are kept constant, i.e. not considered explicitly. Key aims of the research presented here are the development of a multi-resolution and multi-scale framework supporting the consistent combination of large area basic risk assessment with local mitigation-oriented studies, and the transferability of the latter into areas without availability of

  3. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failure. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are reinterpreted by this probabilistic modification of FMEA. Using this probabilistic modification, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. PMID:22410502
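
    A minimal sketch of the probabilistic step described above, assuming invented failure modes and frequencies (the paper's NIR case data are not reproduced): occurrence and non-detection enter as relative frequencies, so the expected frequency of an undetected failure is their product, and the per-mode figures sum over a set of modes or the whole procedure.

        # Probabilistic FMEA sketch: frequencies replace categorical O and D scores.
        failure_modes = {
            # name: (occurrence frequency per run, P(failure escapes detection), severity 1-10)
            "wrong reference spectrum": (0.020, 0.10, 8),
            "sample mix-up":            (0.005, 0.50, 9),
            "instrument drift":         (0.010, 0.20, 5),
        }

        total = 0.0
        for name, (occ, miss, sev) in failure_modes.items():
            freq_undetected = occ * miss  # expected undetected failures per run
            total += freq_undetected
            print(f"{name}: {freq_undetected:.4f} undetected failures/run (severity {sev})")
        print(f"full procedure: {total:.4f} undetected failures expected per run")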

  4. Analytic concepts for assessing risk as applied to human space flight

    SciTech Connect

    Garrick, B.J.

    1997-04-30

    Quantitative risk assessment (QRA) principles provide an effective framework for quantifying individual elements of risk, including the risk to astronauts and spacecraft of the radiation environment of space flight. The concept of QRA is based on a structured set of scenarios that could lead to different damage states initiated by either hardware failure, human error, or external events. In the context of a spacecraft risk assessment, radiation may be considered as an external event and analyzed in the same basic way as any other contributor to risk. It is possible to turn up the microscope on any particular contributor to risk and ask more detailed questions than might be necessary to simply assess safety. The methods of QRA allow for as much fine structure in the analysis as is desired. For the purpose of developing a basis for comprehensive risk management and considering the tendency to "fear anything nuclear," radiation risk is a prime candidate for examination beyond that necessary to answer the basic question of risk. Thus, rather than considering only the customary damage states of fatalities or loss of a spacecraft, it is suggested that the full range of damage be analyzed to quantify radiation risk. Radiation dose levels in the form of a risk curve accomplish such a result. If the risk curve is the complementary cumulative distribution function, then it answers the extended question of what is the likelihood of receiving a specific dose of radiation or greater. Such results can be converted to specific health effects as desired. Knowing the full range of the radiation risk of a space mission and the contributors to that risk provides the information necessary to take risk management actions [operational, design, scheduling of missions around solar particle events (SPE), etc.] that clearly control radiation exposure.
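
    A small sketch of the risk-curve idea in the last sentences, assuming a hypothetical lognormal distribution of mission dose (the choice of distribution and all numbers are illustrative only):

        import numpy as np

        rng = np.random.default_rng(2)
        doses = rng.lognormal(mean=0.0, sigma=0.8, size=10_000)  # hypothetical mission doses (cSv)

        def risk_curve(samples, d):
            """Complementary cumulative distribution: P(dose >= d)."""
            return np.mean(samples >= d)

        for d in (0.5, 1.0, 2.0, 4.0):
            print(f"P(dose >= {d} cSv) = {risk_curve(doses, d):.3f}")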

  5. Seismic risk assessment as applied to the Zion Nuclear Generating Station

    SciTech Connect

    Wells, J.

    1984-08-01

    To assist the US Nuclear Regulatory Commission (NRC) in its licensing and evaluation role, the NRC funded the Seismic Safety Margins Research Program (SSMRP) at Lawrence Livermore National Laboratory (LLNL) with the goal of developing tools and data bases to evaluate the risk of earthquake caused radioactive release from a commercial nuclear power plant. This paper describes the SSMRP risk assessment methodology and the results generated by applying this methodology to the Zion Nuclear Generating Station. In addition to describing the failure probabilities and risk values, the effects of assumptions about plant configuration, plant operation, and dependence will be given.

  6. A risk analysis model in concurrent engineering product development.

    PubMed

    Wu, Desheng Dash; Kefan, Xie; Gang, Chen; Ping, Gui

    2010-09-01

    Concurrent engineering has been widely accepted as a viable strategy for companies to reduce time to market and achieve overall cost savings. This article analyzes various risks and challenges in product development under the concurrent engineering environment. A three-dimensional early warning approach for product development risk management is proposed by integrating graphical evaluation and review technique (GERT) and failure modes and effects analysis (FMEA). Simulation models are created to solve our proposed concurrent engineering product development risk management model. Solutions lead to identification of key risk controlling points. This article demonstrates the value of our approach to risk analysis as a means to monitor various risks typical in the manufacturing sector. This article has three main contributions. First, we establish a conceptual framework to classify various risks in concurrent engineering (CE) product development (PD). Second, we propose use of existing quantitative approaches for PD risk analysis purposes: GERT, FMEA, and product database management (PDM). Based on quantitative tools, we create our approach for risk management of CE PD and discuss solutions of the models. Third, we demonstrate the value of applying our approach using data from a typical Chinese motor company. PMID:20840492

  7. Applying Association Rule Discovery Algorithm to Multipoint Linkage Analysis.

    PubMed

    Mitsuhashi; Hishigaki; Takagi

    1997-01-01

    Knowledge discovery in large databases (KDD) is being performed in several application domains, for example, the analysis of sales data, and is expected to be applied to other domains. We propose a KDD approach to multipoint linkage analysis, which is a way of ordering loci on a chromosome. Strict multipoint linkage analysis based on maximum likelihood estimation is a computationally tough problem, and various kinds of approximate methods have been implemented so far. Our method, based on the discovery of associations between genetic recombinations, is so different from the others that it is useful for rechecking their results. In this paper, we describe how to apply the framework of association rule discovery to linkage analysis, and we also argue that filtering the input data and interpreting the discovered rules after mining are as important in practice as the data mining process itself. PMID:11072310
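
    A back-of-envelope sketch of the two basic association-rule measures, support and confidence, on toy transactions; how the authors encode recombination events as items is not described in the abstract and is not reproduced here.

        # Support and confidence over toy "transactions" (sets of items).
        transactions = [
            {"A", "B", "C"},
            {"A", "B"},
            {"A", "C"},
            {"B", "C"},
            {"A", "B", "C"},
        ]

        def support(itemset):
            return sum(itemset <= t for t in transactions) / len(transactions)

        def confidence(lhs, rhs):
            return support(lhs | rhs) / support(lhs)

        print("support({A,B}) =", support({"A", "B"}))           # 0.6
        print("confidence(A -> B) =", confidence({"A"}, {"B"}))  # 0.75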

  8. Carbon Fiber Risk Analysis: Conclusions

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    It was concluded that preliminary estimates indicate that the public risk due to accidental release of carbon fiber from air transport aircraft is small. It was also concluded that further work is required to increase confidence in these estimates.

  9. Applied behavior analysis: New directions from the laboratory

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1983-01-01

    Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior, which occurs as a by-product of contingencies of reinforcement, is discussed. Possible difficulties in treatment and management of induced behaviors are considered. Next, the correlation-based law of effect and the implications of relative reinforcement are explored in terms of applied examples. Relative rate of reinforcement is then extended to the literature dealing with concurrent operants. Concurrent operant models may describe human behavior of applied importance, and several techniques for modification of problem behavior are suggested. As a final concern, the paper discusses several new paradigms. While the practical importance of these models is not clear at the moment, it may be that new practical advantages will soon arise. Thus, it is argued that basic research continues to be of theoretical and practical importance to applied behavior analysis. PMID:22478574

  10. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Wang, Tee-See; Griffin, Lisa; Turner, James E. (Technical Monitor)

    2001-01-01

    This document is a presentation graphic which reviews the activities of the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center (i.e., Code TD64). The group's work focuses on supporting the space transportation programs through Computational Fluid Dynamics tool development, which is driven by hardware design needs. The major applications for the design and analysis tools are: turbines, pumps, propulsion-to-airframe integration, and combustion devices.

  11. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2002-01-01

    This viewgraph report presents an overview of activities and accomplishments of NASA's Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group. Expertise in this group focuses on high-fidelity fluids design and analysis with application to space shuttle propulsion and next generation launch technologies. Topics covered include: computational fluid dynamics research and goals, turbomachinery research and activities, nozzle research and activities, combustion devices, engine systems, MDA development and CFD process improvements.

  12. Bridging the two cultures of risk analysis

    SciTech Connect

    Jasanoff, S. )

    1993-04-01

    During the past 15 years, risk analysis has come of age as an interdisciplinary field of remarkable breadth, nurturing connections among fields as diverse as mathematics, biostatistics, toxicology, and engineering on one hand, and law, psychology, sociology, and economics on the other hand. In this editorial, the author addresses the question: What has the presence of social scientists in the network meant to the substantive development of the field of risk analysis? The answers offered here discuss the substantial progress in bridging the two cultures of risk analysis. Emphasis is placed on the continual need to monitor risk analysis. Topics include: the micro-worlds of risk assessment; constraining assumptions; and exchange programs. 14 refs.

  13. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
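
    A short sketch of the propagation step behind such interval estimates, assuming a hypothetical PV power measurement P = V x I with independent input uncertainties; the instrument numbers are invented, and the root-sum-of-squares rule is the standard one rather than anything specific to this presentation.

        import math

        V, u_V = 35.0, 0.15  # volts and standard uncertainty (hypothetical)
        I, u_I = 8.0, 0.05   # amperes and standard uncertainty (hypothetical)

        P = V * I
        # For a product of independent inputs, relative uncertainties add in quadrature.
        u_P = P * math.sqrt((u_V / V) ** 2 + (u_I / I) ** 2)
        print(f"P = {P:.1f} W +/- {2 * u_P:.1f} W (k=2, approx. 95% interval)")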

  14. Animal research in the Journal of Applied Behavior Analysis.

    PubMed

    Edwards, Timothy L; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say-do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by a chimpanzee, and unsafe trailer entry by horses) in ways that benefited the animals and the people in charge of them, and 1 described the use of trained rats that performed a service to humans (land-mine detection). We suggest that each of these general research areas merits further attention and that the Journal of Applied Behavior Analysis is an appropriate outlet for some of these publications. PMID:21709802

  15. Positive Behavior Support and Applied Behavior Analysis: A Familial Alliance

    ERIC Educational Resources Information Center

    Dunlap, Glen; Carr, Edward G.; Horner, Robert H.; Zarcone, Jennifer R.; Schwartz, Ilene

    2008-01-01

    Positive behavior support (PBS) emerged in the mid-1980s as an approach for understanding and addressing problem behaviors. PBS was derived primarily from applied behavior analysis (ABA). Over time, however, PBS research and practice has incorporated evaluative methods, assessment and intervention procedures, and conceptual perspectives associated…

  16. B. F. Skinner's Contributions to Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew…

  17. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  18. Opportunities for Applied Behavior Analysis in the Total Quality Movement.

    ERIC Educational Resources Information Center

    Redmon, William K.

    1992-01-01

    This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…

  19. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2004-01-01

    This paper presents viewgraphs on NASA Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group Activities. The topics include: 1) Status of programs at MSFC; 2) Fluid Mechanics at MSFC; 3) Relevant Fluid Dynamics Activities at MSFC; and 4) Shuttle Return to Flight.

  20. Context, Cognition, and Biology in Applied Behavior Analysis.

    ERIC Educational Resources Information Center

    Morris, Edward K.

    Behavior analysts are having their professional identities challenged by the roles that cognition and biology are said to play in the conduct and outcome of applied behavior analysis and behavior therapy. For cogniphiliacs, cognition and biology are central to their interventions because cognition and biology are said to reflect various processes,…

  1. Progressive-Ratio Schedules and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  2. Safety analysis, risk assessment, and risk acceptance criteria

    SciTech Connect

    Jamali, K.; Stack, D.W.; Sullivan, L.H.; Sanzo, D.L.

    1997-08-01

    This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, "ensuring" plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is "safe." Use of RACs requires quantitative estimates of consequence frequency and magnitude.

  3. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  4. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
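
    The two abstracts above name two data structures, identification trees and mitigation trees, without detailing them; the sketch below is therefore only a guess at their general shape (threats matched to design elements, mitigations ranked with cost in mind), with hypothetical node fields rather than the published AutSEC representation.

        from dataclasses import dataclass, field

        @dataclass
        class Node:
            label: str
            children: list = field(default_factory=list)

        # Identification tree: candidate threats for one design element.
        ident = Node("data flow crosses a trust boundary", [
            Node("tampering in transit"),
            Node("information disclosure"),
        ])

        # Mitigation tree: countermeasures for one identified threat.
        mitig = Node("tampering in transit", [
            Node("mutual TLS (higher operational cost)"),
            Node("message signing (lower operational cost)"),
        ])

        def leaves(n):
            return [n.label] if not n.children else [x for c in n.children for x in leaves(c)]

        print("threats:", leaves(ident))
        print("mitigations:", leaves(mitig))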

  5. Initial Risk Analysis and Decision Making Framework

    SciTech Connect

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.

  6. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  7. Risk analysis of computer system designs

    NASA Technical Reports Server (NTRS)

    Vallone, A.

    1981-01-01

    Adverse events during implementation can affect final capabilities, schedule and cost of a computer system even though the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to request design revisions or contingency plans in good time, before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with the system design evaluation and enables a meaningful comparison among alternative designs.

  8. Treatment integrity in applied behavior analysis with children.

    PubMed

    Gresham, F M; Gansle, K A; Noell, G H

    1993-01-01

    Functional analysis of behavior depends upon accurate measurement of both independent and dependent variables. Quantifiable and controllable operations that demonstrate these functional relationships are necessary for a science of human behavior. Failure to implement independent variables with integrity threatens the internal and external validity of experiments. A review of all applied behavior analysis studies with children as subjects that have been published in the Journal of Applied Behavior Analysis between 1980 and 1990 found that approximately 16% of these studies measured the accuracy of independent variable implementation. Two thirds of these studies did not operationally define the components of the independent variable. Specific recommendations for improving the accuracy of independent variable implementation and for defining independent variables are discussed. PMID:8331022

  9. Predicting pathogen transport and risk of infection from land-applied biosolids

    NASA Astrophysics Data System (ADS)

    Olson, M. S.; Teng, J.; Kumar, A.; Gurian, P.

    2011-12-01

    Biosolids have been recycled as fertilizer to sustainably improve and maintain productive soils and to stimulate plant growth for over forty years, but may contain low levels of microbial pathogens. The Spreadsheet Microbial Assessment of Risk: Tool for Biosolids ("SMART Biosolids") is an environmental transport, exposure and risk model that compiles knowledge on the occurrence, environmental dispersion and attenuation of biosolids-associated pathogens to estimate microbial risk from biosolids land application. The SMART Biosolids model calculates environmental pathogen concentrations and assesses risk associated with exposure to pathogens from land-applied biosolids through five pathways: 1) inhalation of aerosols from land application sites, 2) consumption of groundwater contaminated by land-applied biosolids, 3) direct ingestion of biosolids-amended soils, 4) ingestion of plants contaminated by land-applied biosolids, and 5) consumption of surface water contaminated by runoff from a land application site. The SMART Biosolids model can be applied under a variety of scenarios, thereby providing insight into effective management practices. This study presents example results of the SMART Biosolids model, focusing on the groundwater and surface water pathways, following biosolids application to a typical site in Michigan. Volumes of infiltration and surface water runoff are calculated following a 100-year storm event. Pathogen transport and attenuation through the subsurface and via surface runoff are modeled, and pathogen concentrations in a downstream well and an adjacent pond are calculated. Risks are calculated for residents of nearby properties. For a 100-year storm event occurring immediately after biosolids application, the surface water pathway produces risks that may be of some concern, but best estimates do not exceed the bounds of what has been considered acceptable risk for recreational water use (Table 1); groundwater risks are very uncertain and at the
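
    A hedged sketch of the final step in a pathway such as the surface water one above: turning an ingested pathogen dose into an infection probability with an exponential dose-response model. This model form is a common choice in quantitative microbial risk assessment; whether SMART Biosolids uses it is not stated in the abstract, and all parameter values are invented.

        import math

        def p_infection(dose, r):
            """Exponential dose-response: P(infection) = 1 - exp(-r * dose)."""
            return 1.0 - math.exp(-r * dose)

        conc_water = 0.02  # organisms/L in the receiving pond (hypothetical)
        intake = 0.05      # L ingested in one recreation event (hypothetical)
        r = 0.005          # pathogen-specific dose-response parameter (hypothetical)

        dose = conc_water * intake
        print(f"per-event infection risk: {p_infection(dose, r):.2e}")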

  10. DETERMINING SIGNIFICANT ENDPOINTS FOR ECOLOGICAL RISK ANALYSIS

    EPA Science Inventory

    Risk analyses, both human health and ecological, will be important factors in determining which DOE sites should be cleaned up and in deciding if acceptable performance standards have been met. Risk analysis procedures for humans use the individual as the 'unit' of observation, a...

  11. CUMULATIVE RISK ANALYSIS FOR ORGANOPHOSPHORUS PESTICIDES

    EPA Science Inventory

    Cumulative Risk Analysis for Organophosphorus Pesticides
    R. Woodrow Setzer, Jr. NHEERL MD-74, USEPA, RTP, NC 27711

    The US EPA has recently completed a risk assessment of the effects of exposure to 33 organophosphorous pesticides (OPs) through the diet, water, and resi...

  12. Risk analysis approach. [of carbon fiber release

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    The assessment of the carbon fiber hazard is outlined. Program objectives, requirements of the risk analysis, and elements associated with the physical phenomena of the accidental release are described.

  13. Risk analysis of dust explosion scenarios using Bayesian networks.

    PubMed

    Yuan, Zhi; Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-02-01

    In this study, a methodology has been proposed for risk analysis of dust explosion scenarios based on Bayesian network. Our methodology also benefits from a bow-tie diagram to better represent the logical relationships existing among contributing factors and consequences of dust explosions. In this study, the risks of dust explosion scenarios are evaluated, taking into account common cause failures and dependencies among root events and possible consequences. Using a diagnostic analysis, dust particle properties, oxygen concentration, and safety training of staff are identified as the most critical root events leading to dust explosions. The probability adaptation concept is also used for sequential updating and thus learning from past dust explosion accidents, which is of great importance in dynamic risk assessment and management. We also apply the proposed methodology to a case study to model dust explosion scenarios, to estimate the envisaged risks, and to identify the vulnerable parts of the system that need additional safety measures. PMID:25264172
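
    A toy numerical sketch of the diagnostic reasoning described above, using a single root event and Bayes' rule rather than the paper's full bow-tie Bayesian network; all probabilities are invented.

        # Two-node fragment: root event (ignition source) -> top event (explosion).
        p_ign = 0.10                # prior P(ignition source present)
        p_exp_given_ign = 0.30      # P(explosion | ignition source)
        p_exp_given_no_ign = 0.001  # P(explosion | no ignition source)

        p_exp = p_exp_given_ign * p_ign + p_exp_given_no_ign * (1 - p_ign)

        # Diagnostic query: given that an explosion occurred, update the root event.
        p_ign_given_exp = p_exp_given_ign * p_ign / p_exp
        print(f"P(explosion) = {p_exp:.4f}")                       # 0.0309
        print(f"P(ignition | explosion) = {p_ign_given_exp:.3f}")  # ~0.971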

  14. Risk analysis in bioequivalence and biowaiver decisions.

    PubMed

    Kubbinga, Marlies; Langguth, Peter; Barends, Dirk

    2013-07-01

    This article evaluates the current biowaiver guidance documents published by the FDA, EU and WHO from a risk-based perspective. The authors introduce the use of a Failure Mode and Effect Analysis (FMEA) risk calculation tool to show that current regulatory documents implicitly limit the risk of bioinequivalence after granting a biowaiver by reducing the incidence, improving the detection, and limiting the severity of any unforeseen bioinequivalent product. In addition, the authors use the risk calculation to expose as yet unexplored options for future extension of comparative in vitro tools for biowaivers. PMID:23280474

  15. RISK ANALYSIS, ANALYSIS OF VARIANCE: GETTING MORE FROM OUR DATA

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Analysis of variance (ANOVA) and regression are common statistical techniques used to analyze agronomic experimental data and determine significant differences among yields due to treatments or other experimental factors. Risk analysis provides an alternate and complementary examination of the same...

  16. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  17. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    PubMed

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015 The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists. PMID:26853375

  18. Failure risk assessment by analysis and testing

    NASA Technical Reports Server (NTRS)

    Moore, N.; Ebbeler, D.; Creager, M.

    1992-01-01

    The sources of information on which to base an evaluation of reliability or failure risk of an aerospace flight system are (1) experience from tests and flights and (2) engineering analysis. It is rarely feasible to establish high reliability at high confidence by testing aerospace systems or components. Moreover, failure prediction by conventional, deterministic methods of engineering analysis can become arbitrary and subject to serious misinterpretation when uncertain or approximate information is used to establish analysis parameter values and to calibrate the accuracy of engineering models. The limitations of testing to evaluate failure risk are discussed, and a statistical approach which incorporates both engineering analysis and testing is presented.

  19. Applied behavior analysis at West Virginia University: A brief history.

    PubMed

    Hawkins, R P; Chase, P N; Scotti, J R

    1993-01-01

    The development of an emphasis on applied behavior analysis in the Department of Psychology at West Virginia University is traced. The emphasis began primarily in the early 1970s, under the leadership of Roger Maley and Jon Krapfl, and has continued to expand and evolve with the participation of numerous behavior analysts and behavior therapists, both inside and outside the department. The development has been facilitated by several factors: establishment of a strong behavioral emphasis in the three Clinical graduate programs; change of the graduate program in Experimental Psychology to a program in basic Behavior Analysis; development of nonclinical applied behavior analysis within the Behavior Analysis program; establishment of a joint graduate program with Educational Psychology; establishment of a Community/Systems graduate program; and organization of numerous conferences. Several factors are described that seem to assure a stable role for behavior analysis in the department: a stable and supportive "culture" within the department; American Psychological Association accreditation of the clinical training; a good reputation both within the university and in psychology; and a broader community of behavior analysts and behavior therapists. PMID:16795816

  20. Fuzzy risk analysis for nuclear safeguards

    SciTech Connect

    Zardecki, A.

    1993-05-01

    Analysis of a safeguards system, based on the notion of fuzzy sets and linguistic variables, addresses concerns such as complexity and inherent imprecision in estimating the possibility of loss or compromise. The automated risk analysis allows the risk to be determined for an entire system based on estimates for the lowest-level components and the component proportion. In addition, for each component (asset) the most effective combination of protection mechanisms against a given set of threats is determined. A distinction between bare and featured risk is made.
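    A minimal sketch of the fuzzy-set idea: linguistic risk levels represented as triangular membership functions, with lowest-level component risks aggregated by their proportions into a system-level risk. The membership functions, component estimates, and weights below are illustrative assumptions, not those of the cited safeguards system.

    ```python
    # Sketch of fuzzy risk aggregation with linguistic variables.
    # All membership functions and component estimates are illustrative.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b on support [a, c]."""
        left = (x - a) / (b - a) if b > a else 1.0
        right = (c - x) / (c - b) if c > b else 1.0
        return max(0.0, min(left, right))

    levels = {"low": (0.0, 0.0, 0.5), "medium": (0.0, 0.5, 1.0), "high": (0.5, 1.0, 1.0)}

    # Lowest-level component risk estimates and their system proportions.
    components = [(0.2, 0.5), (0.7, 0.3), (0.4, 0.2)]  # (risk, proportion)
    system_risk = sum(r * w for r, w in components)

    for name, (a, b, c) in levels.items():
        print(f"membership in '{name}': {tri(system_risk, a, b, c):.2f}")
    ```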

  1. Fuzzy risk analysis for nuclear safeguards

    SciTech Connect

    Zardecki, A.

    1993-01-01

    Analysis of a safeguards system, based on the notion of fuzzy sets and linguistic variables, addresses concerns such as complexity and inherent imprecision in estimating the possibility of loss or compromise. The automated risk analysis allows the risk to be determined for an entire system based on estimates for the lowest-level components and the component proportion. In addition, for each component (asset) the most effective combination of protection mechanisms against a given set of threats is determined. A distinction between bare and featured risk is made.

  2. RiskSOAP: Introducing and applying a methodology of risk self-awareness in road tunnel safety.

    PubMed

    Chatzimichailidou, Maria Mikela; Dokas, Ioannis M

    2016-05-01

    Complex socio-technical systems, such as road tunnels, can be designed and developed with more or fewer elements that can either positively or negatively affect the capability of their agents to recognise imminent threats or vulnerabilities that possibly lead to accidents. This capability is called risk Situation Awareness (SA) provision. Motivated by the introduction of better tools for designing and developing systems that are self-aware of their vulnerabilities and react to prevent accidents and losses, this paper introduces the Risk Situation Awareness Provision (RiskSOAP) methodology to the field of road tunnel safety, as a means to measure this capability in this kind of system. The main objective is to test the soundness and the applicability of RiskSOAP to infrastructure which is advanced in terms of technology, human integration, and the minimum number of safety requirements imposed by international bodies. RiskSOAP is applied to a specific road tunnel in Greece and the accompanying indicator is calculated twice, once for the tunnel design as defined by updated European safety standards and once for the 'as-is' tunnel composition, which complies with the necessary safety requirements but calls for enhancing safety according to what the EU and PIARC further suggest. The derived values indicate the extent to which each tunnel version is capable of comprehending its threats and vulnerabilities based on its elements. The former tunnel version appears to be more enhanced both in terms of its risk awareness capability and its safety. Another interesting finding is that, despite the advanced tunnel safety specifications, there is still room for enriching the safe design and maintenance of the road tunnel. PMID:26938583

  3. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
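    The move from deterministic to probabilistic analysis amounts to sampling uncertain inputs and propagating each sample through the performance model. The minimal Monte Carlo sketch below uses a stand-in power model with invented distributions; it is not SPACE and the values are not ISS data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000  # Monte Carlo samples

    # Uncertain inputs (illustrative distributions, not ISS values):
    solar_flux = rng.normal(1361.0, 5.0, n)   # W/m^2
    array_eff = rng.normal(0.14, 0.005, n)    # solar array efficiency
    degradation = rng.uniform(0.95, 1.00, n)  # seasonal/aging factor
    area = 300.0                              # m^2, fixed

    # Stand-in power-capability model (the real model is far more detailed).
    power = solar_flux * area * array_eff * degradation

    lo, med, hi = np.percentile(power, [5, 50, 95])
    print(f"power capability: median {med/1000:.1f} kW, "
          f"90% interval [{lo/1000:.1f}, {hi/1000:.1f}] kW")
    ```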

  4. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection.

    PubMed

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658

  5. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection

    PubMed Central

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658
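    QMRA couples an exposure estimate with a dose-response model; for inhaled pathogens a common choice is the exponential model P(infection) = 1 - exp(-k * dose). The sketch below shows the mechanics with illustrative concentrations, exposure times, and a k value that are assumptions for demonstration, not the study's data.

    ```python
    import math

    # Illustrative QMRA calculation for inhaled bioaerosol exposure.
    # All numbers are assumptions, chosen only to echo the qualitative
    # ordering reported above (toilets > wastewater plants > landfills).

    k = 0.4172                     # exponential dose-response parameter (assumed)
    breathing_rate_m3_per_h = 1.5  # adult at light activity

    scenarios = {
        # setting: (concentration [infectious units/m^3], exposure hours)
        "toilet": (10.0, 0.25),
        "wastewater treatment plant": (0.1, 8.0),
        "solid waste landfill": (0.05, 8.0),
    }

    for setting, (conc, hours) in scenarios.items():
        dose = conc * breathing_rate_m3_per_h * hours
        p_inf = 1.0 - math.exp(-k * dose)
        print(f"{setting}: dose = {dose:.2f}, P(infection) = {p_inf:.2f}")
    ```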

  6. Pathogen risk assessment of land applied wastewater and biosolids: A fuzzy set approach

    SciTech Connect

    Dahab, M.F.; Fuerhacker, M.; Zibuschka, F.

    1998-07-01

    There are major concerns associated with land application of wastewater and biosolids, including the potential risk to public health from water-borne pollutants that may enter the food chain and from pathogens that may be present in the wastewater. These risks are of particular concern when wastewater is applied to land where crops are grown as part of the human food chain or when direct human contact with the wastewater may occur. In many communities, toxic chemicals may not be present in the biosolids, or their concentrations may be reduced through source control measures. However, pathogens that enter wastewater from infected individuals cannot be controlled at the source and are often found in wastewater or biosolids applied to land. Public health officials have emphasized that microbial pathogens (or pathogen indicators) should not occur in areas where exposure to humans is likely. Under this criterion, the concept of risk assessment, which requires the characterization of the occurrence of pathogens, almost seems to be contradictory to basic public health goals. As the understanding of pathogen and pathogen indicator occurrence becomes better refined, the arguments for finding practical application of risk assessment for pathogenic organisms become more compelling.

  7. Probabilistic risk assessment of veterinary medicines applied to four major aquaculture species produced in Asia.

    PubMed

    Rico, Andreu; Van den Brink, Paul J

    2014-01-15

    Aquaculture production constitutes one of the main sources of pollution with veterinary medicines into the environment. About 90% of the global aquaculture production is produced in Asia and the potential environmental risks associated with the use of veterinary medicines in Asian aquaculture have not yet been properly evaluated. In this study we performed a probabilistic risk assessment for eight different aquaculture production scenarios in Asia by combining up-to-date information on the use of veterinary medicines and aquaculture production characteristics. The ERA-AQUA model was used to perform mass balances of veterinary medicinal treatments applied to aquaculture ponds and to characterize risks for primary producers, invertebrates, and fish potentially exposed to chemical residues through aquaculture effluents. The mass balance calculations showed that, on average, about 25% of the applied drug mass to aquaculture ponds is released into the environment, although this percentage varies with the chemical's properties, the mode of application, the cultured species density, and the water exchange rates in the aquaculture pond scenario. In general, the highest potential environmental risks were calculated for parasitic treatments, followed by disinfection and antibiotic treatments. Pangasius catfish production in Vietnam, followed by shrimp production in China, constitute possible hot-spots for environmental pollution due to the intensity of the aquaculture production and considerable discharge of toxic chemical residues into surrounding aquatic ecosystems. A risk-based ranking of compounds is provided for each of the evaluated scenarios, which offers crucial information for conducting further chemical and biological field and laboratory monitoring research. In addition, we discuss general knowledge gaps and research priorities for performing refined risk assessments of aquaculture medicines in the near future. PMID:24061054
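    The mass-balance and risk-characterization logic can be sketched as: estimate the fraction of an applied dose discharged in effluent, derive a predicted environmental concentration (PEC), and divide by a predicted no-effect concentration (PNEC) to obtain a risk quotient per taxonomic group. The parameter values below are illustrative assumptions, not ERA-AQUA outputs.

    ```python
    # Illustrative risk-quotient calculation in the spirit of the mass-balance
    # approach described above. All values are assumptions for demonstration.

    applied_mass_g = 1000.0       # drug mass applied to the pond
    released_fraction = 0.25      # ~25% released on average, per the abstract
    effluent_volume_m3 = 5000.0   # water discharged over the treatment period

    # PEC in mg/L: grams -> mg, cubic metres -> litres.
    pec_mg_per_L = (applied_mass_g * 1000.0 * released_fraction
                    / (effluent_volume_m3 * 1000.0))

    pnec_mg_per_L = {             # illustrative no-effect concentrations
        "primary producers": 0.05,
        "invertebrates": 0.01,
        "fish": 0.10,
    }

    for group, pnec in pnec_mg_per_L.items():
        rq = pec_mg_per_L / pnec
        flag = "potential risk" if rq > 1 else "low risk"
        print(f"{group}: RQ = {rq:.2f} ({flag})")
    ```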

  8. Activity anorexia: An interplay between basic and applied behavior analysis

    PubMed Central

    Pierce, W. David; Epling, W. Frank; Dews, Peter B.; Estes, William K.; Morse, William H.; Van Orman, Willard; Herrnstein, Richard J.

    1994-01-01

    The relationship between basic research with nonhumans and applied behavior analysis is illustrated by our work on activity anorexia. When rats are fed one meal a day and allowed to run on an activity wheel, they run excessively, stop eating, and die of starvation. Convergent evidence, from several different research areas, indicates that the behavior of these animals and humans who self-starve is functionally similar. A biobehavioral theory of activity anorexia is presented that details the cultural contingencies, behavioral processes, and physiology of anorexia. Diagnostic criteria and a three-stage treatment program for activity-based anorexia are outlined. The animal model permits basic research on anorexia that for practical and ethical reasons cannot be conducted with humans. Thus, basic research can have applied importance. PMID:22478169

  9. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enable development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. Quantitative assessment approach provides useful risk mitigation information.

  10. Cladistic analysis applied to the classification of volcanoes

    NASA Astrophysics Data System (ADS)

    Hone, D. W. E.; Mahony, S. H.; Sparks, R. S. J.; Martin, K. T.

    2007-11-01

    Cladistics is a systematic method of classification that groups entities on the basis of sharing similar characteristics in the most parsimonious manner. Here cladistics is applied to the classification of volcanoes using a dataset of 59 Quaternary volcanoes and 129 volcanic edifices of the Tohoku region, Northeast Japan. Volcano and edifice characteristics recorded in the database include attributes of volcano size, chemical composition, dominant eruptive products, volcano morphology, dominant landforms, volcano age and eruptive history. Without characteristics related to time the volcanic edifices divide into two groups, with characters related to volcano size, dominant composition and edifice morphology being the most diagnostic. Analysis including time-based characteristics yields four groups, with a good correlation between these groups and the two groups from the analysis without time for 108 out of 129 volcanic edifices. Thus when characters are slightly changed the volcanoes still form similar groupings. Analysis of the volcanoes both with and without time yields three groups based on compositional, eruptive products and morphological characters. Spatial clusters of volcanic centres have been recognised in the Tohoku region by Tamura et al. (Earth Planet Sci Lett 197:105-106, 2002). The groups identified by cladistic analysis are distributed unevenly between the clusters, indicating a tendency for individual clusters to form similar kinds of volcanoes with distinctive but coherent styles of volcanism. Uneven distribution of volcano types between clusters can be explained by variations in dominant magma compositions through time, which are reflected in eruption products and volcanic landforms. Cladistic analysis can be a useful tool for elucidating dynamic igneous processes that could be applied to other regions and globally. Our exploratory study indicates that cladistics has promise as a method for classifying volcanoes and potentially elucidating dynamic igneous processes.

  11. Applying Atherosclerotic Risk Prevention Guidelines to Elderly Patients: A Bridge Too Far?

    PubMed

    Feldman, Ross D; Harris, Stewart B; Hegele, Robert A; Pickering, J Geoffrey; Rockwood, Kenneth

    2016-05-01

    The primary prevention of atherosclerotic disease is based on optimal management of the major risk factors. For the major risk factors of diabetes, hypertension, and dyslipidemia, management for most patients is based on well developed and extensive evidence-based diagnostic and therapeutic guidelines. However, for a growing segment of the population who are at the highest risk for atherosclerotic disease (ie, older adults), the application of these guidelines is problematic. First, few studies that form the evidence base for these primary prevention guidelines actually include substantial numbers of elderly subjects. Second, elderly patients represent a special population from multiple perspectives related to their accumulation of health deficits and their development of frailty. These patients with frailty and multiple comorbidities have been mostly excluded from the primary prevention studies upon which the guidelines are based, yet they comprise a very significant proportion of the very elderly population. Third, elderly people are most at risk of adverse drug reactions because of the increasing number of medications prescribed in this patient population. When applying the existing guidelines to elderly people, the limitations of our knowledge must be recognized regarding how best to mitigate the high risk of heart disease in our aging population and how to generalize these recommendations to the management of the largest subgroup of elderly patients (ie, those with multiple comorbidities and frail older adults). PMID:27040095

  12. A risk assessment tool applied to the study of shale gas resources.

    PubMed

    Veiguela, Miguel; Hurtado, Antonio; Eguilior, Sonsoles; Recreo, Fernando; Roqueñi, Nieves; Loredo, Jorge

    2016-11-15

    A risk assessment tool with the capacity to evaluate the risks to health, safety and the environment (HSE) from extraction of non-conventional fossil fuel resources by the hydraulic fracturing (fracking) technique can be useful for boosting development and progress of the technology and for winning public trust and acceptance. At the early project stages, the lack of data related to the selection of non-conventional gas deposits makes it difficult to use existing approaches to risk assessment of fluids injected into geologic formations. The qualitative risk assessment tool developed in this work is based on the premise that shale gas exploitation risk depends on both the geologic site and the technological aspects. It follows Oldenburg's 'Screening and Ranking Framework (SRF)', developed to evaluate potential geologic carbon dioxide (CO2) storage sites. Two global characteristics, (1) characteristics centered on the natural aspects of the site and (2) characteristics centered on the technological aspects of the project, are evaluated through user input of property values, which define attributes, which in turn define the characteristics. In order to carry out an individual evaluation of each of the characteristics and the elements of the model, the tool has been implemented in a spreadsheet. The proposed model has been applied to a site with potential for the exploitation of shale gas in Asturias (northwestern Spain) with three different technological options to test the approach. PMID:27453140
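    The property -> attribute -> characteristic roll-up lends itself to a simple hierarchical aggregation, as in the sketch below. The structure, names, and scores are invented for illustration and do not reproduce the published tool.

    ```python
    # Sketch of a hierarchical screening score: user-entered property values
    # roll up into attributes, which roll up into the two characteristics.
    # Structure, names, and scores are illustrative assumptions.

    site = {
        "natural": {                      # characteristic 1: geologic site
            "seal integrity": 0.8,        # attribute scores in [0, 1]
            "seismicity": 0.6,
            "aquifer separation": 0.7,
        },
        "technological": {                # characteristic 2: project aspects
            "well design": 0.9,
            "fracturing fluid management": 0.5,
            "monitoring plan": 0.6,
        },
    }

    def characteristic_score(attributes):
        """Unweighted mean of attribute scores (a deliberate simplification)."""
        return sum(attributes.values()) / len(attributes)

    for name, attrs in site.items():
        print(f"{name} characteristic: {characteristic_score(attrs):.2f}")
    ```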

  13. Applying MORT maintenance safety analysis in Finnish industry

    NASA Astrophysics Data System (ADS)

    Ruuhilehto, Kaarin; Virolainen, Kimmo

    1992-02-01

    A safety analysis method based on the MORT (Management Oversight and Risk Tree) method, especially on the version developed for safety considerations in the evaluation of maintenance programs, is presented. The MORT maintenance safety analysis is intended especially for use in maintenance safety management. The analysis helps managers evaluate the goals of their safety work and the measures taken to reach them. The analysis is done by a team or teams. The team ought to have expert knowledge of the organization both vertically and horizontally in order to be able to identify factors that may contribute to accidents or other interruptions in the maintenance work. Identification is made by using the MORT maintenance key question set as a checklist. The questions check the way safety matters are connected with maintenance planning and managing, as well as the safety management itself. In the second stage, means to eliminate the factors causing problems are developed. New practices are established to improve the safety of maintenance planning and managing in the enterprise.

  14. Scanning proton microprobe analysis applied to wood and bark samples

    NASA Astrophysics Data System (ADS)

    Lövestam, N. E. G.; Johansson, E.-M.; Johansson, S. A. E.; Pallon, J.

    1990-04-01

    In this study the feasibility of applying scanning micro-PIXE to analysis of wood and bark samples is demonstrated. Elemental mapping of the analysed sections show the patterns of Cl, K, Ca, Mn, Fe, Cu and Zn. Some of these patterns can be related to the annual tree ring structure. It is observed that the variation of elements having an environmental character can be rather large within a single tree ring, thus illuminating possible difficulties when using tree ring sections as a pollution monitor. The variations in elemental concentrations when crossing from bark to wood are also shown to be smooth for some elements but rather abrupt for others.

  15. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and therefore, it is necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical and probabilistic bases. The basic element is attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, (40)K contaminant measurements are adapted for risk assessment application after derivation of necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the forms of maps and three-dimensional surfaces. PMID:21571428
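    The core step, attaching a pdf to a contaminant measurement sequence and reading risk off its tail, can be sketched as follows. The lognormal choice and the synthetic (40)K activities are illustrative assumptions, not the paper's data or formulation.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Synthetic stand-in for a sequence of (40)K activity measurements (Bq/kg).
    measurements = rng.lognormal(mean=6.0, sigma=0.4, size=200)

    # Attach a convenient pdf to the sequence (lognormal assumed here).
    shape, loc, scale = stats.lognorm.fit(measurements, floc=0)

    # Express risk as the probability of exceeding a threshold activity.
    threshold = 800.0  # Bq/kg, illustrative
    p_exceed = stats.lognorm.sf(threshold, shape, loc=loc, scale=scale)
    print(f"P(activity > {threshold} Bq/kg) = {p_exceed:.3f}")
    ```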

  16. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    NASA Astrophysics Data System (ADS)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.
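    At its core, an RBI calculation of this kind combines a time-dependent probability of failure with a consequence of failure to rank risk and set the next inspection date. The sketch below is a toy thinning-driven version: the PoF model, rates, thicknesses, and cost figures are invented assumptions, not the DNV/API models.

    ```python
    import math

    # Toy RBI sketch: corrosion-driven probability of failure grows as the
    # tank floor thins; inspect before annualised risk crosses a target.
    # All rates, thicknesses, and the logistic PoF model are assumptions.

    thickness_mm = 8.0          # current measured floor thickness
    min_thickness_mm = 4.0      # retirement thickness
    corrosion_rate_mm_yr = 0.3  # combined soil-side + product-side thinning
    cof_usd = 2.0e6             # consequence of failure (cleanup + downtime)
    risk_target_usd = 1.0e4     # acceptable annualised risk

    def pof(years):
        """Toy probability of failure rising as remaining margin is consumed."""
        margin = thickness_mm - min_thickness_mm - corrosion_rate_mm_yr * years
        return 1.0 / (1.0 + math.exp(6.0 * margin))  # logistic in the margin

    year = 0
    while pof(year) * cof_usd < risk_target_usd:
        year += 1
    print(f"risk target reached at ~{year} years -> inspect before then")
    ```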

  17. Risk assessment applied to air emissions from a medium-sized Italian MSW incinerator.

    PubMed

    Morselli, Luciano; Passarini, Fabrizio; Piccari, Laura; Vassura, Ivano; Bernardi, Elena

    2011-10-01

    Risk assessment is a well established procedure for the analysis of the adverse impacts of pollutant substances emitted by waste treatment plants. The aim of the present study was the determination of the impact on human health associated with the activities of an incinerator in the Emilia-Romagna region (Northern Italy). The dispersion of heavy metals and organic pollutants monitored at plant stacks was predicted by the Gaussian model ISC3 (US-EPA). This analysis led to the estimation of risk, connected with various pollutants showing toxic and carcinogenic activities, for different receptors. The values obtained were first compared with the acceptability limits set by US-EPA, and then graphically represented as a territorial dispersion. A cautious approach was followed to calculate risk, by considering the worst, albeit realistic and reliable, estimate for the different parameters. The calculated exposure pathways resulted in different contributions depending on the receptor category (children and adults), even if direct exposure (via inhalation) is generally predominant. However, the resulting risk for both single pollutants studied and their combination all together proved to be within the acceptable limits (all lifetime individual risks being below 10(-6)), according to the procedure followed. The obtained results highlight the importance of using reliable monitoring data on the studied contamination source and, in particular, suggest the advisability of a more in-depth study on the pollution from incineration stacks. PMID:20813764
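    For the inhalation pathway, the individual lifetime cancer risk is typically the product of a modelled air concentration, intake factors, and a slope factor. The sketch below shows the standard arithmetic with illustrative parameter values, not the study's monitoring data.

    ```python
    # Illustrative lifetime cancer risk via the inhalation pathway.
    # All parameter values are textbook-style assumptions.

    c_air = 2.0e-7          # modelled pollutant concentration (mg/m^3)
    inhalation_rate = 20.0  # m^3/day (adult)
    ef = 350.0              # exposure frequency (days/year)
    ed = 30.0               # exposure duration (years)
    bw = 70.0               # body weight (kg)
    at = 70.0 * 365.0       # averaging time for carcinogens (days)
    slope_factor = 6.3      # (mg/kg-day)^-1, illustrative

    chronic_daily_intake = c_air * inhalation_rate * ef * ed / (bw * at)
    risk = chronic_daily_intake * slope_factor
    print(f"lifetime individual risk = {risk:.2e}  (acceptable if < 1e-6)")
    ```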

  18. Risk Interfaces to Support Integrated Systems Analysis and Development

    NASA Technical Reports Server (NTRS)

    Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark; Anton, Wilma; Havenhill, Maria

    2016-01-01

    Objectives for systems analysis capability: develop an integrated understanding of how a complex human physiological-socio-technical mission system behaves in spaceflight. Why? To support development of integrated solutions that prevent unwanted outcomes (implementable approaches to minimize mission resources: mass, power, crew time, etc.) and to support development of tools for autonomy, needed for exploration (assess and maintain resilience of individuals, teams, and the integrated system). Outputs of this exercise: a representation of interfaces based on Human System Risk Board (HSRB) Risk Summary information and simple status based on the Human Research Roadmap; consolidated HSRB information applied to support communication; a point of departure for HRP Element planning; and the ability to track and communicate the status of collaborations.

  19. Climate change, land slide risks and sustainable development, risk analysis and decision support process tool

    NASA Astrophysics Data System (ADS)

    Andersson-sköld, Y. B.; Tremblay, M.

    2011-12-01

    Climate change is in most parts of Sweden expected to result in increased precipitation and increased sea water levels causing flooding, erosion, slope instability and related secondary consequences. Landslide risks are expected to increase with climate change in large parts of Sweden due to increased annual precipitation, more intense precipitation and increased flows combined with drier summers. In response to the potential climate related risks, and on the commission of the Ministry of Environment, the Swedish Geotechnical Institute (SGI) is at present performing a risk analysis project for the most prominent landslide risk area in Sweden: the Göta river valley. As part of this, a methodology for landslide ex-ante consequence analysis, today and in a future climate, has been developed and applied in the Göta river valley. Human life, settlements, industry, contaminated sites, and infrastructure of national importance are inventoried and assessed as important elements at risk. The goal of the consequence analysis is to produce a map of geographically distributed expected losses, which can be combined with a corresponding map displaying landslide probability to describe the risk (the combination of probability and consequence of a (negative) event). The risk analysis is GIS-aided, presenting and visualising the risk and using existing databases to quantify the consequences, represented by ex-ante estimated monetary losses. The results will be used on the national and regional levels, and as an indication of the risk on the local level, to assess the need for measures to mitigate the risk. The costs and environmental and social impacts of mitigating the risk are expected to be very high, but the costs and impacts of a severe landslide are expected to be even higher. Therefore, civil servants have pronounced a need for tools to assess both the vulnerability and a more holistic picture of the impacts of climate change adaptation measures. At SGI a tool for the inclusion of sustainability

  20. Towards secure virtual directories : a risk analysis framework.

    SciTech Connect

    Claycomb, William R.

    2010-07-01

    Directory services are used by almost every enterprise computing environment to provide data concerning users, computers, contacts, and other objects. Virtual directories are components that provide directory services in a highly customized manner. Unfortunately, though the use of virtual directory services are widespread, an analysis of risks posed by their unique position and architecture has not been completed. We present a detailed analysis of six attacks to virtual directory services, including steps for detection and prevention. We also describe various categories of attack risks, and discuss what is necessary to launch an attack on virtual directories. Finally, we present a framework to use in analyzing risks to individual enterprise computing virtual directory instances. We show how to apply this framework to an example implementation, and discuss the benefits of doing so.

  1. Empirical modal decomposition applied to cardiac signals analysis

    NASA Astrophysics Data System (ADS)

    Beya, O.; Jalil, B.; Fauvet, E.; Laligant, O.

    2010-01-01

    In this article, we present the method of empirical mode decomposition (EMD) applied to the analysis and denoising of electrocardiogram (ECG) and phonocardiogram (PCG) signals. The objective of this work is to detect cardiac anomalies of a patient automatically. As these anomalies are localized in time, the localization of all events should be preserved precisely. Methods based on the Fourier transform lose the localization property [13]; the wavelet transform (WT) makes it possible to overcome the localization problem, but interpretation remains difficult when characterizing the signal precisely. In this work we propose to apply EMD, which has very significant properties for pseudo-periodic signals. The second section describes the EMD algorithm. In the third part we present the results obtained on phonocardiogram (PCG) and electrocardiogram (ECG) test signals; the analysis and interpretation of these signals are given in the same section. Finally, we introduce an adaptation of the EMD algorithm which appears to be very efficient for denoising.
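    A minimal sketch of applying EMD to a synthetic pseudo-periodic signal follows, assuming the third-party PyEMD package (installed as EMD-signal). Denoising by dropping the first, noisiest intrinsic mode function (IMF) is a simplification of what the authors describe, and the signal is a stand-in, not real ECG data.

    ```python
    import numpy as np
    from PyEMD import EMD  # assumes the EMD-signal package is installed

    # Synthetic pseudo-periodic "cardiac-like" signal with additive noise.
    t = np.linspace(0, 4, 2000)
    signal = np.sin(2 * np.pi * 1.2 * t) + 0.4 * np.sin(2 * np.pi * 8 * t)
    noisy = signal + 0.2 * np.random.default_rng(0).standard_normal(t.size)

    # Decompose into intrinsic mode functions (IMFs).
    imfs = EMD().emd(noisy, t)

    # Crude denoising: discard the first IMF, which carries most of the
    # high-frequency noise, and reconstruct from the remaining modes.
    denoised = imfs[1:].sum(axis=0)
    print(f"{imfs.shape[0]} IMFs extracted; residual RMS error "
          f"{np.sqrt(np.mean((denoised - signal) ** 2)):.3f}")
    ```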

  2. Applying risk and resilience models to predicting the effects of media violence on development.

    PubMed

    Prot, Sara; Gentile, Douglas A

    2014-01-01

    Although the effects of media violence on children and adolescents have been studied for over 50 years, they remain controversial. Much of this controversy is driven by a misunderstanding of causality that seeks the cause of atrocities such as school shootings. Luckily, several recent developments in risk and resilience theories offer a way out of this controversy. Four risk and resilience models are described, including the cascade model, dose-response gradients, pathway models, and turning-point models. Each is described and applied to the existing media effects literature. Recommendations for future research are discussed with regard to each model. In addition, we examine current developments in theorizing that stressors have sensitizing versus steeling effects and recent interest in biological and gene by environment interactions. We also discuss several of the cultural aspects that have supported the polarization and misunderstanding of the literature, and argue that applying risk and resilience models to the theories and data offers a more balanced way to understand the subtle effects of media violence on aggression within a multicausal perspective. PMID:24851351

  3. Thermographic techniques applied to solar collector systems analysis

    SciTech Connect

    Eden, A.

    1980-02-01

    The use of thermography to analyze large solar collector array systems under dynamic operating conditions is discussed. The research at the Solar Energy Research Institute (SERI) in this area has focused on thermographic techniques and equipment to determine temperature distributions, flow patterns, and air blockages in solar collectors. The results of this extensive study, covering many sites and types of collectors, illustrate the capabilities of infrared (IR) analysis as a qualitative analysis tool and operation and maintenance procedure when applied to large arrays. Thermographic analysis of most collector systems qualitatively showed relative temperature distributions that indicated balanced flow patterns. In three significant cases, blocked or broken collector arrays, which previously had gone undetected, were discovered. Using this analysis, validation studies of large computer codes could examine collector arrays for flow patterns or blockages that could cause disagreement between actual and predicted performance. Initial operation and balancing of large systems could be accomplished without complicated sensor systems not needed for normal operations. Maintenance personnel could quickly check their systems without climbing onto the roof and without complicated sensor systems.

  4. Automated SEM Modal Analysis Applied to the Diogenites

    NASA Technical Reports Server (NTRS)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  5. The ABC’s of Suicide Risk Assessment: Applying a Tripartite Approach to Individual Evaluations

    PubMed Central

    Harris, Keith M.; Syu, Jia-Jia; Lello, Owen D.; Chew, Y. L. Eileen; Willcox, Christopher H.; Ho, Roger H. M.

    2015-01-01

    There is considerable need for accurate suicide risk assessment for clinical, screening, and research purposes. This study applied the tripartite affect-behavior-cognition theory, the suicidal barometer model, classical test theory, and item response theory (IRT), to develop a brief self-report measure of suicide risk that is theoretically grounded, reliable and valid. An initial survey (n = 359) employed an iterative process on an item pool, resulting in the six-item Suicidal Affect-Behavior-Cognition Scale (SABCS). Three additional studies tested the SABCS and a highly endorsed comparison measure. Studies included two online surveys (Ns = 1007 and 713) and one prospective clinical survey (n = 72; Time 2, n = 54). Factor analyses demonstrated SABCS construct validity through unidimensionality. Internal reliability was high (α = .86-.93, split-half = .90-.94). The scale was predictive of future suicidal behaviors and suicidality (r = .68, .73, respectively), showed convergent validity, and the SABCS-4 demonstrated clinically relevant sensitivity to change. IRT analyses revealed the SABCS captured more information than the comparison measure, and better defined participants at low, moderate, and high risk. The SABCS is the first suicide risk measure to demonstrate no differential item functioning by sex, age, or ethnicity. In all comparisons, the SABCS showed incremental improvements over a highly endorsed scale through stronger predictive ability, reliability, and other properties. The SABCS is in the public domain, with this publication, and is suitable for clinical evaluations, public screening, and research. PMID:26030590
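    Reliability figures of the kind quoted above can be computed mechanically. The sketch below calculates Cronbach's alpha and a Spearman-Brown corrected split-half coefficient for a six-item scale on synthetic correlated responses; because the data are random stand-ins, the values will not match the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic responses: 300 respondents x 6 items driven by one latent
    # factor so the items correlate (purely illustrative data).
    latent = rng.normal(size=(300, 1))
    items = latent + 0.6 * rng.normal(size=(300, 6))

    def cronbach_alpha(x):
        k = x.shape[1]
        item_vars = x.var(axis=0, ddof=1).sum()
        total_var = x.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    # Split-half: correlate odd- and even-item sums, Spearman-Brown corrected.
    half1, half2 = items[:, ::2].sum(axis=1), items[:, 1::2].sum(axis=1)
    r = np.corrcoef(half1, half2)[0, 1]
    split_half = 2 * r / (1 + r)

    print(f"alpha = {cronbach_alpha(items):.2f}, split-half = {split_half:.2f}")
    ```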

  6. Quantitative Microbial Risk Assessment Tutorial: Land-applied Microbial Loadings within a 12-Digit HUC

    EPA Science Inventory

    This tutorial reviews screens, icons, and basic functions of the SDMProjectBuilder (SDMPB). It demonstrates how one chooses a 12-digit HUC for analysis, performs an assessment of land-applied microbes by simulating microbial fate and transport using HSPF, and analyzes and visuali...

  7. Risk assessment of land-applied biosolids-borne triclocarban (TCC).

    PubMed

    Snyder, Elizabeth Hodges; O'Connor, George A

    2013-01-01

    Triclocarban (TCC) is monitored under the USEPA High Production Volume (HPV) chemical program and is predominantly used as the active ingredient in select antibacterial bar soaps and other personal care products. The compound commonly occurs at parts-per-million concentrations in processed wastewater treatment residuals (i.e. biosolids), which are frequently land-applied as fertilizers and soil conditioners. Human and ecological risk assessment parameters measured by the authors in previous studies were integrated with existing data to perform a two-tiered human health and ecological risk assessment of land-applied biosolids-borne TCC. The 14 exposure pathways identified in the Part 503 Biosolids Rule were expanded, and conservative screening-level hazard quotients (HQ values) were first calculated to estimate risk to humans and a variety of terrestrial and aquatic organisms (Tier 1). The majority of biosolids-borne TCC exposure pathways resulted in no screening-level HQ values indicative of significant risks to exposed organisms (including humans), even under worst-case land application scenarios. The two pathways for which the conservative screening-level HQ values exceeded one (i.e. Pathway 10: biosolids➔soil➔soil organism➔predator, and Pathway 16: biosolids➔soil➔surface water➔aquatic organism) were then reexamined using modified parameters and scenarios (Tier 2). Adjusted HQ values remained greater than one for Exposure Pathway 10, with the exception of the final adjusted HQ values under a one-time 5 Mg ha(-1) (agronomic) biosolids loading rate scenario for the American woodcock (Scolopax minor) and short-tailed shrew (Blarina brevicauda). Results were used to prioritize recommendations for future biosolids-borne TCC research, which include additional measurements of toxicological effects and TCC concentrations in environmental matrices at the field level. PMID:23183124
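    The screening-level logic is a simple ratio: a conservatively estimated dose over a toxicity reference value, flagged for refinement when the hazard quotient exceeds 1. The sketch below applies it to the soil organism -> predator pathway with invented parameter values, not the study's measurements.

    ```python
    # Screening-level hazard quotient (HQ) sketch for the
    # biosolids -> soil -> soil organism -> predator pathway.
    # All values are illustrative assumptions.

    soil_conc_mg_kg = 0.5    # TCC in amended soil after application
    baf_worm = 10.0          # earthworm bioaccumulation factor (assumed)
    food_intake_kg_d = 0.02  # predator food ingestion rate
    body_weight_kg = 0.35    # e.g., a small insectivore (illustrative)
    trv_mg_kg_d = 2.0        # toxicity reference value (assumed)

    dose = soil_conc_mg_kg * baf_worm * food_intake_kg_d / body_weight_kg
    hq = dose / trv_mg_kg_d
    print(f"dose = {dose:.3f} mg/kg-d, HQ = {hq:.2f} "
          f"({'refine in Tier 2' if hq > 1 else 'screens out'})")
    ```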

  8. Applying cluster analysis to physics education research data

    NASA Astrophysics Data System (ADS)

    Springuel, R. Padraic

    One major thrust of Physics Education Research (PER) is the identification of student ideas about specific physics concepts, both correct ideas and those that differ from the expert consensus. Typically the research process of eliciting the spectrum of student ideas involves the administration of specially designed questions to students. One major analysis task in PER is the sorting of these student responses into thematically coherent groups. This process is one which has previously been done by eye in PER. This thesis explores the possibility of using cluster analysis to perform the task in a more rigorous and less time-intensive fashion while making fewer assumptions about what the students are doing. Since this technique has not previously been used in PER, a summary of the various kinds of cluster analysis is included, as well as a discussion of which might be appropriate for the task of sorting student responses into groups. Two example data sets (one based on the Force and Motion Conceptual Evaluation (DICE), the other looking at acceleration in two dimensions (A2D)) are examined in depth to demonstrate how cluster analysis can be applied to PER data and the various considerations which must be taken into account when doing so. In both cases, the techniques described in this thesis found 5 groups which contained about 90% of the students in the data set. The results of this application are compared to previous research on the topics covered by the two examples to demonstrate that cluster analysis can effectively uncover the same patterns in student responses that have already been identified.
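    A minimal sketch of the sorting task: encode each student's responses as a vector, build a distance matrix, and cut a hierarchical clustering tree into groups. The sketch assumes scipy and uses random stand-in data, not the thesis data sets; the choices of metric and linkage are illustrative.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(1)

    # Stand-in data: 60 students x 10 multiple-choice items, answers coded 0-4.
    responses = rng.integers(0, 5, size=(60, 10))

    # Distance between students = fraction of items answered differently
    # (Hamming distance), then complete-linkage hierarchical clustering.
    d = pdist(responses, metric="hamming")
    tree = linkage(d, method="complete")

    # Cut the tree into a fixed number of groups (5, echoing the thesis).
    groups = fcluster(tree, t=5, criterion="maxclust")
    for g in np.unique(groups):
        print(f"group {g}: {np.sum(groups == g)} students")
    ```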

  9. Risk analysis for critical asset protection.

    PubMed

    McGill, William L; Ayyub, Bilal M; Kaminskiy, Mark

    2007-10-01

    This article proposes a quantitative risk assessment and management framework that supports strategic asset-level resource allocation decision making for critical infrastructure and key resource protection. The proposed framework consists of five phases: scenario identification, consequence and criticality assessment, security vulnerability assessment, threat likelihood assessment, and benefit-cost analysis. Key innovations in this methodology include its initial focus on fundamental asset characteristics to generate an exhaustive set of plausible threat scenarios based on a target susceptibility matrix (which we refer to as asset-driven analysis) and an approach to threat likelihood assessment that captures adversary tendencies to shift their preferences in response to security investments based on the expected utilities of alternative attack profiles assessed from the adversary perspective. A notional example is provided to demonstrate an application of the proposed framework. Extensions of this model to support strategic portfolio-level analysis and tactical risk analysis are suggested. PMID:18076495
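    The framework's quantitative core can be caricatured as risk = threat likelihood x vulnerability x consequence, with the adversary assumed to pick the attack profile of highest expected utility from its own perspective. The sketch below illustrates that choice; profiles, probabilities, and payoffs are invented numbers, not the authors' model.

    ```python
    # Toy adversary-aware risk scoring: the attacker picks the profile with
    # the highest expected utility; the defender's risk is evaluated against
    # that choice. All numbers are illustrative assumptions.

    profiles = {
        # name: (P(success | attempt), consequence to defender, cost to attacker)
        "vehicle bomb": (0.30, 9.0, 4.0),
        "cyber intrusion": (0.50, 6.0, 2.0),
        "insider theft": (0.20, 8.0, 5.0),
    }

    def attacker_utility(p_success, consequence, cost):
        return p_success * consequence - cost

    chosen = max(profiles, key=lambda k: attacker_utility(*profiles[k]))
    p, c, _ = profiles[chosen]
    p_attempt = 0.05  # annual probability an attack is attempted (assumed)
    print(f"adversary prefers: {chosen}; defender risk = {p_attempt * p * c:.3f}")
    ```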

  10. Soft tissue cephalometric analysis applied to regional Indian population

    PubMed Central

    Upadhyay, Jay S.; Maheshwari, Sandhya; Verma, Sanjeev K.; Zahid, Syed Naved

    2013-01-01

    Introduction: Importance of soft tissue consideration in establishing treatment goals for orthodontics and orthognathic surgery has been recognized and various cephalometric analysis incorporating soft tissue parameters have evolved. The great variance in soft tissue drape of the human face and perception of esthetics exists and normative data based on one population group cannot be applied to all. The study was conducted to compare the standard soft tissue cephalometric analysis (STCA) norms with norms derived for population of western Uttar Pradesh region of India. Materials and Methods: The sample consisted of lateral cephalograms taken in natural head position of 33 normal subjects (16 males, 17 females). The cephalograms were analyzed with soft tissue cephalometric analysis for orthodontic diagnosis and treatment planning, and the Student's t test was used to compare the difference in means between study population and standard STCA norms. Results: Compared with established STCA norms, females in our study had steeper maxillary occlusal plane, more proclined mandibular incisors, and less protrusive lips. Both males and females showed an overall decrease in facial lengths, less prominent midface and mandibular structures and more convex profile compared with established norms for the White population. Conclusions: Statistically significant differences were found in certain key parameters of STCA for western Uttar Pradesh population when compared with established norms. PMID:24665169

  11. Seismic analysis applied to the delimiting of a gas reservoir

    SciTech Connect

    Ronquillo, G.; Navarro, M.; Lozada, M.; Tafolla, C.

    1996-08-01

    We present the results of correlating seismic models with petrophysical parameters and well logs to mark the limits of a gas reservoir in sand lenses. To fulfill the objectives of the study, we used a data processing sequence that included wavelet manipulation, complex trace attributes and pseudovelocities inversion, along with several quality control schemes to insure proper amplitude preservation. Based on the analysis and interpretation of the seismic sections, several areas of interest were selected to apply additional signal treatment as preconditioning for petrophysical inversion. Signal classification was performed to control the amplitudes along the horizons of interest, and to be able to find an indirect interpretation of lithologies. Additionally, seismic modeling was done to support the results obtained and to help integrate the interpretation. The study proved to be a good auxiliary tool in the location of the probable extension of the gas reservoir in sand lenses.

  12. Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description

    PubMed Central

    da Silva, B. R.; Moreira Neto, J. J. S.; da Silva, F. I.; de Aguiar, A. S. W.

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated. PMID:21991463

  13. Applied Behavior Analysis is a Science and, Therefore, Progressive.

    PubMed

    Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane

    2016-02-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD. PMID:26373767

  14. Image analysis technique applied to lock-exchange gravity currents

    NASA Astrophysics Data System (ADS)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image relating the amount of dye uniformly distributed in the tank and the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.

  15. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W. . Dept. of Computer Sciences); Noordewier, M.O. . Dept. of Computer Science)

    1992-01-01

    We are primarily developing a machine learning (ML) system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information, our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, our KBANN algorithm maps inference rules about a given recognition task into a neural network. Neural network training techniques then use the training examples to refine these inference rules. We call these rules a domain theory, following the convention in the machine learning community. We have been applying this approach to several problems in DNA sequence analysis. In addition, we have been extending the capabilities of our learning system along several dimensions. We have also been investigating parallel algorithms that perform sequence alignments in the presence of frameshift errors.

  16. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a 'domain theory'), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
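    The rule-to-network mapping at the heart of KBANN can be sketched with a single conjunctive rule compiled into a sigmoid unit whose weights and bias implement an AND before refinement. This is a schematic reconstruction under stated assumptions, not the authors' code; the rule and weight value are illustrative.

    ```python
    import numpy as np

    # KBANN-style compilation of one conjunctive rule into a sigmoid unit:
    #   promoter :- contact, conformation
    # Each antecedent gets weight w; the bias is set between (n-1) and n true
    # inputs so the unit fires only when all antecedents hold.

    w = 4.0
    antecedents = ["contact", "conformation"]
    weights = np.full(len(antecedents), w)
    bias = -(len(antecedents) - 0.5) * w

    def unit(inputs):
        return 1.0 / (1.0 + np.exp(-(weights @ inputs + bias)))

    print(f"both true   -> {unit(np.array([1.0, 1.0])):.2f}")  # ~ fires
    print(f"one missing -> {unit(np.array([1.0, 0.0])):.2f}")  # ~ silent
    # Backpropagation on labelled sequences would then refine w and bias.
    ```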

  17. Causal modelling applied to the risk assessment of a wastewater discharge.

    PubMed

    Paul, Warren L; Rokahr, Pat A; Webb, Jeff M; Rees, Gavin N; Clune, Tim S

    2016-03-01

    Bayesian networks (BNs), or causal Bayesian networks, have become quite popular in ecological risk assessment and natural resource management because of their utility as a communication and decision-support tool. Since their development in the field of artificial intelligence in the 1980s, however, Bayesian networks have evolved and merged with structural equation modelling (SEM). Unlike BNs, which are constrained to encode causal knowledge in conditional probability tables, SEMs encode this knowledge in structural equations, which is thought to be a more natural language for expressing causal information. This merger has clarified the causal content of SEMs and generalised the method such that it can now be performed using standard statistical techniques. As it was with BNs, the utility of this new generation of SEM in ecological risk assessment will need to be demonstrated with examples to foster an understanding and acceptance of the method. Here, we applied SEM to the risk assessment of a wastewater discharge to a stream, with a particular focus on the process of translating a causal diagram (conceptual model) into a statistical model which might then be used in the decision-making and evaluation stages of the risk assessment. The process of building and testing a spatial causal model is demonstrated using data from a spatial sampling design, and the implications of the resulting model are discussed in terms of the risk assessment. It is argued that a spatiotemporal causal model would have greater external validity than the spatial model, enabling broader generalisations to be made regarding the impact of a discharge, and greater value as a tool for evaluating the effects of potential treatment plant upgrades. Suggestions are made on how the causal model could be augmented to include temporal as well as spatial information, including suggestions for appropriate statistical models and analyses. PMID:26832914

  18. Contribution of European research to risk analysis.

    PubMed

    Boenke, A

    2001-12-01

    The European Commission's Quality of Life Research Programme, Key Action 1-Health, Food & Nutrition, is mission-oriented and aims, amongst other things, at providing a healthy, safe and high-quality food supply leading to reinforced consumer confidence in the safety of European food. Its objectives also include enhancing the competitiveness of the European food supply. Key Action 1 is currently supporting a number of different types of European collaborative projects in the area of risk analysis. The objectives of these projects range from the development and validation of prevention strategies, including the reduction of consumer risks; development and validation of new modelling approaches; harmonization of risk assessment principles, methodologies and terminology; standardization of methods and systems used for the safety evaluation of transgenic food; provision of tools for the evaluation of human viral contamination of shellfish and quality control; new methodologies for assessing the potential unintended effects of genetically modified (GM) foods; development of a risk assessment model for Cryptosporidium parvum related to the food and water industries; to the development of a communication platform for genetically modified organism (GMO) producers, retailers, regulatory authorities and consumer groups to improve safety assessment procedures, risk management strategies and risk communication; development and validation of new methods for safety testing of transgenic food; evaluation of the safety and efficacy of iron supplementation in pregnant women; and evaluation of the potential cancer-preventing activity of pro- and pre-biotic ('synbiotic') combinations in human volunteers. An overview of these projects is presented here. PMID:11761126

  19. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  20. Automated speech analysis applied to laryngeal disease categorization.

    PubMed

    Gelzinis, A; Verikas, A; Bacauskiene, M

    2008-07-01

    The long-term goal of the work is a decision support system for the diagnostics of laryngeal diseases. Colour images of vocal folds, a voice signal, and questionnaire data are the information sources to be used in the analysis. This paper is concerned with the automated analysis of a voice signal applied to screening for laryngeal diseases. The effectiveness of 11 different feature sets in classifying voice recordings of the sustained phonation of the vowel sound /a/ into a healthy and two pathological classes, diffuse and nodular, is investigated. A k-NN classifier, an SVM, and a committee built using various aggregation options are used for the classification. The study was made using a mixed-gender database containing 312 voice recordings. A correct classification rate of 84.6% was achieved when using an SVM committee consisting of four members. The pitch and amplitude perturbation measures, cepstral energy features, autocorrelation features, as well as linear prediction cosine transform coefficients were amongst the feature sets providing the best performance. In the case of two-class classification, using recordings from 79 subjects representing the pathological and 69 the healthy class, a correct classification rate of 95.5% was obtained from a five-member committee. Again the pitch and amplitude perturbation measures provided the best performance. PMID:18346812
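
    The sketch below shows one way such an SVM committee with majority voting can be assembled in scikit-learn; the synthetic "perturbation-style" features, committee size, and hyperparameters are illustrative stand-ins, not the authors' feature sets or data.

```python
# Sketch: an SVM committee with hard (majority) voting over three classes
# (healthy, diffuse, nodular), on simulated feature vectors.
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 8))            # 300 recordings x 8 features
y = rng.integers(0, 3, size=300)         # 0 healthy, 1 diffuse, 2 nodular
X[y == 1] += 0.8                         # crude class separation for the demo
X[y == 2] -= 0.8

# Four committee members differing in regularization strength.
members = [
    (f"svm{i}", make_pipeline(StandardScaler(), SVC(C=c, gamma="scale")))
    for i, c in enumerate([0.5, 1.0, 2.0, 4.0])
]
committee = VotingClassifier(members, voting="hard")
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
committee.fit(X_tr, y_tr)
print("correct classification rate:", committee.score(X_te, y_te))
```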

  1. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  2. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  3. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  4. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  5. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  6. Risk assessment and its application to flight safety analysis

    SciTech Connect

    Keese, D.L.; Barton, W.R.

    1989-12-01

    Potentially hazardous test activities have historically been a part of Sandia National Labs' mission to design, develop, and test new weapons systems. These test activities include high-speed air drops for parachute development, sled tests for component- and system-level studies, multiple-stage rocket experiments, and artillery firings of various projectiles. Due to the nature of Sandia's test programs, the risk associated with these activities can never be totally eliminated. However, a consistent set of policies should be available to provide guidance on the level of risk that is acceptable in these areas. This report presents a general set of guidelines for addressing safety issues related to rocket flight operations at Sandia National Laboratories. Even though the majority of this report deals primarily with rocket flight safety, these same principles could be applied to other hazardous test activities. The basic concepts of risk analysis have a wide range of applications in many of Sandia's current operations. 14 refs., 1 tab.

  7. Multitaper Spectral Analysis and Wavelet Denoising Applied to Helioseismic Data

    NASA Technical Reports Server (NTRS)

    Komm, R. W.; Gu, Y.; Hill, F.; Stark, P. B.; Fodor, I. K.

    1999-01-01

    Estimates of solar normal mode frequencies from helioseismic observations can be improved by using Multitaper Spectral Analysis (MTSA) to estimate spectra from the time series, then using wavelet denoising of the log spectra. MTSA leads to a power spectrum estimate with reduced variance and better leakage properties than the conventional periodogram. Under the assumption of stationarity and mild regularity conditions, the log multitaper spectrum has a statistical distribution that is approximately Gaussian, so wavelet denoising is asymptotically an optimal method to reduce the noise in the estimated spectra. We find that a single m-nu spectrum benefits greatly from MTSA followed by wavelet denoising, and that wavelet denoising by itself can be used to improve m-averaged spectra. We compare estimates using two different 5-taper estimates (Slepian and sine tapers) and the periodogram estimate, for GONG time series at selected angular degrees l. We compare those three spectra with and without wavelet denoising, both visually and in terms of the mode parameters estimated from the pre-processed spectra using the GONG peak-fitting algorithm. The two multitaper estimates give equivalent results. The number of modes fitted well by the GONG algorithm is 20% to 60% larger (depending on l and the temporal frequency) when applied to the multitaper estimates than when applied to the periodogram. The estimated mode parameters (frequency, amplitude and width) are comparable for the three power spectrum estimates, except for modes with very small mode widths (a few frequency bins), where the multitaper spectra broadened the modes compared with the periodogram. We tested the influence of the number of tapers used and found that narrow modes at low n values are broadened to the extent that they can no longer be fit if the number of tapers is too large. For helioseismic time series of this length and temporal resolution, the optimal number of tapers is less than 10.

  8. First Attempt of Applying Factor Analysis in Moving Base Gravimetry

    NASA Astrophysics Data System (ADS)

    Li, X.; Roman, D. R.

    2014-12-01

    For gravimetric observation systems on mobile platforms (land/sea/airborne), the low Signal-to-Noise Ratio (SNR) is the main barrier to achieving an accurate, high-resolution gravity signal. Normally, low-pass filters (Childers et al 1999, Forsberg et al 2000, Kwon and Jekeli 2000, Hwang et al 2006) are applied to smooth or remove the high-frequency "noise" - even though some of the high-frequency component is not necessarily noise. This is especially true for aerogravity surveys such as those from the Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project. These gravity survey flights have a spatial resolution of 10 km between tracks but higher resolution along track. The along-track resolution is improved due to the lower flight height (6.1 km), equipment sensitivity, and improved modeling of potential errors. Additionally, these surveys suffer from a loss of signal power due to the increased flight elevation. Hence, application of a low-pass filter removes possible signal sensed in the along-track direction that might otherwise prove useful for various geophysical and geodetic applications. Some cutting-edge developments in wavelets and artificial neural networks have been successfully applied to obtain improved results (Li 2008 and 2011, Liang and Liu 2013). However, a clearer and more fundamental understanding of the error characteristics will further improve the quality of the gravity estimates from these gravimetric systems. Here, instead of using any predefined basis function or any a priori model, the idea of factor analysis is employed for the first time to try to extract the underlying factors of the noise in these systems. Real data sets collected by both land vehicle and aircraft will be processed as examples.
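
    A minimal sketch of the underlying idea follows: factor analysis recovers common hidden factors (e.g., a slow gravity signal and a shared platform vibration) from multichannel records, rather than filtering by frequency. The data are simulated and the channel/factor structure is hypothetical; this is not the GRAV-D processing chain.

```python
# Sketch: extract common latent factors from multichannel records with
# factor analysis, instead of applying a low-pass filter.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
n_samples, n_channels = 2000, 6
t = np.linspace(0, 100, n_samples)
signal = np.sin(2 * np.pi * 0.02 * t)        # slow "gravity-like" factor
vibration = np.sin(2 * np.pi * 3.0 * t)      # shared platform vibration
loadings = rng.normal(size=(2, n_channels))
X = np.outer(signal, loadings[0]) + np.outer(vibration, loadings[1])
X += rng.normal(scale=0.3, size=X.shape)     # channel-specific noise

fa = FactorAnalysis(n_components=2, random_state=0)
factors = fa.fit_transform(X)                # estimated hidden factor series
print("estimated factor series shape:", factors.shape)
print("per-channel noise variances:", np.round(fa.noise_variance_, 2))
```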

  9. Low-thrust mission risk analysis.

    NASA Technical Reports Server (NTRS)

    Yen, C. L.; Smith, D. B.

    1973-01-01

    A computerized multi-stage failure process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust subsystem burn operation, the system failure processes, and the retargeting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to Comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that system component failure rates are the limiting factor in attaining high mission reliability. But it is also shown that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.
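
    The fragment below sketches the shape of such a multi-stage failure-process simulation: thrusters fail at a constant rate during each burn and the mission succeeds if enough survive every stage. All rates, counts, and burn durations are hypothetical, and retargeting is not modeled.

```python
# Sketch: Monte Carlo simulation of thruster failures across mission burns.
import numpy as np

rng = np.random.default_rng(3)
N_TRIALS = 100_000
N_THRUSTERS, N_REQUIRED = 8, 6          # installed vs. needed per burn
FAILURE_RATE = 1e-4                     # failures per thruster-hour
BURNS = [2000.0, 1500.0, 1000.0]        # burn durations in hours

successes = 0
for _ in range(N_TRIALS):
    alive, ok = N_THRUSTERS, True
    for duration in BURNS:
        p_fail = 1.0 - np.exp(-FAILURE_RATE * duration)   # per-thruster
        alive -= rng.binomial(alive, p_fail)              # failures this burn
        if alive < N_REQUIRED:
            ok = False
            break
    successes += ok
print("estimated mission reliability:", successes / N_TRIALS)
```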

  10. Risk analysis for renewable energy projects due to constraints arising

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.

    2016-02-01

    Starting from the target of the European Union (EU) to use renewable energy in the area that aims a binding target of 20% renewable energy in final energy consumption by 2020, this article illustrates the identification of risks for implementation of wind energy projects in Romania, which could lead to complex technical implications, social and administrative. In specific projects analyzed in this paper were identified critical bottlenecks in the future wind power supply chain and reasonable time periods that may arise. Renewable energy technologies have to face a number of constraints that delayed scaling-up their production process, their transport process, the equipment reliability, etc. so implementing these types of projects requiring complex specialized team, the coordination of which also involve specific risks. The research team applied an analytical risk approach to identify major risks encountered within a wind farm project developed in Romania in isolated regions with different particularities, configured for different geographical areas (hill and mountain locations in Romania). Identification of major risks was based on the conceptual model set up for the entire project implementation process. Throughout this conceptual model there were identified specific constraints of such process. Integration risks were examined by an empirical study based on the method HAZOP (Hazard and Operability). The discussion describes the analysis of our results implementation context of renewable energy projects in Romania and creates a framework for assessing energy supply to any entity from renewable sources.

  11. A comprehensive risk analysis of coastal zones in China

    NASA Astrophysics Data System (ADS)

    Wang, Guanghui; Liu, Yijun; Wang, Hongbing; Wang, Xueying

    2014-03-01

    Although coastal zones occupy an important position in world development, they face high risks and vulnerability to natural disasters because of their special locations and high population density. In order to estimate their capability for crisis response, various models have been established. However, those studies mainly focused on natural factors or conditions, which could not reflect the social vulnerability and regional disparities of coastal zones. Drawing lessons from the experiences of the United Nations Environment Programme (UNEP), this paper presents a comprehensive assessment strategy based on the mechanism of the Risk Matrix Approach (RMA), which includes two aspects that are further composed of five second-class indicators. The first aspect, the probability phase, consists of indicators of economic conditions, social development, and living standards, while the second one, the severity phase, is comprised of geographic exposure and natural disasters. After weighting all of the above indicators by applying the Analytic Hierarchy Process (AHP) and the Delphi Method, the paper uses the comprehensive assessment strategy to analyze the risk indices of 50 coastal cities in China. The analytical results are presented in ESRI ArcGIS 10.1, which generates six different risk maps covering the aspects of economy, society, life, environment, disasters, and an overall assessment of the five areas. Furthermore, the study also investigates the spatial pattern of these risk maps, with detailed discussion and analysis of different risks in coastal cities.
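
    The AHP weighting step works by taking the principal eigenvector of a pairwise comparison matrix; the sketch below shows that computation with a consistency check. The 5x5 comparison matrix is hypothetical, standing in for expert judgments over the paper's five second-class indicators.

```python
# Sketch: AHP priority weights as the principal eigenvector of a Saaty-scale
# pairwise comparison matrix, plus the consistency ratio.
import numpy as np

# Hypothetical comparisons: economy, society, life, exposure, disasters.
A = np.array([
    [1,   2,   3,   2,   1/2],
    [1/2, 1,   2,   1,   1/3],
    [1/3, 1/2, 1,   1/2, 1/4],
    [1/2, 1,   2,   1,   1/3],
    [2,   3,   4,   3,   1  ],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                               # normalized priority weights
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)       # consistency index
cr = ci / 1.12                             # Saaty's random index RI=1.12, n=5
print("weights:", np.round(w, 3), " CR:", round(cr, 3))  # CR < 0.1 is acceptable
```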

  12. Correlation Network Analysis Applied to Complex Biofilm Communities

    PubMed Central

    Duran-Pinedo, Ana E.; Paster, Bruce; Teles, Ricardo; Frias-Lopez, Jorge

    2011-01-01

    The complexity of the human microbiome makes it difficult to reveal organizational principles of the community and even more challenging to generate testable hypotheses. It has been suggested that in the gut microbiome species such as Bacteroides thetaiotaomicron are keystone in maintaining the stability and functional adaptability of the microbial community. In this study, we investigate the interspecies associations in a complex microbial biofilm by applying systems biology principles. Using correlation network analysis we identified bacterial modules that represent important microbial associations within the oral community. We used dental plaque as a model community because of its high diversity and the well-known species-species interactions that are common in the oral biofilm. We analyzed samples from healthy individuals as well as from patients with periodontitis, a polymicrobial disease. Using results obtained by checkerboard hybridization on cultivable bacteria, we identified modules that correlated well with microbial complexes previously described. Furthermore, we extended our analysis using the Human Oral Microbe Identification Microarray (HOMIM), which includes a large number of bacterial species, among them uncultivated organisms present in the mouth. Two distinct microbial communities appeared in healthy individuals while there was one major type in disease. Bacterial modules in all communities did not overlap, indicating that bacteria were able to effectively re-associate with new partners depending on the environmental conditions. We then identified hubs that could act as keystone species in the bacterial modules. Based on those results, we then cultured a not-yet-cultivated microorganism, Tannerella sp. OT286 (clone BU063). After two rounds of enrichment by a selected helper (Prevotella oris OT311) we obtained colonies of Tannerella sp. OT286 growing on blood agar plates. This system-level approach would open the possibility of manipulating microbial
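
    A minimal sketch of the correlation network workflow follows: correlate species abundance profiles across samples, keep strong correlations as edges, then read off modules and hubs. The abundance data, edge threshold, and module definition (connected components) are illustrative simplifications, not the HOMIM analysis.

```python
# Sketch: build a species co-occurrence network from abundance correlations
# and identify modules and candidate hub (keystone) species.
import numpy as np
import networkx as nx

rng = np.random.default_rng(4)
n_samples, n_species = 40, 12
X = rng.normal(size=(n_samples, n_species))
X[:, 1] = X[:, 0] + rng.normal(scale=0.3, size=n_samples)  # co-occurring trio
X[:, 2] = X[:, 0] + rng.normal(scale=0.3, size=n_samples)

R = np.corrcoef(X.T)                      # species-by-species correlations
G = nx.Graph()
G.add_nodes_from(range(n_species))
for i in range(n_species):
    for j in range(i + 1, n_species):
        if abs(R[i, j]) > 0.6:            # illustrative edge threshold
            G.add_edge(i, j, weight=R[i, j])

modules = [m for m in nx.connected_components(G) if len(m) > 1]
hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:3]
print("modules:", modules)
print("candidate hub species (node, degree):", hubs)
```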

  13. Synthesis of Enterprise and Value-Based Methods for Multiattribute Risk Analysis

    SciTech Connect

    C. Robert Kenley; John W. Collins; John M. Beck; Harold J. Heydt; Chad B. Garcia

    2001-10-01

    This paper describes a method for performing multiattribute decision analysis to prioritize approaches to handling risks during the development and operation of complex socio-technical systems. The method combines risk categorization based on enterprise views, risk prioritization of the categories based on the Analytic Hierarchy Process (AHP), and more standard probability-consequence rating schemes. We also apply value-based testing methods used in software development to prioritize risk-handling approaches. We describe a tool that synthesizes the methods and performs a multiattribute analysis of the technical and programmatic risks on the Next Generation Nuclear Plant (NGNP) enterprise.

  14. Integrated Reliability and Risk Analysis System (IRRAS)

    SciTech Connect

    Russell, K. D.; McKay, M. K.; Sattison, M. B.; Skinner, N. L.; Wood, S. T.; Rasmuson, D. M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance.
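
    The core fault-tree computation such tools perform is cut set generation; the sketch below shows a MOCUS-style expansion for a toy AND/OR tree followed by minimization. The tree and event names are illustrative, not an IRRAS model or its algorithm.

```python
# Sketch: generate and minimize fault tree cut sets by recursive expansion
# of OR gates (alternatives) and AND gates (joint events).
def cut_sets(node, tree):
    """Return cut sets (frozensets of basic events) for a gate or event."""
    if node not in tree:                       # basic event
        return [frozenset([node])]
    kind, children = tree[node]
    child_sets = [cut_sets(c, tree) for c in children]
    if kind == "OR":                           # union of children's cut sets
        return [cs for group in child_sets for cs in group]
    result = [frozenset()]                     # AND: cross-product join
    for group in child_sets:
        result = [a | b for a in result for b in group]
    return result

def minimize(sets):
    """Keep only minimal cut sets (drop proper supersets)."""
    return [s for s in sets if not any(o < s for o in sets)]

# TOP fails if pump power fails, or both the valve and its backup fail.
tree = {
    "TOP":    ("OR",  ["POWER", "VALVES"]),
    "VALVES": ("AND", ["VALVE_A", "VALVE_B"]),
}
mcs = minimize(cut_sets("TOP", tree))
print(sorted(map(sorted, mcs)))   # [['POWER'], ['VALVE_A', 'VALVE_B']]
```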

  15. Non-Harmonic Analysis Applied to Optical Coherence Tomography Imaging

    NASA Astrophysics Data System (ADS)

    Cao, Xu; Uchida, Tetsuya; Hirobayashi, Shigeki; Chong, Changho; Morosawa, Atsushi; Totsuka, Koki; Suzuki, Takuya

    2012-02-01

    A new processing technique called non-harmonic analysis (NHA) is proposed for optical coherence tomography (OCT) imaging. Conventional Fourier-domain OCT employs the discrete Fourier transform (DFT), which depends on the window function and length. The axial resolution of the OCT image, calculated by using the DFT, is inversely proportional to the full width at half maximum (FWHM) of the wavelength range. The FWHM of the wavelength range is limited by the sweeping range of the source in swept-source OCT and by the number of CCD pixels in spectral-domain OCT. However, the NHA process does not have such constraints; NHA can resolve high frequencies irrespective of the window function and the frame length of the sampled data. In this study, the NHA process is described and applied to OCT imaging, and it is compared with OCT images based on the DFT. To demonstrate the benefits of using NHA for OCT, we perform OCT imaging with NHA of an onion skin. The results reveal that NHA can achieve an image resolution equivalent to that of a 100-nm sweep range using a significantly reduced wavelength range. They also reveal the potential of using this technique to achieve high-resolution imaging without using a broadband source. However, the long calculation times required for NHA must be addressed if it is to be used in clinical applications.
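
    The sketch below illustrates only the core idea behind such methods: estimating a sinusoid's frequency by least squares, without restricting it to the DFT bin grid. It is a generic illustration under that assumption, not the authors' NHA algorithm.

```python
# Sketch: refine a frequency estimate off the DFT grid by least squares.
import numpy as np

def ls_residual(f, t, x):
    """Residual power after a least-squares fit of a sinusoid at frequency f."""
    B = np.column_stack([np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t)])
    coef, *_ = np.linalg.lstsq(B, x, rcond=None)
    return np.sum((x - B @ coef) ** 2)

rng = np.random.default_rng(5)
n = 64                                   # short record -> coarse DFT grid
t = np.arange(n)                         # unit sampling interval
f_true = 0.1234                          # deliberately off the 1/64 grid
x = np.sin(2 * np.pi * f_true * t + 0.7) + 0.05 * rng.normal(size=n)

# Coarse DFT-based guess, then a fine search between neighboring bins.
k = np.argmax(np.abs(np.fft.rfft(x))[1:]) + 1
f0 = k / n
fine = np.linspace(f0 - 1.0 / n, f0 + 1.0 / n, 2001)
f_hat = fine[np.argmin([ls_residual(f, t, x) for f in fine])]
print(f"DFT bin: {f0:.4f}  refined: {f_hat:.4f}  true: {f_true}")
```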

  16. Applying DNA computation to intractable problems in social network analysis.

    PubMed

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields such as Web 2.0 for Web applications and product development in industries, etc. However, some problems in SNA, such as finding a clique, N-clique, N-clan, N-club or K-plex, are NP-complete and are not easily solved via traditional computer architectures. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. Their accuracy and feasible time complexities, discussed in the paper, demonstrate that DNA computing can be used to facilitate the development of SNA. PMID:20566337

  17. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    PubMed

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads. PMID:25738742

  18. To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits

    PubMed Central

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads. PMID:25738742

  19. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  20. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  1. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  2. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  3. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  4. Fire Risk Analysis for Armenian NPP Confinement

    SciTech Connect

    Poghosyan, Shahen; Malkhasyan, Albert; Bznuni, Surik; Amirjanyan, Armen

    2006-07-01

    A major fire that occurred at the Armenian NPP (ANPP) in October 1982 showed that fire-induced initiating events (IE) can make a dominant contribution to the overall risk of core damage. A Probabilistic Safety Assessment study for fire-induced initiating events at ANPP was initiated in 2002. Analysis was performed for compartments in which fires could result in failure of components necessary for reactor cold shutdown. The analysis shows that the main fire risk at ANPP arises from fires in cable tunnels 61-64, whereas fires in confinement compartments do not contribute significantly to the overall risk of core damage. The exception is a fire in the so-called 'confinement valves compartment' (room no. A-013/2), contributing more than 7.5% of CDF, in which a fire could result in a loss-of-coolant accident with unavailability of the primary makeup system, which directly leads to core damage. A detailed analysis of this problem, which is common to typical WWER-440/230 reactors with non-hermetic MCPs, and recommendations for its solution are presented in this paper. (authors)

  5. Landsafe: Landing Site Risk Analysis Software Framework

    NASA Astrophysics Data System (ADS)

    Schmidt, R.; Bostelmann, J.; Cornet, Y.; Heipke, C.; Philippe, C.; Poncelet, N.; de Rosa, D.; Vandeloise, Y.

    2012-08-01

    The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high-resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of the different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon, a high-resolution DTM is showcased.

  6. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    NASA Astrophysics Data System (ADS)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, defined by quantitative risk assessments, and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and Microbial Responses Viewer), or introduced into user-friendly softwares
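
    The relation between these metrics is commonly written, in ICMSF notation, as H0 - sum(R) + sum(I) <= FSO, where H0 is the initial hazard level, R the log reductions from control steps, and I the log increases from recontamination and growth, all in log10 cfu/g. The worked example below uses hypothetical numbers.

```python
# Worked example of the food-safety-objective inequality
# H0 - sum(R) + sum(I) <= FSO (all quantities in log10 cfu/g).
H0 = 3.0                    # initial hazard level on the raw material
reductions = [5.0]          # e.g., one pasteurization step
increases = [0.5, 1.0]      # e.g., recontamination + growth during storage
FSO = 0.0                   # maximum level at the time of consumption

level_at_consumption = H0 - sum(reductions) + sum(increases)
print("level at consumption:", level_at_consumption, "log10 cfu/g")
print("meets FSO:", level_at_consumption <= FSO)
```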

  7. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    SciTech Connect

    Skandamis, Panagiotis N. Andritsos, Nikolaos Psomas, Antonios Paramythiotis, Spyridon

    2015-01-22

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, defined by quantitative risk assessments, and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and Microbial Responses Viewer), or introduced into user

  8. Analysis of exogenous components of mortality risks.

    PubMed

    Blinkin, V L

    1998-04-01

    A new technique for deriving exogenous components of mortality risks from national vital statistics has been developed. Each observed death rate Dij (where i corresponds to calendar time (year or interval of years) and j denotes the number of the corresponding age group) was represented as Dij = Aj + Bi·Cj, and the unknown quantities Aj, Bi, and Cj were estimated by a special procedure using the least-squares principle. The coefficients of variation do not exceed 10%. It is shown that the term Aj can be interpreted as the endogenous component and the second term Bi·Cj as the exogenous component of the death rate. The aggregate of endogenous components Aj can be described by a regression function corresponding to the Gompertz-Makeham law, A(tau) = gamma + beta·e^(alpha·tau), where gamma, beta, and alpha are constants, tau is age, A(tau) evaluated at tau = tau_j is identical to A(tau_j) and hence to Aj, and tau_j is the value of age tau in the jth age group. The coefficients of variation for such a representation do not exceed 4%. An analysis of exogenous risk levels in the Moscow and Russian populations during 1980-1995 shows that since 1992 all components of exogenous risk in the Moscow population had been increasing up to 1994. The greatest contribution to the total level of exogenous risk came from lethal diseases, whose death rate was 387 deaths per 100,000 persons in 1994, i.e., 61.9% of all deaths. The dynamics of the change in exogenous mortality risk during 1990-1994 in the Moscow population and in the Russian population without Moscow were identical: the risk had been increasing, and its value in the Russian population had been higher than that in the Moscow population. PMID:9637078
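
    The fragment below sketches one way to fit the decomposition Dij = Aj + Bi·Cj, alternating between the additive age profile Aj and a rank-1 fit for Bi·Cj. The data are simulated, the decomposition is identified only up to the usual rank-1 scaling/shift conventions, and the original paper's "special procedure" may differ in detail.

```python
# Sketch: alternating least squares for D[i,j] = A[j] + B[i]*C[j]
# (endogenous age profile plus a rank-1 exogenous term).
import numpy as np

rng = np.random.default_rng(6)
n_years, n_ages = 16, 10
A_true = 0.001 * np.exp(0.5 * np.arange(n_ages))     # Gompertz-like profile
B_true = 1.0 + 0.1 * rng.normal(size=n_years)
C_true = np.linspace(0.002, 0.0005, n_ages)
D = A_true + np.outer(B_true, C_true) + 1e-5 * rng.normal(size=(n_years, n_ages))

A = D.mean(axis=0).copy()
for _ in range(200):
    # Rank-1 fit of the residual gives B and C (leading singular vectors):
    U, s, Vt = np.linalg.svd(D - A, full_matrices=False)
    B, C = U[:, 0] * s[0], Vt[0]
    A = (D - np.outer(B, C)).mean(axis=0)   # re-estimate endogenous part
print("recovered endogenous profile A:", np.round(A, 5))
```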

  9. Debris Flow Risk Management Framework and Risk Analysis in Taiwan, A Preliminary Study

    NASA Astrophysics Data System (ADS)

    Tsao, Ting-Chi; Hsu, Wen-Ko; Chiou, Lin-Bin; Cheng, Chin-Tung; Lo, Wen-Chun; Chen, Chen-Yu; Lai, Cheng-Nong; Ju, Jiun-Ping

    2010-05-01

    Taiwan is located on a seismically active mountain belt between the Philippine Sea plate and the Eurasian plate. After the 1999 Chi-Chi earthquake (Mw=7.6), landslides and debris flows occurred frequently. In Aug. 2009, Typhoon Morakot struck Taiwan, and numerous landslide and debris flow events, some with heavy fatalities, were observed. With limited resources, authorities should establish a disaster management system to cope with slope disaster risks more effectively. Since 2006, Taiwan's authority in charge of debris flow management, the Soil and Water Conservation Bureau (SWCB), has completed the basic investigation and data collection of 1,503 potential debris flow creeks around Taiwan. During 2008 and 2009, a debris flow quantitative risk analysis (QRA) framework, based on the landslide risk management framework of Australia, was proposed and applied to 106 creeks in the 30 villages with a history of debris flow hazards. Information on and the value of several types of elements at risk (bridges, roads, buildings and crops) were gathered and integrated into a GIS layer, with the vulnerability model of each element at risk applied. By studying the historical hazard events of the 30 villages, numerical simulations of debris flow hazards of different magnitudes (5, 10, 25, 50, 100 and 200-year return periods) were conducted, and the economic losses and fatalities of each scenario were calculated for each creek. Taking the annual exceedance probability into account, the annual total risk of each creek was calculated, and the results were displayed on a debris flow risk map. The number of fatalities and the frequency were calculated, and the F-N curves of the 106 creeks were provided. For the F-N curves, an individual risk to life per year of 1.0E-04 and a slope of 1, which matched international standards, were considered to define acceptable risk. Applying the results of the 106 creeks to the F-N curve, they were divided into 3 categories: Unacceptable, ALARP (As Low As Reasonable Practicable) and
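
    An F-N curve plots the annual frequency F of events causing N or more fatalities against N, and compares it with a criterion line anchored at 1.0E-04. The sketch below builds F(N) from a hypothetical scenario list for one creek and checks it against a criterion of that form; the scenario numbers are invented, not the 106-creek dataset.

```python
# Sketch: compute exceedance frequencies F(N) from scenario results and
# compare against an F-N criterion line F = 1e-4 / N.
import numpy as np

# (annual frequency, fatalities) per debris-flow scenario for one creek
scenarios = [(1/5, 0), (1/10, 1), (1/25, 3), (1/50, 8), (1/200, 25)]

fatalities = np.array([n for _, n in scenarios])
freqs = np.array([f for f, _ in scenarios])
N_grid = np.unique(fatalities[fatalities > 0])
F = np.array([freqs[fatalities >= N].sum() for N in N_grid])  # F(N)

criterion = 1e-4 / N_grid
for N, f, c in zip(N_grid, F, criterion):
    verdict = "unacceptable" if f > c else "acceptable/ALARP"
    print(f"N>={N:2d}: F={f:.4f}/yr  criterion={c:.2e}  -> {verdict}")
```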

  10. Global Human Settlement Analysis for Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Pesaresi, M.; Ehrlich, D.; Ferri, S.; Florczyk, A.; Freire, S.; Haag, F.; Halkia, M.; Julea, A. M.; Kemper, T.; Soille, P.

    2015-04-01

    The Global Human Settlement Layer (GHSL) is supported by the European Commission, Joint Research Centre (JRC) in the frame of its institutional research activities. The scope of GHSL is to develop, test and apply the technologies and analysis methods integrated in the JRC Global Human Settlement analysis platform for applications in support of global disaster risk reduction (DRR) initiatives and regional analysis in the frame of the European Cohesion policy. The GHSL analysis platform uses geo-spatial data, primarily remotely sensed data and population data. GHSL also cooperates with the Group on Earth Observations on SB-04-Global Urban Observation and Information, and with various international partners and World Bank and United Nations agencies. Some preliminary results integrating global human settlement information extracted from Landsat data records of the last 40 years and population data are presented.

  11. RISK ASSESSMENT AND EPIDEMIOLOGICAL INFORMATION FOR PATHOGENIC MICROORGANISMS APPLIED TO SOIL

    EPA Science Inventory

    There is increasing interest in the development of a microbial risk assessment methodology for regulatory and operational decision making. Initial interests in microbial risk assessments focused on drinking, recreational, and reclaimed water issues. More recently risk assessmen...

  12. Factor Analysis Applied the VFY-218 RCS Data

    NASA Technical Reports Server (NTRS)

    Woo, Alex; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    Presents a statistical factor analysis of computer simulations and measurement data for the VFY-218 configuration. Factor analysis tries to quantify the statistical grouping of measurements and simulations.

  13. 76 FR 30705 - Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-26

    ... the public and an independent, external panel of scientific experts (73 FR 54400). Dated: May 18, 2011... AGENCY Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids... Pathogens in Land-Applied Biosolids'' EPA/600/R-08/035F, which was prepared by the National Center...

  14. Comprehensive safeguards evaluation methods and societal risk analysis

    SciTech Connect

    Richardson, J.M.

    1982-03-01

    Essential capabilities of an integrated evaluation methodology for analyzing safeguards systems are discussed. Such a methodology must be conceptually meaningful, technically defensible, discriminating and consistent. A decomposition of safeguards systems by function is mentioned as a possible starting point for methodology development. The application of a societal risk equation to safeguards systems analysis is addressed. Conceptual problems with this approach are discussed. Technical difficulties in applying this equation to safeguards systems are illustrated through the use of confidence intervals, information content, hypothesis testing and ranking and selection procedures.

  15. Phase plane analysis: applying chaos theory in health care.

    PubMed

    Priesmeyer, H R; Sharp, L F

    1995-01-01

    This article applies the new science of nonlinearity to administrative issues and accounts receivable management in health care, and it provides a new perspective on common operating and quality control measures. PMID:10151628

  16. Nuclear risk analysis of the Ulysses mission

    SciTech Connect

    Bartram, B.W.; Vaughan, F.R. ); Englehart, D.R.W. )

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions from the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor determined, and the functional relationship among all the factors established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDFs) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
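
    The sketch below shows the general shape of such a CCDF construction: sample consequences within each accident scenario, weight by scenario probability, and report the total probability of equaling or exceeding each consequence level. The scenario probabilities and lognormal consequence parameters are hypothetical, not the Ulysses FSAR source terms.

```python
# Sketch: Monte Carlo CCDF of consequences over a set of accident scenarios.
import numpy as np

rng = np.random.default_rng(7)
# (scenario probability, lognormal consequence parameters mu, sigma)
scenarios = [(1e-3, 0.0, 1.0), (1e-4, 2.0, 1.0), (1e-5, 4.0, 0.5)]

N = 200_000
levels = np.logspace(-2, 3, 50)          # consequence magnitudes
ccdf = np.zeros_like(levels)
for p, mu, sigma in scenarios:
    c = rng.lognormal(mu, sigma, N)      # consequences given the accident
    exceed = (c[None, :] >= levels[:, None]).mean(axis=1)
    ccdf += p * exceed                   # P(accident) * P(C >= x | accident)

for x, f in zip(levels[::10], ccdf[::10]):
    print(f"P(consequence >= {x:8.2f}) = {f:.2e}")
```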

  17. Modeling Opponents in Adversarial Risk Analysis.

    PubMed

    Rios Insua, David; Banks, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decisionmakers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms, for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decision by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents. PMID:26133501
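
    A minimal sketch of the Bayesian model averaging step follows: each candidate rationality model predicts a distribution over the opponent's actions, and observed actions update the models' posterior probabilities, which then weight the forecast. The action set, models, and numbers are toy placeholders, not the paper's formulation.

```python
# Sketch: Bayesian model averaging over opponent rationality models.
import numpy as np

actions = ["attack", "wait", "withdraw"]
models = {
    "random":     np.array([1/3, 1/3, 1/3]),
    "aggressive": np.array([0.7, 0.2, 0.1]),   # a level-k-style caricature
    "cautious":   np.array([0.1, 0.3, 0.6]),
}
prior = {m: 1/3 for m in models}

observed = ["attack", "attack", "wait", "attack"]
log_post = {m: np.log(p) for m, p in prior.items()}
for a in observed:
    i = actions.index(a)
    for m, probs in models.items():
        log_post[m] += np.log(probs[i])        # likelihood of the observation

z = np.logaddexp.reduce(list(log_post.values()))
posterior = {m: np.exp(lp - z) for m, lp in log_post.items()}
print("model posteriors:", {m: round(p, 3) for m, p in posterior.items()})
predictive = sum(posterior[m] * models[m] for m in models)
print("model-averaged action forecast:", np.round(predictive, 3))
```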

  18. Anticipating risk for human subjects participating in clinical research: application of Failure Mode and Effects Analysis.

    PubMed

    Cody, Robert J

    2006-03-01

    Failure Mode and Effects Analysis (FMEA) is a method applied in various industries to anticipate and mitigate risk. This methodology can be more systematically applied to the protection of human subjects in research. The purpose of FMEA is simple: prevent problems before they occur. By applying FMEA process analysis to the elements of a specific research protocol, the failure severity, occurrence, and detection rates can be estimated for calculation of a "risk priority number" (RPN). Methods can then be identified to reduce the RPN to levels where the risk/benefit ratio favors human subject benefit, to a greater magnitude than existed in the pre-analysis risk profile. At the very least, the approach provides a checklist of issues that can be individualized for specific research protocols or human subject populations. PMID:16537191
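
    The RPN arithmetic is simple enough to show directly: RPN = severity x occurrence x detection, each rated on a 1-10 scale, with mitigation targeted at the highest-RPN modes. The failure modes listed below are hypothetical examples, not from the paper.

```python
# Sketch: compute and rank risk priority numbers (RPN = S * O * D).
failure_modes = [
    # (description,                  severity, occurrence, detection)
    ("wrong dose administered",             9,          3,         4),
    ("consent form misfiled",               4,          5,         2),
    ("adverse event reported late",         7,          4,         6),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN={s*o*d:4d}  {desc}")
# Mitigations are applied to the highest-RPN modes first, then the RPNs are
# re-estimated until the residual risk/benefit profile is acceptable.
```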

  19. Cognitive task analysis: Techniques applied to airborne weapons training

    SciTech Connect

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E.; Carlow Associates, Inc., Fairfax, VA; Martin Marietta Energy Systems, Inc., Oak Ridge, TN; Tennessee Univ., Knoxville, TN )

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  20. Risk Analysis Related to Quality Management Principles

    NASA Astrophysics Data System (ADS)

    Vykydal, David; Halfarová, Petra; Nenadál, Jaroslav; Plura, Jiří; Hekelová, Edita

    2012-12-01

    Efficient and effective implementation of quality management principles demands a responsible approach from top managers. A study of the current state of affairs in Czech organizations reveals a number of shortcomings in this field that can translate into varied managerial risks. The article identifies and analyses some of them and gives short guidance for appropriate treatment. The text of the article reflects the authors' experience as well as knowledge obtained from the systematic analysis of industrial companies' environments.

  1. System Analysis and Risk Assessment System.

    Energy Science and Technology Software Center (ESTSC)

    2000-11-20

    Version 00 SARA4.16 is a program that allows the user to review the results of a Probabilistic Risk Assessment (PRA) and to perform limited sensitivity analysis on these results. This tool is intended to be used by a less technically oriented user and does not require the level of understanding of PRA concepts required by a full PRA analysis tool. With this program a user can review the information generated by a PRA analyst and compare the results to those generated by making limited modifications to the data in the PRA. Also included in this program is the ability to graphically display the information stored in the database. This information includes event trees, fault trees, P&IDs and uncertainty distributions. SARA 4.16 is incorporated in the SAPHIRE 5.0 code package.

  2. Nutrient Status and Contamination Risks from Digested Pig Slurry Applied on a Vegetable Crops Field

    PubMed Central

    Zhang, Shaohui; Hua, Yumei; Deng, Liangwei

    2016-01-01

    The effects of applied digested pig slurry on a vegetable crops field were studied. The study included a 3-year investigation on nutrient characteristics, heavy metals contamination and hygienic risks of a vegetable crops field in Wuhan, China. The results showed that, after anaerobic digestion, abundant N, P and K remained in the digested pig slurry while fecal coliforms, ascaris eggs, schistosoma eggs and hookworm eggs were highly reduced. High Cr, Zn and Cu contents in the digested pig slurry were found in spring. Digested pig slurry application to the vegetable crops field led to improved soil fertility. Plant-available P in the fertilized soils increased due to considerable increase in total P content and decrease in low-availability P fraction. The As content in the fertilized soils increased slightly but significantly (p = 0.003) compared with control. The Hg, Zn, Cr, Cd, Pb, and Cu contents in the fertilized soils did not exceed the maximum permissible contents for vegetable crops soils in China. However, high Zn accumulation should be of concern due to repeated applications of digested pig slurry. No fecal coliforms, ascaris eggs, schistosoma eggs or hookworm eggs were detected in the fertilized soils. PMID:27058548

  3. Nutrient Status and Contamination Risks from Digested Pig Slurry Applied on a Vegetable Crops Field.

    PubMed

    Zhang, Shaohui; Hua, Yumei; Deng, Liangwei

    2016-04-01

    The effects of applied digested pig slurry on a vegetable crops field were studied. The study included a 3-year investigation on nutrient characteristics, heavy metals contamination and hygienic risks of a vegetable crops field in Wuhan, China. The results showed that, after anaerobic digestion, abundant N, P and K remained in the digested pig slurry while fecal coliforms, ascaris eggs, schistosoma eggs and hookworm eggs were highly reduced. High Cr, Zn and Cu contents in the digested pig slurry were found in spring. Digested pig slurry application to the vegetable crops field led to improved soil fertility. Plant-available P in the fertilized soils increased due to considerable increase in total P content and decrease in low-availability P fraction. The As content in the fertilized soils increased slightly but significantly (p = 0.003) compared with control. The Hg, Zn, Cr, Cd, Pb, and Cu contents in the fertilized soils did not exceed the maximum permissible contents for vegetable crops soils in China. However, high Zn accumulation should be of concern due to repeated applications of digested pig slurry. No fecal coliforms, ascaris eggs, schistosoma eggs or hookworm eggs were detected in the fertilized soils. PMID:27058548

  4. Sinkhole risk modelling applied to transportation infrastructures. A case study from the Ebro valley evaporite karst (NE Spain)

    NASA Astrophysics Data System (ADS)

    Galve, Jorge P.; Remondo, Juan; Gutiérrez, Francisco; Guerrero, Jesús; Bonachea, Jaime; Lucha, Pedro

    2010-05-01

    Sinkholes disrupt transportation route serviceability, causing significant direct and indirect economic losses. Additionally, catastrophic collapse sinkholes may lead to accidents producing loss of human lives. Sinkhole risk modelling allows the estimation of the expectable losses in different portions of infrastructure and the identification of the sections where the application of corrective measures would have a better cost-benefit ratio. An example of sinkhole risk analysis applied to a motorway under construction in a mantled evaporite karst area with a very high probability of occurrence of cover collapse sinkholes is presented. Firstly, sinkhole susceptibility models were obtained, and independently evaluated, on the basis of a probabilistic method which combines the distance to the nearest sinkhole with other conditioning factors. The most reliable susceptibility model was then transformed into several sinkhole hazard models using empirical functions. These functions describe the relationships between the frequency of sinkholes and (1) sinkhole dimensions, (2) terrain susceptibility and (3) land cover. Although more information on temporal occurrences would be needed to evaluate the hazard models, the quality and quantity of the data on which the models are based and the distribution of the latest sinkholes of considerable magnitude that occurred in the study area indicate that the models seem to be sound. Two collapse sinkholes 4 m across that formed after the production of the models coincide with the zone of highest hazard, which occupies 15% of the study area. Finally, on the basis of the hazard models obtained, sinkhole risk models were generated for the motorway under construction with the aim of quantitatively estimating the expected losses in different sections of the infrastructure in a given period of time. To produce the risk models, the vulnerability of the motorway was estimated considering the cost of the structure, sinkhole magnitude and frequency and the expectable
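
    The risk step such models support can be sketched as an expected annual loss per road section: for each sinkhole magnitude, multiply annual frequency, vulnerability, and exposed value, then sum. The frequencies, vulnerabilities, and values below are hypothetical, not the study's figures.

```python
# Sketch: expected annual loss (EAL) per motorway section from per-magnitude
# sinkhole frequencies, vulnerabilities, and exposed value.
sections = {
    #        {diameter_m: annual freq},          {diameter_m: vulnerability}, value (EUR)
    "km 0-1": ({2: 0.20, 5: 0.05,  10: 0.01},  {2: 0.1, 5: 0.4, 10: 0.9}, 2e6),
    "km 1-2": ({2: 0.02, 5: 0.005, 10: 0.001}, {2: 0.1, 5: 0.4, 10: 0.9}, 2e6),
}

for name, (freq, vuln, value) in sections.items():
    eal = sum(freq[d] * vuln[d] * value for d in freq)
    print(f"{name}: expected annual loss = {eal:,.0f} EUR")
# Corrective measures pay off first on the sections with the highest EAL.
```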

  5. Optical methods of stress analysis applied to cracked components

    NASA Technical Reports Server (NTRS)

    Smith, C. W.

    1991-01-01

    After briefly describing the principles of frozen stress photoelastic and moire interferometric analyses, and the corresponding algorithms for converting optical data from each method into stress intensity factors (SIF), the methods are applied to the determination of crack shapes, SIF determination, crack closure displacement fields, and pre-crack damage mechanisms in typical aircraft component configurations.

  6. Applying Research: An Analysis of Texts for Consumers of Research.

    ERIC Educational Resources Information Center

    Erion, R. L.; Steinley, Gary

    The critical reading of research involves: (1) comprehension, (2) evaluation, and (3) application. A study examined six recently published textbooks to determine to what extent they attempt to help students learn to apply educational research; these texts were specifically designed for "consumers" of research (i.e., critical readers of research)…

  7. Duration Analysis Applied to the Adoption of Knowledge.

    ERIC Educational Resources Information Center

    Vega-Cervera, Juan A.; Gordillo, Isabel Cuadrado

    2001-01-01

    Analyzes knowledge acquisition in a sample of 264 pupils in 9 Spanish elementary schools, using time as a dependent variable. Introduces psycho-pedagogical, pedagogical, and social variables into a hazard model applied to the reading process. Auditory discrimination (not intelligence or visual perception) most significantly influences learning to…

  8. INDICATORS OF RISK: AN ANALYSIS APPROACH FOR IMPROVED RIVER MANAGEMENT

    EPA Science Inventory

    A risk index is an approach to measuring the level of risk to the plants and/or animals (biota) in a certain area using water and habitat quality information. A new technique for developing risk indices was applied to data collected from Mid-Atlantic streams of the U.S. during 1...

  9. How Has Applied Behavior Analysis and Behavior Therapy Changed?: An Historical Analysis of Journals

    ERIC Educational Resources Information Center

    O'Donohue, William; Fryling, Mitch

    2007-01-01

    Applied behavior analysis and behavior therapy are now nearly a half century old. It is interesting to ask if and how these disciplines have changed over time, particularly regarding some of their key internal controversies (e.g., role of cognitions). We examined the first five years and the 2000-2004 five year period of the "Journal of Applied…

  10. Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation

    ERIC Educational Resources Information Center

    Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom

    2014-01-01

    Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…

  11. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    PubMed

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009. PMID:20731790
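
    A minimal sketch of the Kaplan-Meier survival estimate over mission distance, in the spirit of the estimator described above; the mission records and distances below are invented, and ties and confidence limits are ignored for brevity:

        # Kaplan-Meier survival over mission distance (a sketch, not the
        # authors' exact formulation). Distances and outcomes are invented.
        import numpy as np

        def kaplan_meier(distances, failed):
            """S(d) from mission records: `failed` is True when a fault led
            to vehicle loss; other missions count as right-censored."""
            order = np.argsort(distances)
            d, f = np.asarray(distances)[order], np.asarray(failed)[order]
            n_at_risk = len(d)
            surv, points = 1.0, []
            for di, fi in zip(d, f):
                if fi:                       # loss event at distance di
                    surv *= (n_at_risk - 1) / n_at_risk
                    points.append((di, surv))
                n_at_risk -= 1               # event or censoring removes a mission
            return points

        # Hypothetical mission records: (distance in km, loss occurred?)
        records = [(12, False), (40, True), (55, False), (80, True), (120, False)]
        for dist, s in kaplan_meier(*zip(*records)):
            print(f"S({dist} km) = {s:.3f}")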

  12. Risk analysis of nuclear safeguards regulations. [Aggregated Systems Model (ASM)

    SciTech Connect

    Al-Ayat, R.A.; Altman, W.D.; Judd, B.R.

    1982-06-01

    The Aggregated Systems Model (ASM), a probabilistic risk analysis tool for nuclear safeguards, was applied to determine benefits and costs of proposed amendments to NRC regulations governing nuclear material control and accounting systems. The objective of the amendments was to improve the ability to detect insiders attempting to steal large quantities of special nuclear material (SNM). Insider threats range from likely events with minor consequences to unlikely events with catastrophic consequences. Moreover, establishing safeguards regulations is complicated by uncertainties in threats, safeguards performance, and consequences, and by the subjective judgments and difficult trade-offs between risks and safeguards costs. The ASM systematically incorporates these factors in a comprehensive, analytical framework. The ASM was used to evaluate the effectiveness of current safeguards and to quantify the risk of SNM theft. Various modifications designed to meet the objectives of the proposed amendments to reduce that risk were analyzed. Safeguards effectiveness was judged in terms of the probability of detecting and preventing theft, the expected time to detection, and the expected quantity of SNM diverted in a year. Data were gathered in tours and interviews at NRC-licensed facilities. The assessment at each facility was begun by carefully selecting scenarios representing the range of potential insider threats. A team of analysts and facility managers assigned probabilities for detection and prevention events in each scenario. Using the ASM, we computed the measures of system effectiveness and identified cost-effective safeguards modifications that met the objectives of the proposed amendments.
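
    A toy illustration of the kind of aggregate effectiveness measures reported above; the scenario likelihoods, detection probabilities, and SNM quantities below are invented, and the real ASM event-tree structure is far richer:

        # Sketch of aggregate safeguards-effectiveness measures under
        # assumed (invented) insider-scenario data.
        scenarios = [
            # (annual likelihood, P(detect & prevent), kg SNM targeted)
            (0.010, 0.95, 5.0),
            (0.002, 0.60, 25.0),
            (0.0005, 0.30, 100.0),
        ]

        p_no_theft = 1.0
        expected_diverted = 0.0
        for likelihood, p_detect, kg in scenarios:
            p_success = likelihood * (1.0 - p_detect)   # attempted and missed
            p_no_theft *= (1.0 - p_success)
            expected_diverted += p_success * kg

        print(f"P(at least one successful theft/yr) = {1.0 - p_no_theft:.4f}")
        print(f"Expected SNM diverted per year      = {expected_diverted:.3f} kg")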

  13. Modal analysis applied to circular, rectangular, and coaxial waveguides

    NASA Technical Reports Server (NTRS)

    Hoppe, D. J.

    1988-01-01

    Recent developments in the analysis of various waveguide components and feedhorns using Modal Analysis (Mode Matching Method) are summarized. A brief description of the theory is presented, and the important features of the method are pointed out. Specific examples in circular, rectangular, and coaxial waveguides are included, with comparisons between the theory and experimental measurements. Extensions to the methods are described.

  14. Examples of Communicating Uncertainty Applied to Earthquake Hazard and Risk Products

    NASA Astrophysics Data System (ADS)

    Wald, D. J.

    2013-12-01

    When is communicating scientific modeling uncertainty effective? One viewpoint is that the answer depends on whether one is communicating hazard or risk: hazards have quantifiable uncertainties (which, granted, are often ignored), yet risk uncertainties compound the uncertainties inherent in the hazard with those of the risk calculations, and are thus often larger. Larger, yet more meaningful: since risk entails societal impact of some form, consumers of such information tend to have a better grasp of the potential uncertainty ranges for loss information than they do for less tangible hazard values (like magnitude, peak acceleration, or stream flow). I present two examples that compare and contrast communicating uncertainty for earthquake hazard and risk products. The first example is the U.S. Geological Survey's (USGS) ShakeMap system, which portrays the best, though uncertain, estimate of the distribution and intensity of shaking over the potentially impacted region. The shaking intensity is well constrained at seismograph locations yet is uncertain elsewhere, so shaking uncertainties are quantified and presented spatially. However, with ShakeMap, it seems that users tend to believe what they see is accurate, in part because (1) considering the shaking uncertainty complicates the picture, and (2) it would not necessarily alter their decision-making. In contrast, when it comes to making earthquake-response decisions based on uncertain loss estimates, actions tend to be taken only after analysis of the confidence in (or source of) such estimates. Uncertain ranges of loss estimates instill tangible images for users, and when such uncertainties become large, intuitive reality-check alarms go off, for example, when the range of losses presented becomes too wide to be useful. The USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, which in near-real time alerts users to the likelihood of ranges of potential fatalities and economic impact, is aimed at…

  15. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk…

  16. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk…

  17. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk…

  18. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk…

  19. Factorial kriging analysis applied to geological data from petroleum exploration

    SciTech Connect

    Jaquet, O.

    1989-10-01

    A regionalized variable, thickness of the reservoir layer, from a gas field is decomposed by factorial kriging analysis. Maps of the obtained components may be associated with depositional environments that are favorable for petroleum exploration.

  20. Neutron-activation analysis applied to copper ores and artifacts

    NASA Technical Reports Server (NTRS)

    Linder, N. F.

    1970-01-01

    Neutron activation analysis is used for quantitative identification of trace metals in copper. Establishing a unique fingerprint of impurities in Michigan copper would enable identification of artifacts made from this copper.

  1. Applying the Land Use Portfolio Model with Hazus to analyse risk from natural hazard events

    USGS Publications Warehouse

    Dinitz, Laura B.; Taketa, Richard A.

    2013-01-01

    This paper describes and demonstrates the integration of two geospatial decision-support systems for natural-hazard risk assessment and management. Hazus is a risk-assessment tool developed by the Federal Emergency Management Agency to identify risks and estimate the severity of risk from natural hazards. The Land Use Portfolio Model (LUPM) is a risk-management tool developed by the U.S. Geological Survey to evaluate plans or actions intended to reduce risk from natural hazards. We analysed three mitigation policies for one earthquake scenario in the San Francisco Bay area to demonstrate the added value of using Hazus and the LUPM together. The demonstration showed that Hazus loss estimates can be input to the LUPM to obtain estimates of losses avoided through mitigation, rates of return on mitigation investment, and measures of uncertainty. Together, they offer a more comprehensive approach to help with decisions for reducing risk from natural hazards.

  2. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on Fuzzy Mathematics and Grey Relational Theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method for a certain medical device (C-arm X-ray machine) is described. PMID:22317712
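
    For orientation, a sketch of the conventional FMEA ranking that the paper improves upon: the crisp risk priority number RPN = severity x occurrence x detectability. The fuzzy ratings and grey relational ranking of the improved method are not reproduced here, and the failure modes below are invented:

        # Conventional FMEA risk priority numbers as a baseline; the paper's
        # improvement replaces these crisp 1..10 scores with fuzzy ratings.
        failure_modes = {
            # mode: (severity, occurrence, detectability), each rated 1..10
            "wrong exposure setting": (7, 4, 3),
            "emergency stop ignored": (9, 2, 6),
            "display ambiguity":      (5, 6, 4),
        }

        ranked = sorted(failure_modes.items(),
                        key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2],
                        reverse=True)
        for mode, (s, o, d) in ranked:
            print(f"{mode:28s} RPN = {s * o * d}")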

  3. OVERVIEW: MYCOTOXIN RISK ANALYSIS IN THE USA

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of the risk assessment process is to provide the basis for decisions that will allow successful management of the risk. Poor risk management decisions that result from inadequate risk assessment can be enormously costly, both economically and in terms of human or animal health…

  4. Development of partial failure analysis method in probability risk assessments

    SciTech Connect

    Ni, T.; Modarres, M.

    1996-12-01

    This paper presents a new approach to evaluate the partial failure effect on current Probability Risk Assessments (PRAs). An integrated methodology of the thermal-hydraulic analysis and fuzzy logic simulation using the Dynamic Master Logic Diagram (DMLD) was developed. The thermal-hydraulic analysis used in this approach is to identify partial operation effect of any PRA system function in a plant model. The DMLD is used to simulate the system performance of the partial failure effect and inspect all minimal cut sets of system functions. This methodology can be applied in the context of a full scope PRA to reduce core damage frequency. An example of this application of the approach is presented. The partial failure data used in the example is from a survey study of partial failure effects from the Nuclear Plant Reliability Data System (NPRDS).

  5. Joint regression analysis and AMMI model applied to oat improvement

    NASA Astrophysics Data System (ADS)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board on a sample of 22 different genotypes grown during the years 2002, 2003 and 2004 in six locations. Ferreira et al. (2006) state the relevance of regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in R software for the AMMI model analysis.
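
    A minimal AMMI decomposition on an invented genotype x environment yield table: additive main effects are removed first, then an SVD of the interaction residuals supplies the multiplicative IPCA axes. This is a generic sketch, not the agricolae implementation:

        # AMMI in miniature: two-way additive model plus SVD of the GxE
        # interaction residuals. The yield table is made up.
        import numpy as np

        yields = np.array([   # rows: genotypes, cols: environments
            [4.1, 5.0, 3.8],
            [3.9, 5.4, 4.2],
            [4.6, 4.9, 3.5],
        ])

        grand = yields.mean()
        gen_eff = yields.mean(axis=1) - grand         # genotype main effects
        env_eff = yields.mean(axis=0) - grand         # environment main effects
        additive = grand + gen_eff[:, None] + env_eff[None, :]
        interaction = yields - additive               # GxE residual matrix

        u, s, vt = np.linalg.svd(interaction, full_matrices=False)
        print("singular values (IPCA axes):", np.round(s, 3))
        print("genotype scores on IPCA1:   ", np.round(u[:, 0] * np.sqrt(s[0]), 3))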

  6. Orbit Response Matrix Analysis Applied at PEP-II

    SciTech Connect

    Steier, C.; Wolski, A.; Ecklund, S.; Safranek, J.A.; Tenenbaum, P.; Terebilo, A.; Turner, J.L.; Yocky, G.; /SLAC

    2005-05-17

    The analysis of orbit response matrices has been used very successfully to measure and correct the gradient and skew gradient distribution in many accelerators. It allows determination of an accurately calibrated model of the coupled machine lattice, which then can be used to calculate the corrections necessary to improve coupling, dynamic aperture and ultimately luminosity. At PEP-II, the Matlab version of LOCO has been used to analyze coupled response matrices for both the LER and the HER. The large number of elements in PEP-II and the very complicated interaction region present unique challenges to the data analysis. All necessary tools to make the analysis method usable at PEP-II have been implemented and LOCO can now be used as a routine tool for lattice diagnostics.
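
    The essence of the response-matrix fit, reduced to a toy problem: adjust model parameters so that the model's orbit response matrix matches the measured one in the least-squares sense. The two-parameter "lattice" and the measured matrix below are invented; real LOCO iterates this step over thousands of parameters:

        # LOCO-style fit on a toy model: numerical Jacobian + least squares.
        import numpy as np

        def model_response(params):
            """Toy model: response-matrix entries depend linearly on two
            hypothetical gradient parameters."""
            k1, k2 = params
            return np.array([[1.0 + 0.5 * k1, 0.2 * k2],
                             [0.3 * k1,       0.8 + 0.4 * k2]])

        measured = np.array([[1.12, 0.07], [0.08, 0.93]])

        eps, p0 = 1e-6, np.zeros(2)
        r0 = model_response(p0)
        jac = np.column_stack([
            (model_response(p0 + eps * np.eye(2)[i]) - r0).ravel() / eps
            for i in range(2)
        ])
        fit, *_ = np.linalg.lstsq(jac, (measured - r0).ravel(), rcond=None)
        print("fitted gradient errors:", np.round(fit, 4))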

  7. The colour analysis method applied to homogeneous rocks

    NASA Astrophysics Data System (ADS)

    Halász, Amadé; Halmai, Ákos

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation, which is the most suitable formation in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between them; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.
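
    A sketch of how such a digital colour log might be computed from a scanned core image: average a red/blue colour index per depth row and smooth it, as one would to expose metre-scale cyclicity in a real scan. The image here is synthetic noise standing in for a real scan:

        # Colour logging sketch on a synthetic "core scan" array.
        import numpy as np

        rng = np.random.default_rng(0)
        core_scan = rng.uniform(80, 180, size=(600, 50, 3))  # rows = depth, RGB

        red_index = core_scan[..., 0].mean(axis=1) / core_scan[..., 2].mean(axis=1)
        window = 25                       # running-mean smoothing window
        smooth = np.convolve(red_index, np.ones(window) / window, mode="valid")
        print("colour log samples:", np.round(smooth[:5], 3))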

  8. Systems design analysis applied to launch vehicle configuration

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  9. On the relation between applied behavior analysis and positive behavioral support

    PubMed Central

    Carr, James E.; Sidener, Tina M.

    2002-01-01

    Anderson and Freeman (2000) recently defined positive behavioral support (PBS) as a systematic approach to the delivery of clinical and educational services that is rooted in behavior analysis. However, the recent literature contains varied definitions of PBS as well as discrepant notions regarding the relation between applied behavior analysis and PBS. After summarizing common definitional characteristics of PBS from the literature, we conclude that PBS consists almost exclusively of techniques and values originating in applied behavior analysis. We then discuss the relations between applied behavior analysis and PBS that have been proposed in the literature. Finally, we discuss possible implications of considering PBS a field separate from applied behavior analysis. PMID:22478389

  10. [Irrationality and risk--a problem analysis].

    PubMed

    Bergler, R

    1995-04-01

    The way one experiences risks and one's risk behaviour is not centrally the result of rational considerations and decisions. Contradictions and incompatibilities can be proved empirically: (1) General paranoiac hysterical fear of risks, (2) complete repression of health risks, (3) unsureness and inability in dealing with risks, (4) prejudice against risks, (5) admiring risk behaviour, (6) seeking dangerous ways to test one's own limits, (7) readiness for aggression by not subjectively being able to control the risk, (8) naively assessing a risk positively thus intensifying pleasurable sensation, (9) increased preparedness to take risks in groups, (10) increased preparedness to take risk in anonymity, (11) accepting risks as a creative challenge for achievement motivation, (12) loss of innovation when avoiding risks. The awareness and assessment of risk factors is dependent on (1) personality factors (achievement-motivated persons experience risk as a creative challenge; risk behaviour being a variation in stimulus and exploration of one's environment), (2) the social back-ground (booster function), (3) the culture-related assessment of being able to control risks and (4) age (youthful risk behaviour being a way of coping with developmental exercises, e.g. purposely breaking parent's rules, provocatively demonstrating adult behaviour, a means to solve frustrating performance failures, advantages through social acceptance). The way one copes with risk factors is based on subjective estimation of risk; this assessment is made anew for each area of behaviour according to the specific situation. Making a decision and acting upon it is the result of simultaneously assessing the possible psychological benefit-cost factors of taking risks. The extent, to which the threat of risk factors is felt, depends on the importance one attaches to certain needs and benefits of quality of life and, in addition to this, what the subjective probabilities are that one is likely to be…

  11. Applying Adult Learning Theory through a Character Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to analyze the behavior of a character, Celie, in the movie "The Color Purple," through the lens of two adult learning theorists to determine the relationships the character has with each theory. The development and portrayal of characters in movies can be explained and understood by the analysis of adult learning…

  12. Applying Skinner's Analysis of Verbal Behavior to Persons with Dementia

    ERIC Educational Resources Information Center

    Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann

    2011-01-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly people suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…

  13. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  14. Action, Content and Identity in Applied Genre Analysis for ESP

    ERIC Educational Resources Information Center

    Flowerdew, John

    2011-01-01

    Genres are staged, structured, communicative events, motivated by various communicative purposes, and performed by members of specific discourse communities (Swales 1990; Bhatia 1993, 2004; Berkenkotter & Huckin 1995). Since its inception, with the two seminal works on the topic by Swales (1990) and Bhatia (1993), genre analysis has taken pride of…

  15. Applied Bibliometrics: Using Citation Analysis in the Journal Submission Process.

    ERIC Educational Resources Information Center

    Robinson, Michael D.

    1991-01-01

    Discusses the use of citation analysis as an effective tool for scholars to determine what journals would be appropriate for publication of their work. Calculating citation distance is explained, and a study with economics journals is described that computed citation distance between previously published articles and journals in the field. (12…

  16. Applying an Activity System to Online Collaborative Group Work Analysis

    ERIC Educational Resources Information Center

    Choi, Hyungshin; Kang, Myunghee

    2010-01-01

    This study determines whether an activity system provides a systematic framework to analyse collaborative group work. Using an activity system as a unit of analysis, the research examined learner behaviours, conflicting factors and facilitating factors while students engaged in collaborative work via asynchronous computer-mediated communication.…

  17. Improving Patient Prostate Cancer Risk Assessment: Moving From Static, Globally-Applied to Dynamic, Practice-Specific Cancer Risk Calculators

    PubMed Central

    Strobl, Andreas N.; Vickers, Andrew J.; Van Calster, Ben; Steyerberg, Ewout; Leach, Robin J.; Thompson, Ian M.; Ankerst, Donna P.

    2015-01-01

    Clinical risk calculators are now widely available but have generally been implemented in a static and one-size-fits-all fashion. The objective of this study was to challenge these notions and show via a case study concerning risk-based screening for prostate cancer how calculators can be dynamically and locally tailored to improve on-site patient accuracy. Yearly data from five international prostate biopsy cohorts (3 in the US, 1 in Austria, 1 in England) were used to compare 6 methods for annual risk prediction: static use of the online US-developed Prostate Cancer Prevention Trial Risk Calculator (PCPTRC); recalibration of the PCPTRC; revision of the PCPTRC; building a new model each year using logistic regression, Bayesian prior-to-posterior updating, or random forests. All methods performed similarly with respect to discrimination, except for random forests, which were worse. All methods except for random forests greatly improved calibration over the static PCPTRC in all cohorts except for Austria, where the PCPTRC had the best calibration followed closely by recalibration. The case study shows that a simple annual recalibration of a general online risk tool for prostate cancer can improve its accuracy with respect to the local patient practice at hand. PMID:25989018
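
    A sketch of the simplest updating method compared in the study, logistic recalibration: refit an intercept and slope on the logit of the static calculator's risk output using the local cohort. The cohort below is simulated, not PCPTRC data:

        # Logistic recalibration of a fixed risk calculator on a simulated
        # local cohort where true risk runs lower than the calculator says.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        risk_online = rng.uniform(0.05, 0.6, 500)        # static calculator output
        logit = np.log(risk_online / (1 - risk_online))
        y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * logit - 0.5))))

        recal = LogisticRegression().fit(logit.reshape(-1, 1), y)
        print("recalibration slope:    ", round(float(recal.coef_[0, 0]), 3))
        print("recalibration intercept:", round(float(recal.intercept_[0]), 3))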

  18. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines…

  19. Best practices: applying management analysis of excellence to immunization.

    PubMed

    Wishner, Amy; Aronson, Jerold; Kohrt, Alan; Norton, Gary

    2005-01-01

    The authors applied business management tools to analyze and promote excellence and to evaluate differences between average and above-average immunization performers in private practices. The authors conducted a pilot study of 10 private practices in Pennsylvania using tools common in management to assess practices' organizational climate and managerial style. Authoritative and coaching styles of physician leaders were common to both groups. Managerial styles that emphasized higher levels of clarity and responsibility were evident in the large practices, and rewards and flexibility styles were higher in the small above-average practices. The findings of this pilot study match results seen in high performers in other industries. The study concludes that the authoritative style appears to have the most impact on performance, which has interesting implications for training and behavior change to improve immunization rates, alongside traditional medical interventions. PMID:15921143

  20. A value analysis model applied to the management of amblyopia.

    PubMed Central

    Beauchamp, G R; Bane, M C; Stager, D R; Berry, P M; Wright, W W

    1999-01-01

    PURPOSE: To assess the value of amblyopia-related services by utilizing a health value model (HVM). Cost and quality criteria are evaluated in accordance with the interests of patients, physicians, and purchasers. METHODS: We applied an HVM to a hypothetical statistical ("median") child with amblyopia whose visual acuity is 20/80 and to a group of children with amblyopia who are managed by our practice. We applied the model to calculate the value of these services by evaluating the responses of patients and physicians and relating these responses to clinical outcomes. RESULTS: The consensus value of care for the hypothetical median child was calculated to be 0.406 (of 1.000). For those children managed in our practice, the calculated value is 0.682. Clinically, 79% achieved 20/40 or better visual acuity, and the mean final visual acuity was 0.2 logMAR (20/32). Value appraisals revealed significant concerns about the financial aspects of amblyopia-related services, particularly among physicians. Patients rated services more positively than did physicians. CONCLUSIONS: Amblyopia care is difficult, sustained, and important work that requires substantial sensitivity to and support of children and families. Compliance and early detection are essential to success. The value of amblyopia services is rated significantly higher by patients than by physicians. Relative to the measured value, amblyopia care is undercompensated. The HVM is useful to appraise clinical service delivery and its variation. The costs of failure and the benefits of success are high; high-value amblyopia care yields substantial dividends and should be commensurately compensated in the marketplace. PMID:10703133

  1. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    SciTech Connect

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, "Nuclear Safety Management," Subpart B, "Safety Basis Requirements." Consistent with DOE-STD-3009-94, Change Notice 2, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses" (STD-3009), and DOE-STD-3011-2002, "Guidance for Preparation of Basis for Interim Operation (BIO) Documents" (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities, where source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, "Integration of Environment, Safety, and Health into Facility Disposition Activities" (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  2. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  3. Nested sampling applied in Bayesian room-acoustics decay analysis.

    PubMed

    Jasa, Tomislav; Xiang, Ning

    2012-11-01

    Room-acoustic energy decays often exhibit single-rate or multiple-rate characteristics in a wide variety of rooms/halls. Both the energy decay order and decay parameter estimation are of practical significance in architectural acoustics applications, representing two different levels of Bayesian probabilistic inference. This paper discusses a model-based sound energy decay analysis within a Bayesian framework utilizing the nested sampling algorithm. The nested sampling algorithm is specifically developed to evaluate the Bayesian evidence required for determining the energy decay order with decay parameter estimates as a secondary result. Taking the energy decay analysis in architectural acoustics as an example, this paper demonstrates that two different levels of inference, decay model-selection and decay parameter estimation, can be cohesively accomplished by the nested sampling algorithm. PMID:23145609
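
    A toy nested-sampling loop estimating the Bayesian evidence of a single-rate decay model on synthetic data; a room-acoustics analysis of the kind described would compare such evidences across decay orders. Prior bounds, noise level, and iteration counts are all illustrative:

        # Minimal nested sampling for the evidence of a one-rate decay model.
        import numpy as np

        rng = np.random.default_rng(2)
        t = np.linspace(0, 1, 50)
        data = np.exp(-3.0 * t) + rng.normal(0, 0.02, t.size)

        def loglike(rate):
            return -0.5 * np.sum((data - np.exp(-rate * t)) ** 2) / 0.02 ** 2

        n_live = 100
        live = rng.uniform(0, 10, n_live)            # prior: rate ~ U(0, 10)
        logL = np.array([loglike(r) for r in live])
        logZ, log_width = -np.inf, np.log(1 - np.exp(-1 / n_live))

        for _ in range(300):
            worst = np.argmin(logL)
            logZ = np.logaddexp(logZ, log_width + logL[worst])
            log_width -= 1.0 / n_live               # prior volume shrinks
            # replace worst point with a new draw above its likelihood
            while True:
                r = rng.uniform(0, 10)
                if loglike(r) > logL[worst]:
                    live[worst], logL[worst] = r, loglike(r)
                    break

        # (contribution of the remaining live points omitted in this sketch)
        print("log evidence (single-rate model) ~", round(logZ, 2))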

  4. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    SciTech Connect

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by eliminating the need to update hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  5. LAMQS analysis applied to ancient Egyptian bronze coins

    NASA Astrophysics Data System (ADS)

    Torrisi, L.; Caridi, F.; Giuffrida, L.; Torrisi, A.; Mondio, G.; Serafino, T.; Caltabiano, M.; Castrizio, E. D.; Paniz, E.; Salici, A.

    2010-05-01

    Some Egyptian bronze coins, dated to the 6th-7th centuries A.D., are analyzed through different physical techniques in order to compare their composition and morphology and to identify their origin and the type of manufacture. The investigations have been performed by using micro-invasive analysis, such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Electronic (SEM) and Optical Microscopy, Surface Profile Analysis (SPA) and density measurements. Results indicate that the coins have a similar bulk composition but significant differences have been evidenced due to different constituents of the patina, bulk alloy composition, isotopic ratios, density and surface morphology. The results are in agreement with the archaeological expectations, indicating that the coins have been produced in two different Egypt sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.

  6. Applying hydraulic transient analysis: The Grizzly Hydro Project

    SciTech Connect

    Logan, T.H.; Stutsman, R.D. )

    1992-04-01

    No matter the size of the hydro plant, if it has a long waterway and will operate in peaking mode, the project designer needs to address the issue of hydraulic transients, known as water hammer, early in the design. This article describes the application of transient analysis to the design of a 20-MW hydro plant in California. In this case, a Howell-Bunger valve was used as a pressure-regulating valve to control transient pressures and speed rise.
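
    The first-cut check behind any such transient study is the Joukowsky estimate of the water-hammer pressure rise from rapid closure, delta-p = rho * a * delta-v. The numbers below are illustrative, not the Grizzly design values:

        # Joukowsky water-hammer estimate for an instantaneous flow arrest.
        rho = 1000.0        # water density, kg/m^3
        a = 1100.0          # pressure-wave speed in the penstock, m/s
        dv = 3.0            # arrested flow velocity, m/s

        dp = rho * a * dv   # pressure surge, Pa
        print(f"surge head ~ {dp / (rho * 9.81):.0f} m of water")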

  7. Landslide risk analysis: a multi-disciplinary methodological approach

    NASA Astrophysics Data System (ADS)

    Sterlacchini, S.; Frigerio, S.; Giacomelli, P.; Brambilla, M.

    2007-11-01

    This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potentially vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis. A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated at €8,913,000; indirect damage, by contrast, ranged considerably…
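
    In monetary terms, the cause-effect bookkeeping described above reduces, per element at risk, to expected annual loss = occurrence probability x vulnerability x value, with indirect damage added on top. All figures below are invented; the study's point is that the indirect component can rival the direct one:

        # Expected-annual-loss sketch with an indirect-damage multiplier.
        elements = [
            # (name, annual event probability, vulnerability 0..1, value EUR)
            ("road section", 0.02,  0.7, 1_500_000),
            ("hotel",        0.005, 0.4, 6_000_000),
            ("ski lift",     0.01,  0.5, 2_500_000),
        ]
        indirect_factor = 1.0   # indirect damage assumed comparable to direct

        for name, p, v, w in elements:
            direct = p * v * w
            total = direct * (1 + indirect_factor)
            print(f"{name:12s} expected annual loss ~ {total:,.0f} EUR")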

  8. Applying thiouracil (TU)-tagging for mouse transcriptome analysis

    PubMed Central

    Gay, Leslie; Karfilis, Kate V.; Miller, Michael R.; Doe, Chris Q.; Stankunas, Kryn

    2014-01-01

    Transcriptional profiling is a powerful approach to study mouse development, physiology, and disease models. Here, we describe a protocol for mouse thiouracil-tagging (TU-tagging), a transcriptome analysis technology that includes in vivo covalent labeling, purification, and analysis of cell type-specific RNA. TU-tagging enables 1) the isolation of RNA from a given cell population of a complex tissue, avoiding transcriptional changes induced by cell isolation trauma, and 2) the identification of actively transcribed RNAs and not pre-existing transcripts. Therefore, in contrast to other cell-specific transcriptional profiling methods based on purification of tagged ribosomes or nuclei, TU-tagging provides a direct examination of transcriptional regulation. We describe how to: 1) deliver 4-thiouracil to transgenic mice to thio-label cell lineage-specific transcripts, 2) purify TU-tagged RNA and prepare libraries for Illumina sequencing, and 3) follow a straightforward bioinformatics workflow to identify cell type-enriched or differentially expressed genes. Tissue containing TU-tagged RNA can be obtained in one day, RNA-Seq libraries generated within two days, and, following sequencing, an initial bioinformatics analysis completed in one additional day. PMID:24457332

  9. Big Data Usage Patterns in the Health Care Domain: A Use Case Driven Approach Applied to the Assessment of Vaccination Benefits and Risks

    PubMed Central

    Liyanage, H.; Liaw, S-T.; Kuziemsky, C.; Mold, F.; Krause, P.; Fleming, D.; Jones, S.

    2014-01-01

    Summary Background Generally benefits and risks of vaccines can be determined from studies carried out as part of regulatory compliance, followed by surveillance of routine data; however there are some rarer and more long term events that require new methods. Big data generated by increasingly affordable personalised computing, and from pervasive computing devices is rapidly growing and low cost, high volume, cloud computing makes the processing of these data inexpensive. Objective To describe how big data and related analytical methods might be applied to assess the benefits and risks of vaccines. Method We reviewed the literature on the use of big data to improve health, applied to generic vaccine use cases that illustrate the benefits and risks of vaccination. We defined a use case as the interaction between a user and an information system to achieve a goal. We used flu vaccination and pre-school childhood immunisation as exemplars. Results We reviewed three big data use cases relevant to assessing vaccine benefits and risks: (i) Big data processing using crowd-sourcing, distributed big data processing, and predictive analytics, (ii) Data integration from heterogeneous big data sources, e.g. the increasing range of devices in the “internet of things”, and (iii) Real-time monitoring for the direct monitoring of epidemics as well as vaccine effects via social media and other data sources. Conclusions Big data raises new ethical dilemmas, though its analysis methods can bring complementary real-time capabilities for monitoring epidemics and assessing vaccine benefit-risk balance. PMID:25123718

  10. Applying a Forensic Actuarial Assessment (the Violence Risk Appraisal Guide) to Nonforensic Patients

    ERIC Educational Resources Information Center

    Harris, Grant T.; Rice, Marnie E.; Camilleri, Joseph A..

    2004-01-01

    The actuarial Violence Risk Appraisal Guide (VRAG) was developed for male offenders where it has shown excellent replicability in many new forensic samples using officially recorded outcomes. Clinicians also make decisions, however, about the risk of interpersonal violence posed by nonforensic psychiatric patients of both sexes. Could an actuarial…

  11. RADON EXPOSURE ASSESSMENT AND DOSIMETRY APPLIED TO EPIDEMIOLOGY AND RISK ESTIMATION

    EPA Science Inventory

    Epidemiological studies of underground miners provide the primary basis for radon risk estimates for indoor exposures as well as mine exposures. A major source of uncertainty in these risk estimates is the uncertainty in radon progeny exposure estimates for the miners. In addit...

  12. Applying a Generic Juvenile Risk Assessment Instrument to a Local Context: Some Practical and Theoretical Lessons

    ERIC Educational Resources Information Center

    Miller, Joel; Lin, Jeffrey

    2007-01-01

    This article examines issues raised by the application of a generic actuarial juvenile risk instrument (the Model Risk Assessment Instrument) to New York City, a context different from the one in which it was developed. It describes practical challenges arising from the constraints of locally available data and local sensibilities and highlights…

  13. Applying Decision-Making Approaches to Health Risk-Taking Behaviors: Progress and Remaining Challenges.

    PubMed

    Cho; Keller; Cooper

    1999-06-01

    This paper critically examines how risk-taking behaviors can be modeled from a decision-making perspective. We first review several applications of a decision perspective to the study of risk-taking behaviors, including studies that investigate consequence generation and the components of the overall utility (i.e., consequence, desirability, and likelihood) of risk-taking and studies that investigate the validity of two decision-oriented models (subjective expected utility and the theory of reasoned action) in predicting risk-taking behaviors. We then discuss challenges in modeling risk-taking behaviors from a decision-making perspective. These challenges include (i) finding the factors that are necessary to improve the predictability of models, (ii) difficulties in eliciting the individual components of overall utility, and (iii) incorporating overall utility changes over time. Copyright 1999 Academic Press. PMID:10366518
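
    The subjective expected utility model named above, in miniature: SEU = the sum over anticipated consequences of probability x desirability, compared across the choice options. Probabilities and utilities below are illustrative:

        # Subjective expected utility for one risk-taking option.
        consequences = {
            # consequence: (subjective probability, desirability/utility)
            "social approval": (0.6,  +2.0),
            "injury":          (0.05, -8.0),
            "fine":            (0.1,  -3.0),
        }

        seu = sum(p * u for p, u in consequences.values())
        print(f"SEU of taking the risk = {seu:+.2f}")
        # decision rule: take the risk if this exceeds SEU of abstaining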

  14. IT-OSRA: applying ensemble simulations to estimate the oil spill risk associated to operational and accidental oil spills

    NASA Astrophysics Data System (ADS)

    Sepp Neves, Antonio Augusto; Pinardi, Nadia; Martins, Flavio

    2016-08-01

    Oil Spill Risk Assessments (OSRAs) are widely employed to support decision making regarding oil spill risks. This article adapts the ISO-compliant OSRA framework developed by Sepp Neves et al. (J Environ Manag 159:158-168, 2015) to estimate risks in a complex scenario where uncertainties exist about the meteo-oceanographic conditions and about where and how a spill could happen, and where the risk computation methodology (ensemble oil spill modelling) is not yet well established. The improved method was applied to the Algarve coast, Portugal. Over 50,000 simulations were performed in 2 ensemble experiments to estimate the risks due to operational and accidental spill scenarios associated with maritime traffic. The level of risk was found to be important for both types of scenarios, with significant seasonal variations due to the variability of currents and waves. Higher-frequency variability in the meteo-oceanographic variables was also found to contribute to the level of risk. The ensemble results show that the distribution of oil concentrations found on the coast is not Gaussian, opening up new fields of research on how to deal with oil spill risks and related uncertainties.
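
    With ensembles of this size, risk statistics come from the empirical distribution of member outcomes rather than from Gaussian assumptions. A sketch, with invented lognormal outcomes standing in for simulated beached-oil concentrations:

        # Empirical per-segment risk percentiles over ensemble members.
        import numpy as np

        rng = np.random.default_rng(3)
        # invented: 5000 ensemble members x 20 coastal segments, skewed outcomes
        beached = rng.lognormal(mean=0.0, sigma=1.5, size=(5000, 20))

        p50, p95 = np.percentile(beached, [50, 95], axis=0)
        print("segment 0: median =", round(p50[0], 2), " 95th =", round(p95[0], 2))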

  16. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    SciTech Connect

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of "soft" information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset "attractiveness" to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  17. Applying temporal network analysis to the venture capital market

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.
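
    A minimal temporal-network reading of co-investment data: build one syndication graph per yearly window and track how centralities evolve. The deals below are invented; real inputs would be VC investment records:

        # Yearly syndication snapshots and degree centrality with networkx.
        import networkx as nx

        deals = [  # (year, investor A, investor B) co-investing in one company
            (2010, "IDG", "Sequoia"), (2010, "IDG", "Qiming"),
            (2011, "Sequoia", "Qiming"), (2012, "IDG", "Sequoia"),
            (2012, "Hillhouse", "Sequoia"),
        ]

        for year in sorted({d[0] for d in deals}):
            g = nx.Graph([(a, b) for y, a, b in deals if y == year])
            cent = nx.degree_centrality(g)
            print(year, {k: round(v, 2) for k, v in sorted(cent.items())})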

  18. Multi-criteria decision analysis with probabilistic risk assessment for the management of contaminated ground water

    SciTech Connect

    Khadam, Ibrahim M.; Kaluarachchi, Jagath J

    2003-10-01

    Traditionally, environmental decision analysis in subsurface contamination scenarios is performed using cost-benefit analysis. In this paper, we discuss some of the limitations associated with cost-benefit analysis, especially its definition of risk, its definition of cost of risk, and its poor ability to communicate risk-related information. This paper presents an integrated approach for management of contaminated ground water resources using health risk assessment and economic analysis through a multi-criteria decision analysis framework. The methodology introduces several important concepts and definitions in decision analysis related to subsurface contamination. These are the trade-off between population risk and individual risk, the trade-off between the residual risk and the cost of risk reduction, and cost-effectiveness as a justification for remediation. The proposed decision analysis framework integrates probabilistic health risk assessment into a comprehensive, yet simple, cost-based multi-criteria decision analysis framework. The methodology focuses on developing decision criteria that provide insight into the common questions of the decision-maker that involve a number of remedial alternatives. The paper then explores three potential approaches for alternative ranking, a structured explicit decision analysis, a heuristic approach of importance of the order of criteria, and a fuzzy logic approach based on fuzzy dominance and similarity analysis. Using formal alternative ranking procedures, the methodology seeks to present a structured decision analysis framework that can be applied consistently across many different and complex remediation settings. A simple numerical example is presented to demonstrate the proposed methodology. The results showed the importance of using an integrated approach for decision-making considering both costs and risks. Future work should focus on the application of the methodology to a variety of complex field conditions to…
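
    The "structured explicit" ranking option in its simplest form is a weighted sum over normalized criteria for each remedial alternative. The alternatives, criteria, and weights below are invented for illustration; the paper's fuzzy-dominance variant is not reproduced:

        # Weighted-sum MCDA ranking over lower-is-better criteria.
        import numpy as np

        alternatives = ["no action", "pump & treat", "containment"]
        # columns: cost (M$), residual individual risk, residual population risk
        scores = np.array([
            [0.0, 1e-4, 12.0],
            [8.0, 1e-6, 0.5],
            [3.0, 1e-5, 2.0],
        ])
        weights = np.array([0.4, 0.3, 0.3])

        norm = scores / scores.max(axis=0)    # scale each criterion to [0, 1]
        ranking = norm @ weights
        for name, r in sorted(zip(alternatives, ranking), key=lambda x: x[1]):
            print(f"{name:14s} weighted burden = {r:.3f}")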

  19. Multivariate calibration applied to the quantitative analysis of infrared spectra

    NASA Astrophysics Data System (ADS)

    Haaland, David M.

    1992-03-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in- situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the noninvasive determination of glucose levels in diabetics is an ultimate goal of this research.
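
    A compact PLS calibration in the spirit described, using synthetic "spectra" built as concentration-weighted absorption bands plus noise; scikit-learn's PLSRegression stands in for the factor-analysis machinery:

        # PLS calibration on synthetic single-band "spectra".
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(4)
        wavelengths = np.linspace(0, 1, 200)
        band = np.exp(-((wavelengths - 0.5) / 0.05) ** 2)  # one absorption band
        conc = rng.uniform(0, 1, 40)                       # training concentrations
        spectra = conc[:, None] * band + rng.normal(0, 0.01, (40, 200))

        pls = PLSRegression(n_components=3).fit(spectra, conc)
        test = 0.37 * band + rng.normal(0, 0.01, 200)
        pred = float(pls.predict(test[None, :]).ravel()[0])
        print("predicted concentration:", round(pred, 3))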

  20. Operational modal analysis applied to the concert harp

    NASA Astrophysics Data System (ADS)

    Chomette, B.; Le Carrou, J.-L.

    2015-05-01

    Operational modal analysis (OMA) methods are useful to extract modal parameters of operating systems. These methods seem to be particularly interesting to investigate the modal basis of string instruments during operation to avoid certain disadvantages due to conventional methods. However, the excitation in the case of string instruments is not optimal for OMA due to the presence of damped harmonic components and low noise in the disturbance signal. Therefore, the present study investigates the least-square complex exponential (LSCE) and the modified least-square complex exponential methods in the case of a string instrument to identify modal parameters of the instrument when it is played. The efficiency of the approach is experimentally demonstrated on a concert harp excited by some of its strings and the two methods are compared to a conventional modal analysis. The results show that OMA allows us to identify modes particularly present in the instrument's response with a good estimation especially if they are close to the excitation frequency with the modified LSCE method.
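
    A compact Prony-style linear-prediction fit, the core idea behind LSCE: fit autoregressive coefficients to a free decay, then read frequencies and damping off the roots of the characteristic polynomial. The two-mode "measurement" is synthetic, and the real modified-LSCE method adds refinements not shown here:

        # Prony/LSCE-style modal identification of a synthetic two-mode decay.
        import numpy as np

        fs = 1000.0
        t = np.arange(0, 1, 1 / fs)
        x = (np.exp(-3 * t) * np.sin(2 * np.pi * 50 * t)
             + 0.5 * np.exp(-5 * t) * np.sin(2 * np.pi * 120 * t))

        order = 4                                  # 2 modes -> 4 exponentials
        A = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
        b = -x[order:]
        coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
        roots = np.roots(np.concatenate(([1.0], coeffs[::-1])))

        s = np.log(roots) * fs                     # continuous-time poles
        freqs = np.abs(s.imag) / (2 * np.pi)
        damp = -s.real / np.abs(s)
        print("frequencies (Hz):", np.round(np.sort(freqs)[::2], 1))
        print("damping ratios:  ", np.round(np.sort(damp)[::2], 4))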

  1. Dynamical systems analysis applied to working memory data.

    PubMed

    Gasimova, Fidan; Robitzsch, Alexander; Wilhelm, Oliver; Boker, Steven M; Hu, Yueqin; Hülür, Gizem

    2014-01-01

    In the present paper we investigate weekly fluctuations in working memory capacity (WMC) assessed over a period of 2 years. We use dynamical systems analysis, specifically a second-order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. To deal with missing data in our longitudinal records, we use a B-spline imputation method. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement at the memory updating (MU) task are associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions. PMID:25071657
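
    A rough sketch of the core estimation step (a simplification; the paper additionally uses time-delay embedding, B-spline imputation, and multilevel extensions): simulate the second-order model x'' = eta*x + zeta*x', approximate the derivatives numerically, and regress x'' on x and x' to recover the frequency and damping parameters. All values below are illustrative.

        import numpy as np

        dt = 1.0                                  # one week between assessments
        n_weeks = 100
        eta_true, zeta_true = -0.25, -0.05        # oscillation with mild damping

        # Simulate with semi-implicit Euler (stable for oscillators).
        x = np.empty(n_weeks)
        x[0], v = 1.0, 0.0
        for i in range(1, n_weeks):
            v += (eta_true * x[i - 1] + zeta_true * v) * dt
            x[i] = x[i - 1] + v * dt

        dx = np.gradient(x, dt)                   # first-derivative estimate
        d2x = np.gradient(dx, dt)                 # second-derivative estimate
        eta_hat, zeta_hat = np.linalg.lstsq(
            np.column_stack([x, dx]), d2x, rcond=None)[0]
        print(eta_hat, zeta_hat)                  # negative eta -> cyclic pattern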

  2. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.

  3. Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis

    PubMed Central

    Hancu, Gabriel; Simon, Brigitta; Rusu, Aura; Mircia, Eleonora; Gyéresi, Árpád

    2013-01-01

    Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques have rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution at concentrations above their critical micellar concentration; the micelles that form undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis. PMID:24312804

  4. Risk analysis of a gas-processing complex in India.

    PubMed

    Garg, R K; Khan, A A

    1991-09-01

    ONGC's Hazira Gas-Processing Complex (HGPC) consists of facilities for receiving natural gas along with associated condensate from an offshore field at a rate of 20 MMNm3 (million normal cubic metres) per day. After separating the condensate, which is processed in condensate fractionation units, the gas is processed through various steps to recover LPG and to reduce its dew point to less than 5 degrees C in order to make it suitable for transportation over long distances. The acid gas recovered during the gas-sweetening step is processed to obtain sulphur. The major products manufactured at HGPC therefore are lean sweet gas, LPG, NGL, and sulphur. The Oil and Natural Gas Commission awarded the assignment on Hazard Study and Risk Analysis of their Hazira Gas-Processing Complex (HGPC) to the Council of Scientific and Industrial Research (CSIR) in association with the Netherlands Organisation for Applied Scientific Research (TNO). The scope of this assignment covered a number of closely related and fully defined activities normally encountered in this type of work. Identification of hazards through the most appropriate methods, assigning frequencies of occurrence of major unwanted incidents, quantification and assessment of probable damage to plant equipment, environment, and human and animal life due to an unexpected event, and evaluation of various methods for reducing risk together constituted the methodology for this assignment. Detailed recommendations aimed at reducing risk and enhancing the reliability of plant and machinery were made. This article gives an overview of the assignment. PMID:1947347

  5. Analysis of Radiation Pneumonitis Risk Using a Generalized Lyman Model

    SciTech Connect

    Tucker, Susan L. Liu, H. Helen; Liao Zhongxing; Wei Xiong; Wang Shulian; Jin Hekun; Komaki, Ritsuko; Martel, Mary K.; Mohan, Radhe

    2008-10-01

    Purpose: To introduce a version of the Lyman normal-tissue complication probability (NTCP) model adapted to incorporate censored time-to-toxicity data and clinical risk factors and to apply the generalized model to analysis of radiation pneumonitis (RP) risk. Methods and Materials: Medical records and radiation treatment plans were reviewed retrospectively for 576 patients with non-small cell lung cancer treated with radiotherapy. The time to severe (Grade ≥3) RP was computed, with event times censored at last follow-up for patients not experiencing this endpoint. The censored time-to-toxicity data were analyzed using the standard and generalized Lyman models with patient smoking status taken into account. Results: The generalized Lyman model with patient smoking status taken into account produced NTCP estimates up to 27 percentage points different from the model based on dose-volume factors alone. The generalized model also predicted that 8% of the expected cases of severe RP were unobserved because of censoring. The estimated volume parameter for lung was not significantly different from n = 1, corresponding to mean lung dose. Conclusions: NTCP models historically have been based solely on dose-volume effects and binary (yes/no) toxicity data. Our results demonstrate that inclusion of nondosimetric risk factors and censored time-to-event data can markedly affect outcome predictions made using NTCP models.
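
    For reference, the sketch below implements the standard Lyman NTCP calculation that the paper generalizes (without the censoring or smoking-status extensions); with volume parameter n = 1 the generalized EUD reduces to mean lung dose. Parameter values are illustrative, not those fitted in the study.

        import math
        import numpy as np

        def lyman_ntcp(doses_gy, volumes, n=1.0, td50=30.0, m=0.35):
            """NTCP = Phi((gEUD - TD50) / (m * TD50)) for a dose-volume histogram."""
            v = np.asarray(volumes, dtype=float)
            v /= v.sum()                                  # fractional volumes
            geud = float(np.sum(v * np.asarray(doses_gy) ** (1.0 / n)) ** n)
            t = (geud - td50) / (m * td50)
            return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # standard normal CDF

        # Toy DVH: equal thirds of lung at 5, 15, and 25 Gy (mean dose 15 Gy).
        print(f"NTCP = {lyman_ntcp([5.0, 15.0, 25.0], [1, 1, 1]):.3f}")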

  6. Applying Costs, Risks and Values Evaluation (CRAVE) methodology to Engineering Support Request (ESR) prioritization

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla N.

    1994-01-01

    Given a limited budget, the problem of prioritization among Engineering Support Requests (ESRs) with varied sizes, shapes, and colors is a difficult one. At the Kennedy Space Center (KSC), the recently developed 4-Matrix (4-M) method represents a step in the right direction as it attempts to combine the traditional criteria of technical merits only with the new concern for cost-effectiveness. However, the 4-M method was not adequately successful in the actual prioritization of ESRs for the fiscal year 1995 (FY95). This research identifies a number of design issues that should help us to develop better methods. It emphasizes that, given the variety and diversity of ESRs, one should not expect a single method to help in the assessment of all ESRs. One conclusion is that a methodology such as Costs, Risks, and Values Evaluation (CRAVE) should be adopted. It is also clear that the development of methods such as 4-M requires input not only from engineers with technical expertise in ESRs but also from personnel with an adequate background in the theory and practice of cost-effectiveness analysis. At KSC, ESR prioritization is one part of the Ground Support Working Teams (GSWT) Integration Process. It was discovered that the more important barriers to the incorporation of cost-effectiveness considerations in ESR prioritization lie in this process. The culture of integration, and the corresponding structure of review by a committee of peers, is not conducive to the analysis and confrontation necessary in the assessment and prioritization of ESRs. Without assistance from appropriately trained analysts charged with the responsibility to analyze and be confrontational about each ESR, the GSWT steering committee will continue to make its decisions based on incomplete understanding, inconsistent numbers, and, at times, colored facts. The current organizational separation of the prioritization and the funding processes is also identified as an important barrier to the

  7. Sensitivity and uncertainty analysis applied to the JHR reactivity prediction

    SciTech Connect

    Leray, O.; Vaglio-Gaudard, C.; Hudelot, J. P.; Santamarina, A.; Noguere, G.; Di-Salvo, J.

    2012-07-01

    The on-going AMMON program in the EOLE reactor at CEA Cadarache (France) provides experimental results to qualify the HORUS-3D/N neutronics calculation scheme used for the design and safety studies of the new Material Testing Jules Horowitz Reactor (JHR). This paper presents the determination of technological and nuclear data uncertainties on the core reactivity and the propagation of the latter from the AMMON experiment to JHR. The technological uncertainty propagation was performed with a direct perturbation methodology using the 3D French stochastic code TRIPOLI4 and a statistical methodology using the 2D French deterministic code APOLLO2-MOC, which leads to a value of 289 pcm (1σ). The nuclear data uncertainty propagation relies on a sensitivity study on the main isotopes and the use of a retroactive marginalization method applied to the JEFF 3.1.1 ²⁷Al evaluation in order to obtain a realistic multi-group covariance matrix associated with the considered evaluation. This nuclear data uncertainty propagation leads to a keff uncertainty of 624 pcm for the JHR core and 684 pcm for the AMMON reference configuration core. Finally, transposition and reduction of the prior uncertainty were made using the representativity method, which demonstrates the similarity of the AMMON experiment with JHR (the representativity factor is 0.95). The final impact of JEFF 3.1.1 nuclear data on the Begin Of Life (BOL) JHR reactivity calculated by HORUS-3D/N V4.0 is a bias of +216 pcm with an associated posterior uncertainty of 304 pcm (1σ). (authors)
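
    The two ingredients mentioned above can be sketched with made-up numbers: the "sandwich rule" propagates a nuclear-data covariance matrix through sensitivity vectors, and the representativity factor measures how similar the experiment (AMMON) is to the target system (JHR). The toy two-group covariance matrix and sensitivities below are not the real data.

        import numpy as np

        M = np.array([[4.0, 1.0],        # toy 2-group nuclear-data covariance
                      [1.0, 9.0]])
        S_jhr = np.array([0.8, 0.5])     # sensitivities of JHR k-eff
        S_ammon = np.array([0.7, 0.6])   # sensitivities of the AMMON core

        def sandwich(s, m):
            """Variance of k-eff induced by nuclear data: s^T M s."""
            return float(s @ m @ s)

        var_jhr, var_ammon = sandwich(S_jhr, M), sandwich(S_ammon, M)
        r = (S_ammon @ M @ S_jhr) / np.sqrt(var_ammon * var_jhr)
        print(np.sqrt(var_jhr), np.sqrt(var_ammon), r)  # r near 1 = representative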

  8. Differential Network Analysis Applied to Preoperative Breast Cancer Chemotherapy Response

    PubMed Central

    Warsow, Gregor; Struckmann, Stephan; Kerkhoff, Claus; Reimer, Toralf; Engel, Nadja; Fuellen, Georg

    2013-01-01

    In silico approaches are increasingly considered to improve breast cancer treatment. One of these treatments, neoadjuvant TFAC chemotherapy, is used in cases where application of preoperative systemic therapy is indicated. Estimating response to treatment allows or improves clinical decision-making and this, in turn, may be based on a good understanding of the underlying molecular mechanisms. Ever increasing amounts of high throughput data become available for integration into functional networks. In this study, we applied our software tool ExprEssence to identify specific mechanisms relevant for TFAC therapy response, from a gene/protein interaction network. We contrasted the resulting active subnetwork to the subnetworks of two other such methods, OptDis and KeyPathwayMiner. We could show that the ExprEssence subnetwork is more related to the mechanistic functional principles of TFAC therapy than the subnetworks of the other two methods despite the simplicity of ExprEssence. We were able to validate our method by recovering known mechanisms and as an application example of our method, we identified a mechanism that may further explain the synergism between paclitaxel and doxorubicin in TFAC treatment: Paclitaxel may attenuate MELK gene expression, resulting in lower levels of its target MYBL2, already associated with doxorubicin synergism in hepatocellular carcinoma cell lines. We tested our hypothesis in three breast cancer cell lines, confirming it in part. In particular, the predicted effect on MYBL2 could be validated, and a synergistic effect of paclitaxel and doxorubicin could be demonstrated in the breast cancer cell lines SKBR3 and MCF-7. PMID:24349128

  9. Differential network analysis applied to preoperative breast cancer chemotherapy response.

    PubMed

    Warsow, Gregor; Struckmann, Stephan; Kerkhoff, Claus; Reimer, Toralf; Engel, Nadja; Fuellen, Georg

    2013-01-01

    In silico approaches are increasingly considered to improve breast cancer treatment. One of these treatments, neoadjuvant TFAC chemotherapy, is used in cases where application of preoperative systemic therapy is indicated. Estimating response to treatment allows or improves clinical decision-making and this, in turn, may be based on a good understanding of the underlying molecular mechanisms. Ever increasing amounts of high throughput data become available for integration into functional networks. In this study, we applied our software tool ExprEssence to identify specific mechanisms relevant for TFAC therapy response, from a gene/protein interaction network. We contrasted the resulting active subnetwork to the subnetworks of two other such methods, OptDis and KeyPathwayMiner. We could show that the ExprEssence subnetwork is more related to the mechanistic functional principles of TFAC therapy than the subnetworks of the other two methods despite the simplicity of ExprEssence. We were able to validate our method by recovering known mechanisms and as an application example of our method, we identified a mechanism that may further explain the synergism between paclitaxel and doxorubicin in TFAC treatment: Paclitaxel may attenuate MELK gene expression, resulting in lower levels of its target MYBL2, already associated with doxorubicin synergism in hepatocellular carcinoma cell lines. We tested our hypothesis in three breast cancer cell lines, confirming it in part. In particular, the predicted effect on MYBL2 could be validated, and a synergistic effect of paclitaxel and doxorubicin could be demonstrated in the breast cancer cell lines SKBR3 and MCF-7. PMID:24349128

  10. Relative risk analysis of the use of radiation-emitting medical devices: A preliminary application

    SciTech Connect

    Jones, E.D.

    1996-06-01

    This report describes the development of a risk analysis approach for evaluating the use of radiation-emitting medical devices. This effort was performed by Lawrence Livermore National Laboratory for the US Nuclear Regulatory Commission (NRC). The assessment approach has been applied to understand the risks in using the Gamma Knife, a gamma irradiation therapy device. This effort represents an initial step to evaluate the potential role of risk analysis for developing regulations and quality assurance requirements in the use of nuclear medical devices. The risk approach identifies and assesses the most likely risk contributors and their relative importance for the medical system. The approach uses expert screening techniques and relative risk profiling to incorporate the type, quality, and quantity of data available and to present results in an easily understood form.

  11. School attendance, health-risk behaviors, and self-esteem in adolescents applying for working papers.

    PubMed Central

    Suss, A. L.; Tinkelman, B. K.; Freeman, K.; Friedman, S. B.

    1996-01-01

    Since health-risk behaviors are often encountered in clusters among adolescents, it was hypothesized that adolescents with poor school attendance would report more health-risk behaviors (e.g., substance use, violence) than those who attend school regularly. This study assessed the relationship between poor school attendance and health-risk behaviors, and described health-risk behaviors and self-esteem among adolescents seeking employment. In this cross-sectional study, school attendance (poor vs. regular attendance) was related to health-risk behaviors by asking 122 subjects seen at a New York City Working Papers Clinic to complete both a 72-item questionnaire about their health-risk behaviors and the 58-item Coopersmith Self-Esteem School Form Inventory. Chi-square and Fisher's exact tests were performed. The poor and regular attenders of school differed significantly in only 5 out of 44 items pertaining to health-risk behaviors. Self-esteem measures for the two groups did not differ from one another or from national norms. In this sample, depression "in general" (global) and "at home," but not "at school," was associated significantly with suicidal thoughts/attempts and serious past life events (e.g., family conflict, sexual abuse). There were no significant associations between depression or self-esteem and illicit substance or alcohol use. We found few associations between poor school attendance and health-risk behaviors in this sample of employment-seeking adolescents. The poor and regular attenders of school were similar in most aspects of their health-risk behaviors and self-esteem. PMID:8982520

  12. Applying the Analytic Hierarchy Process to Oil Sands Environmental Compliance Risk Management

    NASA Astrophysics Data System (ADS)

    Roux, Izak Johannes, III

    Oil companies in Alberta, Canada, invested $32 billion in new oil sands projects in 2013. Despite the size of this investment, there is a demonstrable deficiency in the uniformity and understanding of environmental legislation requirements that manifests in increased project compliance risks. This descriptive study developed 2 prioritized lists of environmental regulatory compliance risks and mitigation strategies and used multi-criteria decision theory for its theoretical framework. Information from compiled lists of environmental compliance risks and mitigation strategies was used to generate a specialized pairwise survey, which was piloted by 5 subject matter experts (SMEs). The survey was validated by a sample of 16 SMEs, after which the Analytic Hierarchy Process (AHP) was used to rank a total of 33 compliance risks and 12 mitigation strategy criteria. A key finding was that the AHP is a suitable tool for ranking compliance risks and mitigation strategies. Several working hypotheses were also tested regarding how SMEs prioritized 1 compliance risk or mitigation strategy compared to another. The AHP showed that regulatory compliance, company reputation, environmental compliance, and economics ranked the highest and that a multi-criteria mitigation strategy for environmental compliance ranked the highest. The study results will inform Alberta oil sands industry leaders about the ranking and utility of specific compliance risks and mitigation strategies, enabling them to focus on actions that will generate legislative and public trust. Oil sands leaders implementing a risk management program using the risks and mitigation strategies identified in this study will contribute to environmental conservation, economic growth, and positive social change.
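
    The core AHP step can be sketched as follows: derive priority weights from a pairwise comparison matrix via its principal eigenvector and check consistency. The 3x3 matrix below is illustrative only, not the survey data from the study.

        import numpy as np

        # A[i, j] = judged importance of criterion i relative to criterion j
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                 # priority vector

        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)     # consistency index
        cr = ci / 0.58                           # Saaty's random index for n = 3
        print(weights, f"CR = {cr:.3f}")         # CR < 0.1 -> acceptably consistent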

  13. Naming, the formation of stimulus classes, and applied behavior analysis.

    PubMed Central

    Stromer, R; Mackay, H A; Remington, B

    1996-01-01

    The methods used in Sidman's original studies on equivalence classes provide a framework for analyzing functional verbal behavior. Sidman and others have shown how teaching receptive, name-referent matching may produce rudimentary oral reading and word comprehension skills. Eikeseth and Smith (1992) have extended these findings by showing that children with autism may acquire equivalence classes after learning to supply a common oral name to each stimulus in a potential class. A stimulus class analysis suggests ways to examine (a) the problem of programming generalization from teaching situations to other environments, (b) the expansion of the repertoires that occur in those settings, and (c) the use of naming to facilitate these forms of generalization. Such research will help to clarify and extend Horne and Lowe's recent (1996) account of the role of verbal behavior in the formation of stimulus classes. PMID:8810064

  14. Detailed analysis of POD method applied on turbulent flow

    NASA Astrophysics Data System (ADS)

    Kellnerova, Radka; Kukacka, Libor; Uruba, Vaclav; Jurcakova, Klara; Janour, Zbynek

    2012-04-01

    Proper orthogonal decomposition (POD) of a very turbulent flow inside a street canyon is performed. The energy contribution of each mode is obtained, and the physical meaning of the POD results is clarified: particular POD modes are assigned to particular flow events, such as a sweep event, a vortex behind a roof, or a vortex at the bottom of the street. The sensitivity of the POD to the acquisition time of the data records is tested, as is the effect of decreasing the sampling frequency. Further, the POD expansion coefficients are interpolated in order to test a possible increase in sampling frequency and to extract new information about the flow from the POD analysis. Both linear and spline interpolation were tested, and the linear type produced slightly better results.
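
    A minimal snapshot-POD sketch (random stand-in data, not the street-canyon measurements): stack the snapshots as columns, subtract the temporal mean, and read spatial modes, expansion coefficients, and energy fractions from an SVD.

        import numpy as np

        rng = np.random.default_rng(1)
        n_points, n_snapshots = 500, 200
        snapshots = rng.standard_normal((n_points, n_snapshots))  # stand-in field

        fluct = snapshots - snapshots.mean(axis=1, keepdims=True)
        modes, sing_vals, vt = np.linalg.svd(fluct, full_matrices=False)

        energy = sing_vals**2 / np.sum(sing_vals**2)   # energy fraction per mode
        print("first 5 mode energies:", np.round(energy[:5], 4))
        # Spatial mode k is modes[:, k]; its time-dependent expansion
        # coefficients are sing_vals[k] * vt[k, :].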

  15. Biophotogrammetry model of respiratory motion analysis applied to children.

    PubMed

    Ripka, W L; Ricieri, D da V; Ulbricht, L; Neves, E B; Stadnik, A M W; Romaneli, E F R

    2012-01-01

    This study aimed to test a protocol of measurements based on Biophotogrammetry to Analysis of Respiratory Mechanics (BARM) in healthy children. Seventeen children with normal spirometry (six male and 11 female) were tested. Maneuvers of forced inspiratory vital capacity performed by the children were recorded in the supine position. The images were acquired by a digital camera placed laterally to the trunk. Surface markers allowed the image files, exported to CorelDraw® software, to be processed as irregular trapezoidal paths. Compartments were defined for the thorax (TX), the abdomen (AB), and the chest wall (CW). Areas were measured at the end of maximal inspiration and maximal expiration, controlled by a digital spirometer. The results showed that the areas measured at the inspiratory and expiratory extremes were statistically different (p<0.05), reflecting the mobility of the chest wall and its compartments. In conclusion, the proposed method can identify the breathing pattern of the measured subject using images in two dimensions (2D). PMID:23366409

  16. Painleve singularity analysis applied to charged particle dynamics during reconnection

    SciTech Connect

    Larson, J.W.

    1992-01-01

    For a plasma in the collisionless regime, test-particle modelling can lend some insight into the macroscopic behavior of the plasma, e.g. conductivity and heating. A common example for which this technique is used is a system with electric and magnetic fields given by B = δy x̂ + z ŷ + γy ẑ and E = ε ẑ, where δ, γ, and ε are constant parameters. This model can be used to model plasma behavior near neutral lines (γ = 0) as well as current sheets (γ = 0, δ = 0). The integrability properties of the particle motion in such fields might affect the plasma's macroscopic behavior, and the author has asked the question "For what values of δ, γ, and ε is the system integrable?" To answer this question, the author has employed Painlevé singularity analysis, which is an examination of the singularity properties of a test particle's equations of motion in the complex time plane. This analysis has identified two field geometries for which the system's particle dynamics are integrable in terms of the second Painlevé transcendent: the circular O-line case and the case of the neutral sheet configuration. These geometries yield particle dynamics that are integrable in the Liouville sense (i.e., there exist the proper number of integrals in involution) in an extended phase space which includes the time as a canonical coordinate, and this property is also true for nonzero γ. The singularity property tests also identified a large, dense set of X-line and O-line field geometries that yield dynamics that may possess the weak Painlevé property. In the case of the X-line geometries, this result shows little relevance to the physical nature of the system, but the existence of a dense set of elliptical O-line geometries with this property may be related to the fact that for ε positive, one can construct asymptotic solutions in the limit t → ∞.

  17. Assessing population exposure for landslide risk analysis using dasymetric cartography

    NASA Astrophysics Data System (ADS)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on an accounting of the number of inhabitants or inhabitant density, applied over statistical or administrative terrain units, such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units that are predominantly rural. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are far from ideal when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. This work therefore aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and on dasymetric cartography (population per building). The study is developed in the Region North of Lisbon using 2011 population data and follows three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute and density) for different administrative territorial units (parishes and BGRI, the basic statistical unit in the Portuguese Census); and iii) dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density differs more than two times depending on the application of the traditional approach or the dasymetric
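
    The building areal weighting step can be sketched in a few lines: each census unit's population total is redistributed to its buildings in proportion to their footprint areas. Unit assignments, areas, and populations below are made up for illustration.

        import numpy as np

        unit_of_building = np.array([0, 0, 0, 1, 1])       # census unit per building
        building_area = np.array([120.0, 300.0, 80.0, 200.0, 200.0])  # m^2
        unit_population = np.array([250.0, 90.0])          # Census totals per unit

        pop_per_building = np.empty_like(building_area)
        for u, pop in enumerate(unit_population):
            mask = unit_of_building == u
            pop_per_building[mask] = (pop * building_area[mask]
                                      / building_area[mask].sum())

        print(pop_per_building)   # sums back to each unit's Census total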

  18. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, there are three common types of methods used for the development of vulnerability functions of different elements at risk: empirical, analytical, and expert estimations. The paper addresses the empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as the statistical data on buildings' behavior during strong earthquakes presented in the different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to the seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate expected damage states to buildings and constructions in the case of earthquakes according to the OSR-97B (return period T=1,000 years), big cities and towns were divided into unit sites whose coordinates were represented as dots located in the centers of the unit sites. Then the indexes obtained for each unit site were summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: the percentages of different damage states for settlements with fewer than 1,000 inhabitants, and vulnerability for cities and towns with more than 1,000 inhabitants. The hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipeline systems located in the highly active seismic zones in

  19. Applying the emergency risk management process to tackle the crisis of antibiotic resistance

    PubMed Central

    Dominey-Howes, Dale; Bajorek, Beata; Michael, Carolyn A.; Betteridge, Brittany; Iredell, Jonathan; Labbate, Maurizio

    2015-01-01

    We advocate that antibiotic resistance be reframed as a disaster risk management problem. Antibiotic-resistant infections represent a risk to life as significant as other commonly occurring natural disasters (e.g., earthquakes). Despite efforts by global health authorities, antibiotic resistance continues to escalate. Therefore, new approaches and expertise are needed to manage the issue. In this perspective we: (1) make a call for the emergency management community to recognize the antibiotic resistance risk and join in addressing this problem; (2) suggest using the risk management process to help tackle antibiotic resistance; (3) show why this approach has value and why it is different to existing approaches; and (4) identify public perception of antibiotic resistance as an important issue that warrants exploration. PMID:26388864

  20. Statistical methods for texture analysis applied to agronomical images

    NASA Astrophysics Data System (ADS)

    Cointault, F.; Journaux, L.; Gouton, P.

    2008-02-01

    For agronomical research institutes, field experiments are essential and provide relevant information on crops, such as disease rate, yield components, and weed rate. Though generally accurate, they are done manually and present numerous drawbacks, such as tedium, notably for wheat ear counting. In this case, the use of color and/or texture image processing to estimate the number of ears per square metre can be an improvement. Different image segmentation techniques based on feature extraction have therefore been tested, using textural information with first- and higher-order statistical methods. The run-length method gives the best results, close to manual counts, with an average error of 3%. Nevertheless, a careful justification of the hypotheses made on the values of the classification and description parameters is necessary, especially for the number of classes and the size of the analysis windows, through the estimation of a cluster validity index. The first results show that the mean number of classes in a wheat image is 11, which indicates that our choice of 3 is not well adapted. To complete these results, we are currently analysing each of the classes previously extracted in order to gather together all the classes characterizing the ears.

  1. Ion Beam Analysis applied to laser-generated plasmas

    NASA Astrophysics Data System (ADS)

    Cutroneo, M.; Macková, A.; Havranek, V.; Malinsky, P.; Torrisi, L.; Kormunda, M.; Barchuk, M.; Ullschmied, J.; Dudzak, R.

    2016-04-01

    This paper presents the research activity on Ion Beam Analysis methods performed at the Tandetron Laboratory (LT) of the Institute of Nuclear Physics AS CR, Rez, Czech Republic. Recently, many groups have been paying attention to implantation by laser-generated plasma. This process makes it possible to insert a controllable amount of energetic ions into the surface layers of different materials, modifying the physical and chemical properties of the surface material. Different substrates are implanted with ions accelerated from plasma by a terawatt iodine laser, at a nominal intensity of 10¹⁵ W/cm², at the PALS Research Infrastructure AS CR, in the Czech Republic. This regime of laser-matter interaction generates multi-MeV proton beams and multi-charged ions that are tightly confined in time (hundreds of ps) and space (source radius of a few microns). These ion beams have a much lower transverse temperature, a much shorter duration, and a much higher current than those obtainable from conventional accelerators. Proton and ion acceleration driven by ultra-short high-intensity lasers is demonstrated by adopting suitable irradiation conditions as well as tailored targets. An overview of implanted targets and their morphological and structural characterizations is presented and discussed.

  2. Quantitative phase imaging applied to laser damage detection and analysis.

    PubMed

    Douti, Dam-Bé L; Chrayteh, Mhamad; Aknoun, Sherazade; Doualle, Thomas; Hecquet, Christophe; Monneret, Serge; Gallais, Laurent

    2015-10-01

    We investigate phase imaging as a measurement method for laser damage detection and analysis of laser-induced modification of optical materials. Experiments have been conducted with a wavefront sensor based on lateral shearing interferometry associated with a high-magnification optical microscope. The system has been used for the in-line observation of optical thin films and bulk samples, laser irradiated under two different conditions: 500 fs pulses at 343 and 1030 nm, and millisecond-to-second irradiation with a CO2 laser at 10.6 μm. We investigate the measurement of the laser-induced damage threshold of optical materials by detecting phase changes and show that the technique achieves high sensitivity, with optical path difference measurements below 1 nm. Additionally, the quantitative information on the refractive index or surface modification of the samples under test that is provided by the system has been compared to classical metrology instruments used for laser damage or laser ablation characterization (an atomic force microscope, a differential interference contrast microscope, and an optical surface profiler). An accurate in-line measurement of the morphology of laser-ablated sites, from a few nanometers to hundreds of microns in depth, is shown. PMID:26479612

  3. Applied and computational harmonic analysis on graphs and networks

    NASA Astrophysics Data System (ADS)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
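
    A small sketch of the graph-Fourier view discussed above, for the one case where the frequency interpretation is exact: the Laplacian eigenvectors of an unweighted path graph form a DCT-like basis, and a smooth signal's energy concentrates in the low-frequency modes. The example data are invented.

        import numpy as np

        n = 8
        # Combinatorial Laplacian L = D - W of an unweighted path graph
        W = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
        L = np.diag(W.sum(axis=1)) - W

        eigvals, eigvecs = np.linalg.eigh(L)          # eigenpairs, ascending
        signal = np.sin(np.linspace(0.0, np.pi, n))   # smooth signal on the path
        coeffs = eigvecs.T @ signal                   # "graph Fourier" coefficients
        print(np.round(eigvals, 3))
        print(np.round(coeffs, 3))   # energy sits in the low-frequency modes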

  4. Applying Logic Analysis to Genomic Data and Phylogenetic Profiles

    NASA Astrophysics Data System (ADS)

    Yeates, Todd

    2005-03-01

    One of the main goals of comparative genomics is to understand how all the various proteins in a cell relate to each other in terms of pathways and interaction networks. Various computational ideas have been explored with this goal in mind. In the original phylogenetic profile method, "functional linkages" were inferred between pairs of proteins when the two proteins, A and B, showed identical (or statistically similar) patterns of presence vs. absence across a set of completely sequenced genomes. Here we describe a new generalization, logic analysis of phylogenetic profiles (LAPP), from which higher order relationships can be identified between three (or more) different proteins. For instance, in one type of triplet logic relation -- of which there are eight distinct types -- a protein C may be present in a genome iff proteins A and B are both present (C=AB). An application of the LAPP method identifies thousands of previously unidentified relationships between protein triplets. These higher order logic relationships offer insights -- not available from pairwise approaches -- into branching, competition, and alternate routes through cellular pathways and networks. The results also make it possible to assign tentative cellular functions to many novel proteins of unknown function. Co-authors: Peter Bowers, Shawn Cokus, Morgan Beeby, and David Eisenberg
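
    A hedged sketch of the triplet-logic idea (the actual LAPP scoring uses an uncertainty measure over all eight logic types; a plain match fraction is substituted here): given binary presence/absence profiles across genomes, score how well protein C follows a logic function of proteins A and B. All profiles below are synthetic.

        import numpy as np

        rng = np.random.default_rng(2)
        n_genomes = 60
        A = rng.integers(0, 2, n_genomes)             # presence/absence profiles
        B = rng.integers(0, 2, n_genomes)
        C = A & B                                     # plant a C = A AND B triplet
        C[rng.choice(n_genomes, 3, replace=False)] ^= 1   # add a little noise

        logic_fns = {
            "AND": lambda a, b: a & b,
            "OR":  lambda a, b: a | b,
            "XOR": lambda a, b: a ^ b,
        }
        for name, fn in logic_fns.items():
            match = np.mean(fn(A, B) == C)
            print(f"C = A {name} B: match fraction {match:.2f}")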

  5. Quantitative microbial risk assessment applied to irrigation of salad crops with waste stabilization pond effluents.

    PubMed

    Pavione, D M S; Bastos, R K X; Bevilacqua, P D

    2013-01-01

    A quantitative microbial risk assessment model for estimating infection risks arising from consuming crops eaten raw that have been irrigated with effluents from stabilization ponds was constructed. A log-normal probability distribution function was fitted to a large database from a comprehensive monitoring of an experimental pond system to account for variability in Escherichia coli concentration in irrigation water. Crop contamination levels were estimated using predictive models derived from field experiments involving the irrigation of several crops with different effluent qualities. Data on daily intake of salad crops were obtained from a national survey in Brazil. Ten thousand-trial Monte Carlo simulations were used to estimate human health risks associated with the use of wastewater for irrigating low- and high-growing crops. The use of effluents containing 10³-10⁴ E. coli per 100 ml resulted in median rotavirus infection risk of approximately 10⁻³ and 10⁻⁴ pppy when irrigating, respectively, low- and high-growing crops; the corresponding 95th percentile risk estimates were around 10⁻² in both scenarios. Sensitivity analyses revealed that variations in effluent quality, in the assumed ratios of pathogens to E. coli, and in the reduction of pathogens between harvest and consumption had great impact upon risk estimates. PMID:23508143
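
    The Monte Carlo structure described above can be sketched as follows. The log-normal parameters, pathogen-to-E. coli ratio, water retention, and intake values are assumed placeholders (not the paper's fitted values); the beta-Poisson parameters are commonly cited rotavirus values.

        import numpy as np

        rng = np.random.default_rng(3)
        n_trials = 10_000

        # E. coli per 100 ml in irrigation water (assumed log-normal parameters)
        ecoli = rng.lognormal(mean=np.log(5e3), sigma=1.0, size=n_trials)
        pathogen_ratio = 1e-5      # assumed rotavirus : E. coli ratio
        water_on_crop = 1e-2       # assumed ml of water retained per g of crop
        intake_g = 20.0            # assumed daily salad intake [g]

        dose = ecoli / 100.0 * pathogen_ratio * water_on_crop * intake_g
        alpha, beta = 0.253, 0.426           # beta-Poisson, rotavirus
        p_inf = 1.0 - (1.0 + dose / beta) ** (-alpha)

        annual_risk = 1.0 - (1.0 - p_inf) ** 365
        print(f"median annual infection risk: {np.median(annual_risk):.2e}")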

  6. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach for the risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability); that is: Risk = P(D ≥ d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand); 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of the semi-empirical ones: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Two landslide properties alone are required: the area-extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions) assessed via damage surveys and cards compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
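
    In discrete form, the convolution in the formula above is just a sum over severity levels of hazard probability times fragility, as in the toy sketch below (all numbers invented, for a single limit state such as interruption of service).

        import numpy as np

        # Hazard: P(S = s) for discretized landslide intensity (kinetic energy)
        p_severity = np.array([0.70, 0.20, 0.08, 0.02])
        # Fragility: P(D >= d | S = s) for one performance limit state
        fragility = np.array([0.01, 0.15, 0.55, 0.95])

        risk = float(p_severity @ fragility)   # P(D >= d), the risk convolution
        print(f"probability of exceeding the limit state: {risk:.4f}")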

  7. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    NASA Astrophysics Data System (ADS)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data are available to improve the understanding of the spatial and temporal occurrence of the process. Since historical documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes, such as abrupt changes of the yearly increment, and anatomical changes, like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  8. Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids (Final Report)

    EPA Science Inventory

    Millions of tons of treated sewage sludges or “biosolids” are applied annually to f...

  9. Radiation Leukemogenesis: Applying Basic Science of Epidemiological Estimates of Low Dose Risks and Dose-Rate Effects

    SciTech Connect

    Hoel, D. G.

    1998-11-01

    The next stage of work has been to examine more closely the A-bomb leukemia data which provide the underpinnings of the risk estimation of CML in the above-mentioned manuscript. The paper by Hoel and Li (Health Physics 75:241-50) shows how the linear-quadratic model has basic non-linearities in the low-dose region for the leukemias, including CML. Pierce et al. (Radiation Research 123:275-84) have developed distributions for the uncertainty in the estimated exposures of the A-bomb cohort. Kellerer et al. (Radiation and Environmental Biophysics 36:73-83) have further considered possible errors in the estimated neutron values and changing RBE values with dose, and have hypothesized that the tumor response due to gamma rays may not be linear. We have incorporated Kellerer's neutron model and have constructed new A-bomb doses based on these model adjustments. The Hoel and Li dose-response analysis has also been applied using the Kellerer neutron dose adjustments for the leukemias. Finally, Pierce's dose uncertainties, the Kellerer neutron adjustments, and the varying RBE with dose suggested by Rossi and Zaider are combined and used for leukemia dose-response analysis. First, the results of Hoel and Li, showing a significantly improved fit of the linear-quadratic dose response by the inclusion of a threshold (i.e., low-dose nonlinearity), persisted. This work has been completed for both solid tumors and leukemia, for both mortality and incidence data. The results are given in the manuscript described below, which has been submitted to Health Physics.

  10. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    ERIC Educational Resources Information Center

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  11. Applied Drama and the Higher Education Learning Spaces: A Reflective Analysis

    ERIC Educational Resources Information Center

    Moyo, Cletus

    2015-01-01

    This paper explores Applied Drama as a teaching approach in Higher Education learning spaces. The exploration takes a reflective analysis approach by first examining the impact that Applied Drama has had on my career as a Lecturer/Educator/Teacher working in Higher Education environments. My engagement with Applied Drama practice and theory is…

  12. Risk analysis of sustainable urban drainage and irrigation

    NASA Astrophysics Data System (ADS)

    Ursino, Nadia

    2015-09-01

    Urbanization, by creating extended impervious areas to the detriment of vegetated ones, may have an undesirable influence on the water and energy balances of urban environments. The storage and infiltration capacity of the drainage system lessens the negative influence of urbanization, and vegetated areas help to re-establish pre-development environmental conditions. Resource limitation, a climate leading to increasing water scarcity, and demographic and socio-institutional shifts promote more integrated water management. Storm-water harvesting for landscape irrigation mitigates possible water restrictions for the urban population in drought scenarios. A new probabilistic model for sustainable rainfall drainage, storage, and re-use systems was implemented in this study. Risk analysis of multipurpose storage capacities was generalized by the use of only a few dimensionless parameters and applied to a case study in a Mediterranean-type climate, although the applicability of the model is not restricted to any particular climatic type.

  13. Beyond Time Out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2010-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  14. Beyond Time out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2012-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  15. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    NASA Astrophysics Data System (ADS)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces were developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis for estimating current risk from a social perspective and identifying tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision-making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of the developed approach. The main advantage of applying the methodology presented herein is that it provides a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. This can be of great interest for decision-makers, as it provides rational and solid information.

  16. Applying the Triad method in a risk assessment of a former surface treatment and metal industry site.

    PubMed

    Ribé, Veronica; Aulenius, Elisabet; Nehrenheim, Emma; Martell, Ulrika; Odlare, Monica

    2012-03-15

    With a greater focus on soil protection in the E.U., the need for ecological risk assessment tools for cost-effective characterization of site contamination is increasing. One of the challenges in assessing the risk of soil contaminants is to accurately account for changes in the mobility of contaminants over time as a result of ageing. Improved tools for measuring the bioavailable and mobile fraction of contaminants are therefore highly desirable. In this study the Triad method was used to perform a risk characterization of a former surface treatment and metal industry site in Eskilstuna, Sweden. The risk assessment confirmed the environmental risk of the most heavily contaminated sample and showed that the toxic effect was most likely caused by high metal concentrations. The assessment of the two soil samples with low to moderate metal contamination levels was more complex, as there was a higher deviation between the results from the three lines of evidence: chemistry, (eco)toxicology, and ecology. For the less contaminated of the two samples, a weighting of the results from the ecotoxicological LoE would be recommended in order to accurately determine the risk of the metal contamination at the sampling site, as the toxic effect detected in the Microtox® test and Ostracodtoxkit™ test was more likely due to oil contamination. The soil sample with higher total metal concentrations requires further ecotoxicological testing, as the integrated risk value indicated an environmental risk from metal contamination. The applied methodology, the Triad method, is considered appropriate for conducting improved environmental risk assessments in order to achieve sustainable remediation processes. PMID:21890272

  17. Applying predictive analytics to develop an intelligent risk detection application for healthcare contexts.

    PubMed

    Moghimi, Fatemeh Hoda; Cheung, Michael; Wickramasinghe, Nilmini

    2013-01-01

    Healthcare is an information-rich industry where successful outcomes require the processing of multi-spectral data and sound decision making. The exponential growth of data and big data issues, coupled with a rapid increase in service demands in healthcare contexts today, requires a robust framework enabled by IT (information technology) solutions as well as real-time service handling in order to ensure superior decision making and successful healthcare outcomes. Such a context is appropriate for the application of real-time intelligent risk detection decision support systems using predictive analytic techniques such as data mining. To illustrate the power and potential of data science technologies in healthcare decision-making scenarios, the use of an intelligent risk detection (IRD) model is proffered for the context of Congenital Heart Disease (CHD) in children, an area which requires complex high-risk decisions that need to be made expeditiously and accurately in order to ensure successful healthcare outcomes. PMID:23920700

  18. Risk factor detection for heart disease by applying text analytics in electronic medical records.

    PubMed

    Torii, Manabu; Fan, Jung-Wei; Yang, Wei-Li; Lee, Theodore; Wiley, Matthew T; Zisook, Daniel S; Huang, Yang

    2015-12-01

    In the United States, about 600,000 people die of heart disease every year. The annual cost of care services, medications, and lost productivity reportedly exceeds 108.9 billion dollars. Effective disease risk assessment is critical to prevention, care, and treatment planning. Recent advancements in text analytics have opened up new possibilities of using the rich information in electronic medical records (EMRs) to identify relevant risk factors. The 2014 i2b2/UTHealth Challenge brought together researchers and practitioners of clinical natural language processing (NLP) to tackle the identification of heart disease risk factors reported in EMRs. We participated in this track and developed an NLP system by leveraging existing tools and resources, both public and proprietary. Our system was a hybrid of several machine-learning and rule-based components. The system achieved an overall F1 score of 0.9185, with a recall of 0.9409 and a precision of 0.8972. PMID:26279500
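
    As a quick arithmetic check (not part of the paper), the reported overall F1 follows from the stated precision and recall, F1 being their harmonic mean:

        precision, recall = 0.8972, 0.9409
        f1 = 2 * precision * recall / (precision + recall)
        print(f"F1 = {f1:.4f}")   # 0.9185, matching the reported score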

  19. Risk assessment framework of fate and transport models applied to hazardous waste sites

    SciTech Connect

    Hwang, S.T.

    1993-06-01

    Risk assessment is an increasingly important part of the decision-making process in the cleanup of hazardous waste sites. Despite guidelines from regulatory agencies and considerable research efforts to reduce uncertainties in risk assessments, many issues remain unanswered. This paper presents new research results pertaining to fate and transport models, which will be useful in estimating exposure concentrations and will help reduce uncertainties in risk assessment. These developments include approaches for (1) estimating the degree of emissions and concentration levels of volatile pollutants during the use of contaminated water, (2) estimating absorption of organic chemicals from the soil matrix through the skin, and (3) estimating steady-state, near-field contaminant concentrations in the aquifer within a waste boundary.

  20. Applying Risk Science and Stakeholder Engagement to Overcome Environmental Barriers to Marine and Hydrokinetic Energy Projects

    SciTech Connect

    Copping, Andrea E.; Anderson, Richard M.; Van Cleve, Frances B.

    2010-09-20

    The production of electricity from the moving waters of the ocean has the potential to be a viable addition to the portfolio of renewable energy sources worldwide. The marine and hydrokinetic (MHK) industry faces many hurdles, including technology development, challenges of offshore deployments, and financing; however, the barrier most commonly identified by industry, regulators, and stakeholders is the uncertainty surrounding potential environmental effects of devices placed in the water and the permitting processes associated with real or potential impacts. Regulatory processes are not well positioned to judge the severity of harm due to turbines or wave generators. Risks from MHK devices to endangered or protected animals in coastal waters and rivers, as well as the habitats that support them, are poorly understood. This uncertainty raises concerns about catastrophic interactions between spinning turbine blades or slack mooring lines and marine mammals, birds and fish. In order to accelerate the deployment of tidal and wave devices, there is a need to sort through the extensive list of potential interactions that may cause harm to marine organisms and ecosystems, to set priorities for regulatory triggers, and to direct future research. Identifying the risk of MHK technology components on specific marine organisms and ecosystem components can separate perceived from real risk-relevant interactions. Scientists from Pacific Northwest National Laboratory (PNNL) are developing an Environmental Risk Evaluation System (ERES) to assess environmental effects associated with MHK technologies and projects through a systematic analytical process, with specific input from key stakeholder groups. The array of stakeholders interested in the development of MHK is broad, segmenting into those whose involvement is essential for the success of the MHK project, those that are influential, and those that are interested. PNNL and their partners have engaged these groups, gaining

  1. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Title 49 (Transportation), vol. 4: Other Regulations Relating to Transportation (Continued), FEDERAL RAILROAD... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are...

  2. Virtues and Limitations of Risk Analysis

    ERIC Educational Resources Information Center

    Weatherwax, Robert K.

    1975-01-01

    After summarizing the Rasmussen Report, the author reviews the probabilistic portion of the report from the perspectives of engineering utility and risk assessment uncertainty. The author shows that the report may represent both a significant step forward in the assurance of reactor safety and an imperfect measure of actual reactor risk. (BT)

  3. Risk analysis for worker exposure to benzene

    NASA Astrophysics Data System (ADS)

    Hallenbeck, William H.; Flowers, Roxanne E.

    1992-05-01

    Cancer risk factors (characterized by route, dose, dose rate per kilogram, fraction of lifetime exposed, species, and sex) were derived for workers exposed to benzene via inhalation or ingestion. Exposures at the current Occupational Safety and Health Administration (OSHA) permissible exposure limit (PEL) and at leaking underground storage tank (LUST) sites were evaluated. At the current PEL of 1 ppm, the theoretical lifetime excess risk of cancer from benzene inhalation is ten per 1000. The theoretical lifetime excess risk for worker inhalation exposure at LUST sites ranged from 10 to 40 per 1000. These results indicate that personal protection should be required. The theoretical lifetime excess risk due to soil ingestion is five to seven orders of magnitude less than the inhalation risks.

  4. Finding Factors Influencing Risk: Comparing Variable Selection Methods Applied to Logistic Regression Models of Cases and Controls

    PubMed Central

    Swartz, Michael D.; Yu, Robert K.; Shete, Sanjay

    2011-01-01

    When modeling the risk of a disease, the very act of selecting the factors to include can heavily impact the results. This study compares the performance of several variable selection techniques applied to logistic regression. We performed realistic simulation studies to compare five methods of variable selection: (1) a confidence interval approach for significant coefficients (CI), (2) backward selection, (3) forward selection, (4) stepwise selection, and (5) Bayesian stochastic search variable selection (SSVS) using both informed and uninformed priors. We defined our simulated diseases mimicking odds ratios for cancer risk found in the literature for environmental factors, such as smoking; dietary risk factors, such as fiber; genetic risk factors, such as XPD; and interactions. We modeled the distribution of our covariates, including correlation, after the reported empirical distributions of these risk factors. We also used a null data set to calibrate the priors of the Bayesian method and evaluate its sensitivity. Of the standard methods (95% CI, backward, forward, and stepwise selection), the CI approach resulted in the highest average percent of correct associations and the lowest average percent of incorrect associations. SSVS with an informed prior had a higher average percent of correct associations and a lower average percent of incorrect associations than did the CI approach. This study shows that Bayesian methods offer a way to use prior information to both increase power and decrease false-positive results when selecting factors to model complex disease risk. PMID:18937224
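
    To illustrate how such a comparison can be set up, here is a minimal sketch, assuming a simulated binary outcome driven by two of ten candidate covariates; it implements greedy forward selection by AIC with statsmodels, a simplified stand-in for the significance-based selection rules compared in the study (all data and settings are hypothetical):

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical simulation: a binary outcome driven by a few of many
    # candidate risk factors, then greedy forward selection by AIC.
    rng = np.random.default_rng(0)
    n, p = 1000, 10
    X = rng.normal(size=(n, p))                    # candidate covariates
    logit = -1.0 + 0.8 * X[:, 0] + 0.5 * X[:, 1]   # only two true effects
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    def forward_select(y, X):
        """Greedy forward selection for a logistic model, minimizing AIC."""
        selected, remaining = [], list(range(X.shape[1]))
        best_aic = sm.Logit(y, np.ones((len(y), 1))).fit(disp=0).aic
        improved = True
        while improved and remaining:
            improved = False
            aics = {j: sm.Logit(y, sm.add_constant(X[:, selected + [j]])).fit(disp=0).aic
                    for j in remaining}
            j_best = min(aics, key=aics.get)
            if aics[j_best] < best_aic:
                best_aic = aics[j_best]
                selected.append(j_best)
                remaining.remove(j_best)
                improved = True
        return selected, best_aic

    print(forward_select(y, X))   # typically recovers columns 0 and 1
    ```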

  5. From Policy to Practice: Applying Educational Change in Israeli Schools for Students at High Academic Risk

    ERIC Educational Resources Information Center

    Alfassi, Miriam

    2004-01-01

    This short report identifies the academic and environmental experiences that support the reclaiming of students at high academic risk and describes an applicable school-wide programme emanating from the call for educational change in Israel. The proposed structured academic programme demonstrates how schools can achieve broad student knowledge and…

  6. INTERPRETATION OF SPLP RESULTS FOR ASSESSING RISK TO GROUNDWATER FROM LAND-APPLIED GRANULAR WASTE

    EPA Science Inventory

    Scientists and engineers often rely on results from the synthetic precipitation leaching procedure (SPLP) to assess the risk of groundwater contamination posed by the land application of granular solid wastes. The concentrations of pollutants in SPLP leachate can be measured and ...

  7. Human health risk assessment of triclosan in land-applied biosolids.

    PubMed

    Verslycke, Tim; Mayfield, David B; Tabony, Jade A; Capdevielle, Marie; Slezak, Brian

    2016-09-01

    Triclosan (5-chloro-2-[2,4-dichlorophenoxy]-phenol) is an antimicrobial agent found in a variety of pharmaceutical and personal care products. Numerous studies have examined the occurrence and environmental fate of triclosan in wastewater, biosolids, biosolids-amended soils, and plants and organisms exposed to biosolids-amended soils. Triclosan has a propensity to adhere to organic carbon in biosolids and biosolids-amended soils. Land application of biosolids containing triclosan has the potential to contribute to multiple direct and indirect human health exposure pathways. To estimate exposures and human health risks from biosolid-borne triclosan, a risk assessment was conducted in general accordance with the methodology incorporated into the US Environmental Protection Agency's Part 503 biosolids rule. Human health exposures to biosolid-borne triclosan were estimated on the basis of published empirical data or modeled using upper-end environmental partitioning estimates. Similarly, a range of published triclosan human health toxicity values was evaluated. Margins of safety were estimated for 10 direct and indirect exposure pathways, both individually and combined. The present risk assessment found large margins of safety (>1000 to >100 000) for potential exposures to all pathways, even under the most conservative exposure and toxicity assumptions considered. The human health exposures and risks from biosolid-borne triclosan are concluded to be de minimis. Environ Toxicol Chem 2016;35:2358-2367. © 2016 SETAC. PMID:27552397
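
    At its core, each margin of safety in such an assessment is a ratio of a toxicity reference value to an estimated dose, with pathway doses added before a combined margin is computed; a minimal sketch, with all numbers hypothetical rather than taken from the study:

    ```python
    # Margin of safety (MOS) as used in screening-level risk assessments:
    # MOS = toxicity reference dose / estimated exposure dose.
    # All numbers below are hypothetical placeholders, not values from the study.

    def margin_of_safety(reference_dose_mg_kg_day, exposure_dose_mg_kg_day):
        return reference_dose_mg_kg_day / exposure_dose_mg_kg_day

    # Combined exposure across pathways: doses add before the ratio is taken.
    pathway_doses = {"soil ingestion": 1.2e-6, "dermal contact": 4.0e-7,
                     "plant uptake / diet": 2.5e-6}   # mg/kg-day, hypothetical
    reference_dose = 0.05                             # mg/kg-day, hypothetical

    combined = sum(pathway_doses.values())
    print(f"combined MOS = {margin_of_safety(reference_dose, combined):.0f}")
    ```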

  8. At-Risk Students and Virtual Enterprise: Tourism and Hospitality Simulations in Applied and Academic Learning.

    ERIC Educational Resources Information Center

    Borgese, Anthony

    This paper discusses Virtual Enterprise (VE), a technology-driven business simulation program in which students conceive, create, and operate enterprises that utilize Web-based and other technologies to trade products and services around the world. The study examined the effects of VE on a learning community of at-risk students, defined as those…

  9. Invitational Theory and Practice Applied to Resiliency Development in At-Risk Youth

    ERIC Educational Resources Information Center

    Lee, R. Scott

    2012-01-01

    Resilience development is a growing field of study within the scholarly literature regarding social emotional achievement of at-risk students. Developing resiliency is based on the assumption that positive, pro-social, and/or strength-based values inherent in children and youth should be actively and intentionally developed. The core values of…

  10. Applying a Cognitive-Behavioral Model of HIV Risk to Youths in Psychiatric Care

    ERIC Educational Resources Information Center

    Donenberg, Geri R.; Schwartz, Rebecca Moss; Emerson, Erin; Wilson, Helen W.; Bryant, Fred B.; Coleman, Gloria

    2005-01-01

    This study examined the utility of cognitive and behavioral constructs (AIDS information, motivation, and behavioral skills) in explaining sexual risk taking among 172 ethnically diverse urban youths, aged 12 to 20, in outpatient psychiatric care. Structural equation modeling revealed only moderate support for the model, explaining low to moderate…

  11. The schism between experimental and applied behavior analysis: Is it real and who cares?

    PubMed Central

    Poling, Alan; Picker, Mitchell; Grossett, Deborah; Hall-Johnson, Earl; Holbrook, Maurice

    1981-01-01

    This paper addresses the relationship between the experimental analysis of behavior and applied behavior analysis. Citation data indicate that across time the Journal of the Experimental Analysis of Behavior, and other experimental sources, have been referenced increasingly infrequently in the Journal of Applied Behavior Analysis, Behavior Therapy, and Behavior Research and Therapy. Such sources are now rarely cited in these journals, and never have been regularly referenced in Behavior Modification. Although their proper interpretation is far from certain, these data partially support recent suggestions that the experimental analysis of behavior and applied behavior analysis are largely separate, insular fields. A questionnaire, mailed to the editorial staffs of the Journal of the Experimental Analysis of Behavior and the Journal of Applied Behavior Analysis, was intended to gather further information about the alleged schism between the fields. Few respondents regularly read both journals, publish in both journals, or find both journals useful in their current research efforts. The majority of editors of both journals indicated that the fields were growing apart, although there was no consensus that this is harmful for behavior analysis. Most editors of the Journal of Applied Behavior Analysis reported that research published in the Journal of the Experimental Analysis of Behavior has decreased in value to applied researchers across time; most editors of the Journal of the Experimental Analysis of Behavior indicated that research published there has not changed in applied value. Several respondents commented at length concerning the relationship of experimental and applied behavior analysis. These comments, many of which appear in the article, reveal a marked plurality of views. PMID:22478543

  12. EC Transmission Line Risk Identification and Analysis

    SciTech Connect

    Bigelow, Tim S

    2012-04-01

    The purpose of this document is to assist in evaluating and planning for the cost, schedule, and technical project risks associated with the delivery and operation of the EC (Electron cyclotron) transmission line system. In general, the major risks that are anticipated to be encountered during the project delivery phase associated with the implementation of the Procurement Arrangement for the EC transmission line system are associated with: (1) Undefined or changing requirements (e.g., functional or regulatory requirements) (2) Underperformance of prototype, first unit, or production components during testing (3) Unavailability of qualified vendors for critical components Technical risks associated with the design and operation of the system are also identified.

  13. Developing scenarios to assess future landslide risks: a model-based approach applied to mountainous regions

    NASA Astrophysics Data System (ADS)

    Vacquie, Laure; Houet, Thomas

    2016-04-01

    In the last century, European mountain landscapes have experienced significant transformations. Natural and anthropogenic changes, climate change, touristic and industrial development, socio-economic interactions, and their implications in terms of LUCC (land use and land cover changes) have directly influenced the spatial organization and vulnerability of mountain landscapes. This study is conducted as part of the SAMCO project funded by the French National Science Agency (ANR). It aims at developing a methodological approach, combining various tools, modelling platforms and methods, to identify regions vulnerable to landslide hazards while accounting for future LUCC. It presents an integrated approach combining participative scenarios with LUCC simulation models to assess the combined effects of LUCC and climate change on landslide risks in the Cauterets valley (French Pyrenees) up to 2100. Through vulnerability and risk mapping, the objective is to gather information to support landscape planning and to implement land use strategies with local stakeholders for risk management. Four contrasting scenarios are developed, exhibiting contrasting trajectories of socio-economic development. Prospective scenarios are based on national and international socio-economic contexts relying on existing assessment reports. The methodological approach integrates knowledge from local stakeholders to refine each scenario during its construction and to reinforce its plausibility and relevance by accounting for local specificities, e.g. logging and pastoral activities, touristic development, urban planning, etc. A process-based model, the Forecasting Scenarios for Mountains (ForeSceM) model, developed on the Dinamica Ego modelling platform, is used to spatially allocate future LUCC for each prospective scenario. Concurrently, a spatial decision support tool, the SYLVACCESS model, is used to identify accessible areas for forestry in scenario projecting logging

  14. Risk analysis of the thermal sterilization process. Analysis of factors affecting the thermal resistance of microorganisms.

    PubMed

    Akterian, S G; Fernandez, P S; Hendrickx, M E; Tobback, P P; Periago, P M; Martinez, A

    1999-03-01

    A risk analysis was applied to experimental heat resistance data. This analysis is an approach for processing experimental thermobacteriological data in order to study how the D and z values of target microorganisms vary over the range of deviations in environmental factors, to identify the critical factors, and to specify their critical tolerances. The analysis is based on sets of sensitivity functions applied to a specific set of experimental data on the thermoresistance of Clostridium sporogenes and Bacillus stearothermophilus spores. The effect of the following factors was analyzed: the type of target microorganism; the nature of the heating substrate; pH; temperature; the type of acid employed; and the NaCl concentration. The type of target microorganism to be inactivated, the nature of the substrate (reference medium or real food) and the heating temperature were identified as critical factors, determining about 90% of the alteration of the microbiological risk. The effects of the type of acid used for acidification of products and of the NaCl concentration can be treated as negligible for the purposes of engineering calculations. The critical non-uniformity in temperature during thermobacteriological studies was set at 0.5%, and the critical tolerances of pH value and NaCl concentration were 5%. These results relate to a specific case study, and for that reason they should not be generalized directly. PMID:10357273
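
    The D and z values analyzed here come from the standard first-order thermal death model, in which a small temperature non-uniformity translates directly into an error in the achieved decimal reductions; a minimal sketch (illustrative parameter values, not the paper's data):

    ```python
    # Standard thermobacteriology relations underlying D and z values:
    #   survival:               N(t) = N0 * 10 ** (-t / D_T)
    #   temperature dependence: D_T  = D_ref * 10 ** ((T_ref - T) / z)

    def d_value_at(T, D_ref, T_ref, z):
        """D value (min) at temperature T (deg C), given D_ref at T_ref and z."""
        return D_ref * 10 ** ((T_ref - T) / z)

    def log_reductions(t, T, D_ref, T_ref, z):
        """Decimal reductions achieved by holding t minutes at temperature T."""
        return t / d_value_at(T, D_ref, T_ref, z)

    # Illustrative values of the right order for C. sporogenes spores (hypothetical):
    D121, T_ref, z = 1.0, 121.1, 10.0
    print(log_reductions(t=5.0, T=118.0, D_ref=D121, T_ref=T_ref, z=z))   # ~2.4
    # A 0.5% temperature error (~0.6 C at 121.1 C) visibly shifts the result:
    print(log_reductions(t=5.0, T=118.6, D_ref=D121, T_ref=T_ref, z=z))   # ~2.8
    ```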

  15. Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.

    PubMed

    Xin, Cao; Chongshi, Gu

    2016-01-01

    Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. Describing the stability failure risk ratio jointly by probability and possibility falls short in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. The stability of a gravity dam is viewed as a hybrid event, considering both the fuzziness and the randomness of the failure criterion, design parameters and measured data. A credibility distribution function is introduced as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to risk calculation for both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing the stability failure risk of gravity dams. The risk assessment obtained reflects the influence of both sorts of uncertainty and is suitable as an index value. PMID:27386264
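
    A minimal sketch of the random half of such an analysis: plain Monte Carlo on a classical sliding-stability criterion for a dam section, with all distributions and loads hypothetical. The paper's actual contribution, the credibility-theoretic treatment of fuzzy inputs, is deliberately omitted here:

    ```python
    import numpy as np

    # Monte Carlo on a classical sliding criterion: SF = (f*W + c*A) / H.
    # All distributions and loads below are hypothetical placeholders.
    rng = np.random.default_rng(42)
    n = 200_000

    f = rng.normal(1.1, 0.15, n)                # friction coefficient (random)
    c = rng.lognormal(np.log(0.9), 0.3, n)      # cohesion, MPa (random)
    W, A, H = 60.0, 40.0, 80.0                  # weight (MN), area (m^2), horizontal load (MN)

    sf = (f * W + c * A) / H                    # safety factor against sliding
    print(f"P(SF < 1) ~= {np.mean(sf < 1.0):.3f}")
    ```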

  16. Approaches to risk-adjusting outcome measures applied to criminal justice involvement after community service.

    PubMed

    Banks, S M; Pandiani, J A; Bramley, J

    2001-08-01

    The ethic of fairness in program evaluation requires that measures of behavioral health agency performance be sensitive to differences in those agencies' caseload composition. The authors describe two traditional approaches to the statistical risk adjustment of outcome measures (stratification weighting and pre-post measurement) that are designed to account for differences in caseload composition and introduce a method that incorporates the strengths of both approaches. Procedures for deriving each of these measures are described in detail and demonstrated in the evaluation of a statewide system of community-based behavioral health care programs. This evaluation examines the degree to which service recipients get into trouble with the law after treatment. Three measures are recommended for inclusion in outcome-oriented "report cards," and the interpretation of each measure is discussed. Finally, the authors suggest formats for graphic and tabular presentation of the risk-adjusted evaluation for sharing findings with diverse stakeholder groups. PMID:11497020
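
    The stratification-weighting idea is essentially direct standardization: each agency's stratum-specific outcome rates are averaged under a common reference caseload mix, so differences in caseload composition cancel out. A minimal sketch with hypothetical data:

    ```python
    # Stratification weighting (direct standardization): each agency's
    # stratum-specific outcome rates are averaged using a common reference
    # caseload mix, removing differences in caseload composition.
    # All data below are hypothetical.

    reference_mix = {"low risk": 0.5, "medium risk": 0.3, "high risk": 0.2}

    agency_rates = {  # observed post-treatment arrest rates by stratum
        "Agency A": {"low risk": 0.04, "medium risk": 0.10, "high risk": 0.25},
        "Agency B": {"low risk": 0.03, "medium risk": 0.12, "high risk": 0.30},
    }

    for agency, rates in agency_rates.items():
        adjusted = sum(reference_mix[s] * rates[s] for s in reference_mix)
        print(f"{agency}: risk-adjusted rate = {adjusted:.3f}")
    ```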

  17. Locating and applying sociological theories of risk-taking to develop public health interventions for adolescents

    PubMed Central

    Pound, Pandora; Campbell, Rona

    2015-01-01

    Sociological theories seldom inform public health interventions at the community level. The reasons for this are unclear but may include difficulties in finding, understanding or operationalising theories. We conducted a study to explore the feasibility of locating sociological theories within a specific field of public health, adolescent risk-taking, and to consider their potential for practical application. We identified a range of sociological theories. These explained risk-taking: (i) as being due to lack of social integration; (ii) as a consequence of isolation from mainstream society; (iii) as a rite of passage; (iv) as a response to social constraints; (v) as resistance; (vi) as an aspect of adolescent development; (vii) by the theory of the ‘habitus’; (viii) by situated rationality and social action theories; and (ix) as social practice. We consider these theories in terms of their potential to inform public health interventions for young people. PMID:25999784

  18. Environmental Risk Assessment of antimicrobials applied in veterinary medicine-A field study and laboratory approach.

    PubMed

    Slana, Marko; Dolenc, Marija Sollner

    2013-01-01

    The fate and environmental risk of antimicrobial compounds from different groups of veterinary medicinal products (VMPs) have been compared. The aim was to demonstrate a correlation between the physical and chemical properties of active compounds and their metabolism in target animals, as well as their fate in the environment. In addition, the importance of manure management techniques and agricultural practice, and their influence on the fate of active compounds, is discussed. The selected active compounds are shown to be susceptible to at least one environmental factor (sunlight, water, bacterial or fungal degradation) to which they are exposed during their life cycle and which contributes to their degradation. Degradation under a combination of environmental factors also has to be considered as authentic information additional to that observed under the limited conditions of laboratory studies and in Environmental Risk Assessment calculations. PMID:23274419

  19. Capability for Integrated Systems Risk-Reduction Analysis

    NASA Technical Reports Server (NTRS)

    Mindock, J.; Lumpkins, S.; Shelhamer, M.

    2016-01-01

    NASA's Human Research Program (HRP) is working to increase the likelihoods of human health and performance success during long-duration missions, and subsequent crew long-term health. To achieve these goals, there is a need to develop an integrated understanding of how the complex human physiological-socio-technical mission system behaves in spaceflight. This understanding will allow HRP to provide cross-disciplinary spaceflight countermeasures while minimizing resources such as mass, power, and volume. This understanding will also allow development of tools to assess the state of and enhance the resilience of individual crewmembers, teams, and the integrated mission system. We will discuss a set of risk-reduction questions that has been identified to guide the systems approach necessary to meet these needs. In addition, a framework of factors influencing human health and performance in space, called the Contributing Factor Map (CFM), is being applied as the backbone for incorporating information addressing these questions from sources throughout HRP. Using the common language of the CFM, information from sources such as the Human System Risk Board summaries, Integrated Research Plan, and HRP-funded publications has been combined and visualized in ways that allow insight into cross-disciplinary interconnections in a systematic, standardized fashion. We will show examples of these visualizations. We will also discuss applications of the resulting analysis capability that can inform science portfolio decisions, such as areas in which cross-disciplinary solicitations or countermeasure development will potentially be fruitful.

  20. A flexible count data regression model for risk analysis.

    PubMed

    Guikema, Seth D; Coffelt, Jeremy P

    2008-02-01

    In many cases, risk and reliability analyses involve estimating the probabilities of discrete events such as hardware failures and occurrences of disease or death. There is often additional information in the form of explanatory variables that can be used to help estimate the likelihood of different numbers of events in the future through the use of an appropriate regression model, such as a generalized linear model. However, existing generalized linear models (GLMs) are limited in their ability to handle the types of variance structures often encountered in using count data in risk and reliability analysis. In particular, standard models cannot handle both underdispersed data (variance less than the mean) and overdispersed data (variance greater than the mean) in a single coherent modeling framework. This article presents a new GLM based on a reformulation of the Conway-Maxwell Poisson (COM) distribution that is useful for both underdispersed and overdispersed count data and demonstrates this model by applying it to the assessment of electric power system reliability. The results show that the proposed COM GLM can provide fits to data as good as those of the commonly used existing models for overdispersed data sets, while outperforming these commonly used models for underdispersed data sets. PMID:18304118
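
    The distribution underlying the model is the Conway-Maxwell Poisson, whose single extra parameter moves the variance above or below the mean; a minimal sketch of its pmf (normalizing series truncated, computed in logs for numerical stability):

    ```python
    import math

    # Conway-Maxwell Poisson pmf: P(Y=y) = lam**y / (y!)**nu / Z(lam, nu),
    # where Z(lam, nu) = sum_j lam**j / (j!)**nu, truncated here.
    # nu = 1 recovers the Poisson; nu > 1 underdispersion; nu < 1 overdispersion.

    def com_poisson_pmf(y, lam, nu, terms=100):
        log_t = [j * math.log(lam) - nu * math.lgamma(j + 1) for j in range(terms)]
        m = max(log_t)
        log_z = m + math.log(sum(math.exp(t - m) for t in log_t))  # log-sum-exp
        return math.exp(y * math.log(lam) - nu * math.lgamma(y + 1) - log_z)

    def mean_var(lam, nu, terms=100):
        p = [com_poisson_pmf(y, lam, nu, terms) for y in range(terms)]
        mean = sum(y * q for y, q in enumerate(p))
        var = sum((y - mean) ** 2 * q for y, q in enumerate(p))
        return mean, var

    for nu in (0.5, 1.0, 2.0):
        m, v = mean_var(3.0, nu)
        print(f"nu={nu}: mean={m:.2f} variance={v:.2f}")  # var/mean brackets 1
    ```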

  1. Racial and ethnic disparities in perinatal mortality: applying the perinatal periods of risk model to identify areas for intervention.

    PubMed Central

    Besculides, Melanie; Laraque, Fabienne

    2005-01-01

    OBJECTIVES: To determine the feto-infant mortality rate for New York City, assess racial/ethnic variations, and identify areas for intervention using the Perinatal Periods of Risk (PPOR) approach. METHODS: The PPOR model examines fetal and infant deaths by age at death (fetal, neonatal, postneonatal) and birthweight (500-1499 g, ≥1500 g). It groups age at death and birthweight into four categories to identify problems hypothesized to lead to the death: factors related to Maternal Health and Prematurity, Maternal Care, Newborn Care and Infant Health. The model was applied to fetal and infant deaths occurring in New York City using Vital Records data from 1996-2000. Analysis was completed for the entire city and by race/ethnicity (white non-Hispanic, black non-Hispanic, Hispanic, Asian/Pacific Islander). RESULTS: The overall feto-infant mortality rate was 11.5/1,000 live births plus fetal deaths. This rate varied by race/ethnicity; black non-Hispanics had a higher rate than other racial/ethnic groups. Conditions related to maternal health and prematurity were the largest contributing factor to feto-infant mortality (5.9/1,000) in New York City. Among blacks and Hispanics, problems related to maternal health and prematurity contributed a larger share than among whites and Asians/Pacific Islanders. CONCLUSION: The use of the PPOR approach shows that the racial/ethnic disparities in feto-infant mortality that exist in New York City are largely related to maternal health and prematurity. Interventions to reduce the feto-infant mortality rate should include preconception care and improvements in women's health. PMID:16173328
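
    The PPOR grouping described above is easy to make concrete; a minimal sketch, assuming the category mapping stated in the abstract and using hypothetical counts:

    ```python
    from collections import Counter

    # PPOR grouping as described in the abstract (all counts hypothetical):
    # 500-1499 g deaths at any age -> Maternal Health and Prematurity;
    # >=1500 g fetal -> Maternal Care; >=1500 g neonatal -> Newborn Care;
    # >=1500 g postneonatal -> Infant Health.

    def ppor_category(birthweight_g, age_at_death):
        if 500 <= birthweight_g < 1500:
            return "Maternal Health and Prematurity"
        return {"fetal": "Maternal Care",
                "neonatal": "Newborn Care",
                "postneonatal": "Infant Health"}[age_at_death]

    deaths = [(900, "neonatal"), (700, "fetal"), (2400, "fetal"),
              (3100, "neonatal"), (3300, "postneonatal")]   # hypothetical records
    denominator = 125_000        # live births plus fetal deaths, hypothetical

    counts = Counter(ppor_category(bw, age) for bw, age in deaths)
    for category, n in counts.items():
        print(f"{category}: {1000 * n / denominator:.3f} per 1,000")
    ```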

  2. Risk Analysis and the Construction of News.

    ERIC Educational Resources Information Center

    Wilkins, Lee; Patterson, Philip

    1987-01-01

    Explains that the news media commit fundamental errors of attribution in covering risk situations by (1) treating them as novelties, (2) failing to analyze the entire system, and (3) using insufficiently analytical language. (NKA)

  3. Fire behavior and risk analysis in spacecraft

    NASA Technical Reports Server (NTRS)

    Friedman, Robert; Sacksteder, Kurt R.

    1988-01-01

    Practical risk management for present and future spacecraft, including space stations, involves the optimization of residual risks balanced by the spacecraft operational, technological, and economic limitations. Spacecraft fire safety is approached through three strategies, in order of risk: (1) control of fire-causing elements, through exclusion of flammable materials for example; (2) response to incipient fires through detection and alarm; and (3) recovery of normal conditions through extinguishment and cleanup. Present understanding of combustion in low gravity is that, compared to normal gravity behavior, fire hazards may be reduced by the absence of buoyant gas flows yet at the same time increased by ventilation flows and hot particle expulsion. This paper discusses the application of low-gravity combustion knowledge and appropriate aircraft analogies to fire detection, fire fighting, and fire-safety decisions for eventual fire-risk management and optimization in spacecraft.

  4. Inclusive Elementary Classroom Teacher Knowledge of and Attitudes toward Applied Behavior Analysis and Autism Spectrum Disorder and Their Use of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    McCormick, Jennifer A.

    2011-01-01

    The purpose of this study was to examine inclusive elementary teacher knowledge and attitude toward Autism Spectrum Disorder (ASD) and applied behavior analysis (ABA) and their use of ABA. Furthermore, this study examined if knowledge and attitude predicted use of ABA. A survey was developed and administered through a web-based program. Of the…

  5. Nonstationary risk analysis of climate extremes

    NASA Astrophysics Data System (ADS)

    Chavez-Demoulin, V.; Davison, A. C.; Suveges, M.

    2009-04-01

    There is growing interest in the modelling of the size and frequency of rare events in a changing climate. Standard models for extreme events are based on the modelling of annual maxima or of exceedances over high or under low thresholds: in either case appropriate probability distributions are fitted to the data, and extrapolation to rare events is based on the fitted models. Very often, however, extremal models do not take full advantage of techniques that are standard in other domains of statistics. Smoothing methods are now well-established in many domains of statistics, and are increasingly used in the analysis of extremal data. The crucial idea of smoothing is to replace a simple linear or quadratic form of dependence of one variable on another by a more flexible form, and thus to 'allow the data to speak for themselves' and so, perhaps, to reveal unexpected features. There are many approaches to smoothing in the context of linear regression, of which spline smoothing and local polynomial modelling are perhaps the most common. Under the first, a basis of spline functions is used to represent the dependence; often this is called generalised additive modelling. Under the second, polynomial models are fitted locally to the data, resulting in a more flexible overall fit. The selection of the degree of smoothing is crucial, and there are automatic ways to do this. The talk will describe some applications of smoothing to data on temperature extremes, elucidating the relation between cold winter weather in the Alps and the North Atlantic Oscillation, and changes in the lengths of unusually hot and cold spells in Britain. The work mixes classical models for extremes, generalised additive modelling, local polynomial smoothing, and the bootstrap. References: Chavez-Demoulin, V. and Davison, A. C. (2005) Generalized additive modelling of sample extremes. Applied Statistics, 54, 207-222. Süveges, M. (2007) Likelihood estimation of the extremal index. Extremes, 10
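
    A minimal sketch of the simplest nonstationary extremal model of this kind: annual maxima following a GEV whose location parameter is linear in time; the spline/GAM smoothing described above generalizes exactly this linear term. Data are simulated, and the likelihood follows the standard GEV form:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import genextreme

    # Nonstationary GEV: annual maxima with location mu(t) = b0 + b1 * t.
    # Simulated data; note scipy's shape convention c = -xi.
    rng = np.random.default_rng(3)
    years = np.arange(60.0)
    data = genextreme.rvs(c=-0.1, loc=20.0 + 0.05 * years, scale=2.0,
                          size=years.size, random_state=rng)

    def nll(theta):
        """Negative log-likelihood of a GEV with a linear trend in location."""
        b0, b1, log_sigma, xi = theta
        sigma = np.exp(log_sigma)
        if abs(xi) < 1e-8:
            return np.inf                 # skip the xi -> 0 limit for simplicity
        t = 1.0 + xi * (data - (b0 + b1 * years)) / sigma
        if np.any(t <= 0.0):
            return np.inf                 # outside the GEV support
        return np.sum(np.log(sigma) + (1.0 + 1.0 / xi) * np.log(t)
                      + t ** (-1.0 / xi))

    fit = minimize(nll, x0=[data.mean(), 0.0, np.log(data.std()), 0.1],
                   method="Nelder-Mead")
    b0, b1, log_sigma, xi = fit.x
    print(f"trend = {b1:.3f}/yr, sigma = {np.exp(log_sigma):.2f}, xi = {xi:.2f}")
    ```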

  6. The role of risk analysis in understanding bioterrorism.

    PubMed

    Haas, Charles N

    2002-08-01

    Recent events have made the domestic risk from bioterrorism more tangible. The risk management process so far, however, has not benefited from many of the contributions that analysts, communicators, and managers can make to the public discourse. Risk professionals can contribute much to the understanding of and solutions to bioterrorist events and threats. This article will provide an overview of the bioterrorism problem and outline a number of areas to which members of the Society for Risk Analysis, and other risk practitioners, could usefully contribute. PMID:12224741

  7. Risk analysis of an RTG on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Frank, Michael V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermoelectric Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with the calculated launch and deployment accident scenarios is low.

  8. Comparing risk in conventional and organic dairy farming in the Netherlands: an empirical analysis.

    PubMed

    Berentsen, P B M; Kovacs, K; van Asseldonk, M A P M

    2012-07-01

    This study was undertaken to contribute to the understanding of why most dairy farmers do not convert to organic farming. Therefore, the objective of this research was to assess and compare risks for conventional and organic farming in the Netherlands with respect to gross margin and the underlying price and production variables. To investigate the risk factors a farm accountancy database was used containing panel data from both conventional and organic representative Dutch dairy farms (2001-2007). Variables with regard to price and production risk were identified using a gross margin analysis scheme. Price risk variables were milk price and concentrate price. The main production risk variables were milk yield per cow, roughage yield per hectare, and veterinary costs per cow. To assess risk, an error component implicit detrending method was applied and the resulting detrended standard deviations were compared between conventional and organic farms. Results indicate that the risk included in the gross margin per cow is significantly higher in organic farming. This is caused by both higher price and production risks. Price risks are significantly higher in organic farming for both milk price and concentrate price. With regard to production risk, only milk yield per cow poses a significantly higher risk in organic farming. PMID:22720936

  9. Impediments to applying the 'dignity of risk' principle in residential aged care services.

    PubMed

    Ibrahim, Joseph E; Davis, Marie-Claire

    2013-09-01

    This discussion paper identifies four core factors currently impeding the application of the dignity of risk principle in residential aged care settings in Victoria, Australia: the fluctuating decision-making ability of residents; multiple participants in decision-making; discordance between espoused values and actions; and confusion and fear around legal responsibilities of care providers. Potential solutions identified include a conceptual shift in approach and consensus between key stakeholders, as well as more tangible solutions such as education and point-of-care decision support tools. PMID:24028460

  10. Risk analysis of Finnish peacekeeping in Kosovo.

    PubMed

    Lehtomäki, Kyösti; Pääkkönen, Rauno J; Rantanen, Jorma

    2005-04-01

    The research team interviewed over 90 Finnish battalion members in Kosovo, visited 22 units or posts, registered its observations, and made any necessary measurements. Key persons were asked to list the most important risks for occupational safety and health in their area of responsibility. Altogether, 106 accidents and 40 cases of disease resulted in compensation claims in 2000. The risks to the peacekeeping force were about twice those of the permanent staff of military trainees in Finland. Altogether, 21 accidents or cases of disease resulted in sick leave for at least 3 months after service. Biological, chemical, and physical factors each caused 8 to 9 occupational illnesses. Traffic accidents, operational factors, and munitions and mines were evaluated to be the three most important risk factors, followed by occupational hygiene, living conditions (mold, fungi, dust), and general hygiene. Possible fatal risks, such as traffic accidents and munitions and explosives, received a high ranking in both the subjective and the objective evaluations. One permanent injury resulted from an explosion, and two traffic accidents involved a fatality, although not of a peacekeeper. The reduction of sports and military training accidents, risk-control programs, and, for some tasks, better personal protection are considered development challenges for the near future. PMID:15876212

  11. HAZARD: A probabilistic method for ecological risk assessment to be applied in marine and coastal quality modeling

    SciTech Connect

    Schobben, H.P.M.; Scholten, M.C.T.; Karman, C.C.

    1994-12-31

    HAZARD is a relatively simple, probabilistic method for ecological risk assessment that can be added to the traditional water quality models. It can be seen as an advanced form of the traditional PEC/NEC-method. It compares the potential environmental concentration with sensitivity data for a set of species. In principle ecological risk modeling implies handling biological variation and uncertainties with regard to the causal chain from the actual exposure to pollutants to the final effects on marine biota. Probability techniques, known from statistical data processing, can deal with some of these variations and uncertainties. The basic principles, assumptions and limiting conditions will be discussed and illustrated with practical examples, including: the characterization of the sensitivity of marine biota by means of a frequency distribution of effect concentrations; the calculation of the intensity of the actual exposure of biota by a comparison of the temporal and spatial distributions of both pollutants and biota; the adjustment of ecotoxicological data to allow for a specific risk analysis for actual field conditions; application of the model for a risk analysis of incidental pollution by means of a translation of sensitivity data to short exposure times; the principle of calculating the risk of exposure to a mixture of pollutants; a concept in which the capacity of ecological recovery is taken into account.
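
    The core probabilistic step, comparing an exposure concentration with a frequency distribution of species effect concentrations, can be sketched in a few lines; the log-normal sensitivity distribution and all numbers below are hypothetical:

    ```python
    import math

    # Probabilistic PEC-vs-sensitivity comparison: species effect concentrations
    # are summarized by a log-normal frequency distribution; the risk indicator
    # is the fraction of species whose effect concentration lies below the
    # exposure concentration (PEC).  All numbers are hypothetical.

    def norm_cdf(x):
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    log10_ec_mean, log10_ec_sd = 1.5, 0.6   # sensitivity: median ~32 ug/L
    pec = 12.0                              # exposure concentration, ug/L

    fraction_affected = norm_cdf((math.log10(pec) - log10_ec_mean) / log10_ec_sd)
    print(f"potentially affected fraction of species = {fraction_affected:.2%}")
    ```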

  12. Applying a sociolinguistic model to the analysis of informed consent documents.

    PubMed

    Granero-Molina, José; Fernández-Sola, Cayetano; Aguilera-Manrique, Gabriel

    2009-11-01

    Information on the risks and benefits related to surgical procedures is essential for patients in order to obtain their informed consent. Some disciplines, such as sociolinguistics, offer insights that are helpful for patient-professional communication in both written and oral consent. Communication difficulties become more acute when patients make decisions through an informed consent document because they may sign this with a lack of understanding and information, and consequently feel deprived of their freedom to make their choice about different treatments or surgery. This article discusses findings from documentary analysis using the sociolinguistic SPEAKING model, which was applied to the general and specific informed consent documents required for laparoscopic surgery of the bile duct at Torrecárdenas Hospital, Almería, Spain. The objective of this procedure was to identify flaws when information was provided, together with its readability, its voluntary basis, and patients' consent. The results suggest potential linguistic communication difficulties, different languages being used, cultural clashes, asymmetry of communication between professionals and patients, assignment of rights on the part of patients, and overprotection of professionals and institutions. PMID:19889919

  13. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Hanson, J. M.; Beard, B. B.

    2010-01-01

    This Technical Publication (TP) is meant to address a number of topics related to the application of Monte Carlo simulation to launch vehicle design and requirements analysis. Although the focus is on a launch vehicle application, the methods may be applied to other complex systems as well. The TP is organized so that all the important topics are covered in the main text, and detailed derivations are in the appendices. The TP first introduces Monte Carlo simulation and the major topics to be discussed, including discussion of the input distributions for Monte Carlo runs, testing the simulation, how many runs are necessary for verification of requirements, what to do if results are desired for events that happen only rarely, and postprocessing, including analyzing any failed runs, examples of useful output products, and statistical information for generating desired results from the output data. Topics in the appendices include some tables for requirements verification, derivation of the number of runs required and generation of output probabilistic data with consumer risk included, derivation of launch vehicle models to include possible variations of assembled vehicles, minimization of a consumable to achieve a two-dimensional statistical result, recontact probability during staging, ensuring duplicated Monte Carlo random variations, and importance sampling.
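
    One of the topics above, how many runs are needed to verify a requirement with consumer risk included, reduces in the zero-failure case to a standard binomial bound; a sketch of that textbook result (not necessarily the TP's exact derivation):

    ```python
    import math

    # Zero-failure binomial sizing: to claim P(failure) <= p_max with consumer
    # risk beta (confidence 1 - beta), a run set with no observed failures must
    # satisfy (1 - p_max)**n <= beta, i.e. n >= ln(beta) / ln(1 - p_max).

    def runs_required(p_max, beta):
        return math.ceil(math.log(beta) / math.log(1.0 - p_max))

    print(runs_required(p_max=0.01, beta=0.10))    # 230 runs, zero failures allowed
    print(runs_required(p_max=0.003, beta=0.05))   # 998 runs
    ```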

  14. Oil shale health and environmental risk analysis

    SciTech Connect

    Gratt, L.B.

    1983-04-01

    The potential human health and environmental risks of a hypothetical one-million-barrels-per-day oil shale industry have been analyzed to serve as an aid in the formulation and management of a program of environmental research. The largest uncertainties in expected fatalities are in the public sector, from air pollutants, although the occupational sector is estimated to have 60% more expected fatalities than the public sector. Occupational safety and illness have been analyzed for the oil shale fuel cycle, from extraction to delivery of products for end use. Pneumoconiosis from the dust environment is the worker disease resulting in the greatest number of fatalities, followed by chronic bronchitis, internal cancer, and skin cancers. Research recommendations are presented for reducing the uncertainties in the risks analyzed and for filling data gaps to estimate other risks.

  15. Cross-Cultural Approach of Postpartum Depression: Manifestation, Practices Applied, Risk Factors and Therapeutic Interventions.

    PubMed

    Evagorou, Olympia; Arvaniti, Aikaterini; Samakouri, Maria

    2016-03-01

    It is well known that postpartum depression (PPD) is a global phenomenon that women may experience regardless of cultural identity and beliefs. This literature review presents cultural beliefs and postnatal practices around the world, by continent and people's origins, examining the extent to which they contribute positively or negatively to the onset of the disease. 106 articles were used in this research, identified through a systematic electronic search of Pubmed (Medline) and Scopus. Comparisons are also made between the prevalence, the risk factors and the different manifestations of the disease around the world and among immigrants. Finally, the initiatives and interventions made so far by governments and institutions with a view to preventing and addressing this global problem are presented. The results showed (a) that different cultures share the same risk factors for the disease; (b) significant differences in the prevalence of the disease between Western and non-Western cultures and among the cultures themselves; (c) a greater tendency toward somatization of depressive symptoms in non-Western cultures; (d) different postnatal practices between cultures, which are not always effective; and (e) that the more non-Western a culture is, the fewer its interventions concern mental health; the same phenomenon is observed in populations burdened by immigration. The beliefs held by a culture should be taken seriously in the detection of PPD, as well as in the assessment of the needs of women who have recently given birth. PMID:25986531

  16. Applying Transactional Analysis and Personality Assessment to Improve Patient Counseling and Communication Skills

    PubMed Central

    Lawrence, Lesa

    2007-01-01

    Objective To teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling to improve communication. Design A lecture series for a required pharmacy communications class was developed to teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling. Students were asked to apply these techniques and to report their experiences. A personality self-assessment was also conducted. Assessment After attending the lecture series, students were able to apply the techniques and demonstrated an understanding of the psychological factors that may affect patient communication, an appreciation for the diversity created by different personality types, the ability to engage patients based on adult-to-adult interaction cues, and the ability to adapt the interactive patient counseling model to different personality traits. Conclusion Students gained a greater awareness of transactional analysis and personality assessment by applying these concepts. This understanding will help students communicate more effectively with patients. PMID:17786269

  17. Risk Analysis for Environmental Health Triage

    SciTech Connect

    Bogen, K T

    2005-11-18

    The Homeland Security Act mandates development of a national, risk-based system to support planning for, response to and recovery from emergency situations involving large-scale toxic exposures. To prepare for and manage consequences effectively, planners and responders need not only to identify zones of potentially elevated individual risk, but also to predict expected casualties. Emergency response support systems now define ''consequences'' by mapping areas in which toxic chemical concentrations do or may exceed Acute Exposure Guideline Levels (AEGLs) or similar guidelines. However, because AEGLs do not estimate expected risks, current unqualified claims that such maps support consequence management are misleading. Intentionally protective, AEGLs incorporate various safety/uncertainty factors depending on scope and quality of chemical-specific toxicity data. Some of these factors are irrelevant, and others need to be modified, whenever resource constraints or exposure-scenario complexities require responders to make critical trade-off (triage) decisions in order to minimize expected casualties. AEGL-exceedance zones cannot consistently be aggregated, compared, or used to calculate expected casualties, and so may seriously misguide emergency response triage decisions. Methods and tools well established and readily available to support environmental health protection are not yet developed for chemically related environmental health triage. Effective triage decisions involving chemical risks require a new assessment approach that focuses on best estimates of likely casualties, rather than on upper plausible bounds of individual risk. If risk-based consequence management is to become a reality, federal agencies tasked with supporting emergency response must actively coordinate to foster new methods that can support effective environmental health triage.

  18. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation on B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.

  19. Risk analysis: Wet weather flows in S.E. Michigan

    SciTech Connect

    Bulkley, J.W.

    1994-12-31

    Institutional aspects of risk analysis are examined in two separate activities in the state of Michigan. The state of Michigan's relative risk project, patterned after the US EPA's risk reduction report, provided an institutional mechanism for ranking the environmental issues facing the state in terms of relative risk in 1991-92. This project identified 24 important environmental issues facing the state. These were categorized into four primary groups, ranging from highest to medium priority. Seventeen of the identified issues are directly related to water resources, including water pollution in the Great Lakes. Combined sewer overflows are identified as the primary remaining focus for point-source discharges. The second institutional aspect of risk analysis considered in this paper is the development of a demonstration program for combined sewer overflow in the Rouge River Basin, which will help establish the trade-offs involved in reducing the risk from such overflows.

  20. Integrated Hybrid System Architecture for Risk Analysis

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.; Fonseca, Daniel J.; Ray, Paul S.

    2010-01-01

    A conceptual design has been announced of an expert-system computer program, along with the development of a prototype of the program, intended for use as a project-management tool. The program integrates schedule and risk data for the purpose of determining the schedule implications of safety risks and, somewhat conversely, the effects of schedule changes on safety. It is noted that the design has been delivered to a NASA client and that it is planned to disclose the design in a conference presentation.

  1. Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults

    ERIC Educational Resources Information Center

    Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.

    2007-01-01

    Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…

  2. Analysis of driver casualty risk for different work zone types.

    PubMed

    Weng, Jinxian; Meng, Qiang

    2011-09-01

    Using driver casualty data from the Fatality Analysis Report System, this study examines driver casualty risk and investigates the risk contributing factors in construction, maintenance and utility work zones. The multiple t-test results show that driver casualty risk differs significantly by work zone type. Moreover, construction work zones have the largest driver casualty risk, followed by maintenance and utility work zones. Three separate logistic regression models are developed to predict driver casualty risk for the three work zone types because of their unique features. Finally, the effects of risk factors on driver casualty risk for each work zone type are examined and compared. For all three work zone types, five significant risk factors including road alignment, truck involvement, most harmful event, vehicle age and notification time are associated with increased driver casualty risk, while traffic control devices and restraint use are associated with reduced driver casualty risk. However, one finding is that three risk factors (light condition, gender and day of week) exhibit opposing effects on driver casualty risk in different types of work zones. This may largely be due to different work zone features and driver behavior in different types of work zones. PMID:21658509

  3. 2007 Wholesale Power Rate Case Final Proposal : Risk Analysis Study.

    SciTech Connect

    United States. Bonneville Power Administration.

    2006-07-01

    BPA's operating environment is filled with numerous uncertainties, and thus the rate-setting process must take into account a wide spectrum of risks. The objective of the Risk Analysis is to identify, model, and analyze the impacts that key risks have on BPA's net revenue (total revenues less total expenses). This is carried out in two distinct steps: a risk analysis step, in which the distributions, or profiles, of operating and non-operating risks are defined, and a risk mitigation step, in which different rate tools are tested to assess their ability to recover BPA's costs in the face of this uncertainty. Two statistical models are used in the risk analysis step for this rate proposal, the Risk Analysis Model (RiskMod) and the Non-Operating Risk Model (NORM), while a third model, the ToolKit, is used to test the effectiveness of rate tool options in the risk mitigation step. RiskMod is discussed in Sections 2.1 through 2.4, the NORM is discussed in Section 2.5, and the ToolKit is discussed in Section 3. The models function together so that BPA can develop rates that cover all of its costs and provide a high probability of making its Treasury payments on time and in full during the rate period. By law, BPA's payments to Treasury are the lowest priority for revenue application, meaning that payments to Treasury are the first to be missed if financial reserves are insufficient to pay all bills on time. For this reason, BPA measures its potential for recovering costs in terms of the probability of being able to make Treasury payments on time (also known as Treasury Payment Probability or TPP).
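
    The TPP concept lends itself to a compact sketch: simulate net revenue over the rate period and count scenarios in which reserves never fall short of the Treasury payment. All distributions and dollar figures below are hypothetical, not BPA's:

    ```python
    import numpy as np

    # Sketch of Treasury Payment Probability (TPP): simulate net revenue over a
    # two-year rate period and count scenarios in which starting reserves plus
    # cumulative net revenue always cover the annual Treasury payment.
    rng = np.random.default_rng(7)
    n = 100_000

    start_reserves = 500.0                           # $M, hypothetical
    treasury_payment = 700.0                         # $M per year, hypothetical
    net_revenue = rng.normal(650.0, 250.0, (n, 2))   # $M per year, hypothetical

    reserves = start_reserves + np.cumsum(net_revenue - treasury_payment, axis=1)
    made_all_payments = np.all(reserves >= 0.0, axis=1)
    print(f"TPP over the rate period = {made_all_payments.mean():.1%}")
    ```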

  4. Research in progress in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  5. The value of integrating information from multiple hazards for flood risk analysis and management

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, J. T.; Escuder-Bueno, I.; Altarejos-García, L.; Serrano-Lombillo, A.

    2014-02-01

    This article presents a methodology for estimating flood risk in urban areas that integrates pluvial flooding, river flooding and the failure of both small and large dams. The first part includes a review of basic concepts in flood risk analysis, evaluation and management. Flood risk analyses may be developed at local, regional and national levels; however, a general methodology for performing a quantitative flood risk analysis that includes different flood hazards is still required. The second part describes the proposed methodology, an integrated approach combining pluvial flooding, river flooding and flooding from dam failure, as applied to a case study: an urban area located downstream of a dam under construction. The methodology enhances the approach developed within the SUFRI project ("Sustainable Strategies of Urban Flood Risk Management to cope with the residual risk", 2009-2011). This article also shows how outcomes from flood risk analysis provide better and more complete information to inform authorities, local entities and the stakeholders involved in decision-making with regard to flood risk management.
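
    A common way to integrate hazards in such a quantitative analysis is to reduce each one to a damage-versus-annual-exceedance-probability curve, integrate each curve to an expected annual damage (EAD), and sum the contributions; a minimal sketch, assuming additive, independent hazard contributions and hypothetical curves:

    ```python
    import numpy as np

    # Expected annual damage (EAD) per hazard: integrate damage over annual
    # exceedance probability (AEP).  The integrated estimate sums hazard
    # contributions, treating them as additive and independent (a
    # simplification).  All curves below are hypothetical.

    def ead(aep, damage):
        p = np.asarray(aep, dtype=float)
        d = np.asarray(damage, dtype=float)
        order = np.argsort(p)
        p, d = p[order], d[order]
        return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))  # trapezoid rule

    hazards = {                        # (AEP, damage in M EUR), hypothetical
        "pluvial":     ([0.5, 0.1, 0.01], [0.0, 2.0, 15.0]),
        "river":       ([0.1, 0.01, 0.001], [0.0, 10.0, 60.0]),
        "dam failure": ([1e-4, 1e-5], [120.0, 300.0]),
    }

    for name, (p, d) in hazards.items():
        print(f"{name}: EAD = {ead(p, d):.4f} M EUR/yr")
    print(f"integrated EAD = {sum(ead(p, d) for p, d in hazards.values()):.4f} M EUR/yr")
    ```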

  6. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    SciTech Connect

    Kolotilina, L.; Nikishin, A.; Yeremin, A.

    1994-12-31

    The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector lengths and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial-class finite element analysis of structures. The construction and application of high quality preconditioners constitute a high percentage of the total solution time. Parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are implicit preconditioners and explicit preconditioners. Implicit preconditioners (e.g., incomplete factorizations of several types) are generally of high quality but require the solution of lower and upper triangular systems of equations at each iteration, which is difficult to parallelize without deteriorating the convergence rate. Explicit preconditioners (e.g., polynomial or Jacobi-like preconditioners) require only sparse matrix-vector multiplications and can be parallelized, but their preconditioning quality is less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction, without increasing the serial complexity.
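
    The following is a textbook-style sketch of an FSAI preconditioner for conjugate gradients, not the authors' code; the test matrix, the choice of tril(A) as the sparsity pattern, and the problem size are all assumptions. Each row of the factor G is computed independently, which is the built-in parallelism the abstract refers to, and applying the preconditioner needs only sparse matrix-vector products, no triangular solves.

    ```python
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import cg, LinearOperator

    n = 200
    A = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")  # SPD test matrix

    # Build lower-triangular G row by row: each row solves a small dense system
    # A[J, J] y = e restricted to pattern J, then is scaled so diag(G A G^T) = 1.
    pattern = sp.tril(A).tocsr()
    rows, cols, vals = [], [], []
    for i in range(n):
        J = pattern.indices[pattern.indptr[i]:pattern.indptr[i + 1]]
        AJJ = A[J][:, J].toarray()
        e = np.zeros(len(J))
        pos = int(np.searchsorted(J, i))
        e[pos] = 1.0
        y = np.linalg.solve(AJJ, e)
        y /= np.sqrt(y[pos])
        rows += [i] * len(J); cols += list(J); vals += list(y)
    G = sp.csr_matrix((vals, (rows, cols)), shape=(n, n))

    # Preconditioner M ~ A^{-1}, applied as two sparse mat-vecs
    M = LinearOperator((n, n), matvec=lambda r: G.T @ (G @ r))
    b = np.ones(n)
    x, info = cg(A, b, M=M)
    print("CG converged:", info == 0, " residual:", np.linalg.norm(A @ x - b))
    ```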

  7. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1986-01-01

    The risks, values, and costs of the SETI project are evaluated and compared with those of the Viking project. Examination of the scientific values, side benefits, and costs of the two projects reveals that both provide equal benefits at equal costs. The probability of scientific and technical success is analyzed.

  8. Economic Risk Analysis: Using Analytical and Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    O'Donnell, Brendan R.; Hickner, Michael A.; Barna, Bruce A.

    2002-01-01

    Describes the development and instructional use of a Microsoft Excel spreadsheet template that facilitates analytical and Monte Carlo risk analysis of investment decisions. Discusses a variety of risk assessment methods followed by applications of the analytical and Monte Carlo methods. Uses a case study to illustrate use of the spreadsheet tool…
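
    A compact illustration of the two approaches such a template teaches, transposed from Excel to Python: first-order analytical propagation of cash-flow uncertainty into NPV, then the same question answered by Monte Carlo. All figures, including the 20% uncertainty, are invented for the sketch.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    r = 0.10                                          # discount rate
    years = np.arange(1, 6)
    mean_cf = np.array([300., 320., 340., 360., 380.])  # expected cash flows
    sd_cf = 0.20 * mean_cf                            # assumed independent 20% sd
    invest = 1000.0

    disc = (1 + r) ** -years
    npv_mean = mean_cf @ disc - invest                # analytical mean
    npv_sd = np.sqrt(((sd_cf * disc) ** 2).sum())     # analytical sd (linear model)

    sims = rng.normal(mean_cf, sd_cf, size=(100_000, len(years)))
    npv_mc = sims @ disc - invest                     # Monte Carlo distribution

    print(f"analytical:  mean={npv_mean:.0f}, sd={npv_sd:.0f}")
    print(f"Monte Carlo: mean={npv_mc.mean():.0f}, sd={npv_mc.std():.0f}, "
          f"P(NPV<0)={np.mean(npv_mc < 0):.1%}")
    ```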

  9. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    NASA Technical Reports Server (NTRS)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  10. [Disinfection of water: on the need for analysis and solution of fundamental and applied problems].

    PubMed

    Mokienko, A V

    2014-01-01

    This paper presents an analysis of the hygienic, medical, and environmental aspects of water disinfection, exemplified by chlorine and chlorine dioxide (CD). It proposes the concept of persistent multivariate risk for aquatic pathogens; an original view of the mechanism by which bacteria develop chlorine resistance under the influence of biocides, based on a two-step process of informational and spatial interaction between receptor and substrate; and the hypothesis of a hormetic stimulating effect of residual active chlorine (in combination with other factors) on the growth of aquatic pathogens. The growing significance of halogen-containing compounds (HCC) as byproducts of water chlorination is substantiated in terms of their potential danger as toxicants and carcinogens. Analysis of the hygienic, medical, and environmental aspects of using chlorine dioxide as a water disinfectant made it possible to characterize the chemistry of its biocidal effect and the mechanisms of its bactericidal, virucidal, protozoocidal, sporicidal, and algicidal actions, its removal of biofilms, and the formation of disinfection byproducts. Chlorine dioxide was shown both to provide the epidemic safety of drinking water, owing to its high virucidal, bactericidal, and mycocidal action, and to be toxicologically harmless with respect to its influence on the organism of laboratory animals as well as on aquatic organisms receiving discharged disinfected wastewater. The necessity of a close relationship between fundamental and applied research is demonstrated: the former for in-depth study of the microbiological, molecular-genetic, and epidemiological problems of water disinfection (chlorination), and the latter for the introduction of alternative, including combined, technologies for water treatment and disinfection. PMID:24749274

  11. System Analysis Applied to Autonomy: Application to High-Altitude Long-Endurance Remotely Operated Aircraft

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.

    2006-01-01

    Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.

  12. DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)

    SciTech Connect

    Young, K. R.; Augustine, C.; Anderson, A.

    2010-02-01

    This presentation summarizes the methodology and results of a probabilistic risk analysis of research, development, and demonstration work, primarily for enhanced geothermal systems (EGS), sponsored by the U.S. Department of Energy Geothermal Technologies Program.

  13. American Airlines Propeller STOL Transport Economic Risk Analysis

    NASA Technical Reports Server (NTRS)

    Ransone, B.

    1972-01-01

    A Monte Carlo risk analysis on the economics of STOL transports in air passenger traffic established the probability of making the expected internal rate of financial return, or better, in a hypothetical regular Washington/New York intercity operation.
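
    A hedged sketch of the kind of computation described: the probability of achieving an expected internal rate of return or better. It uses the identity P(IRR >= r*) = P(NPV at r* >= 0), valid for conventional cash flows with a single sign change; the fleet outlay and revenue/cost distributions are illustrative, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    target_irr = 0.12
    invest = 50.0                                   # hypothetical outlay ($M)
    years = np.arange(1, 11)
    revenue = rng.normal(12.0, 3.0, size=(200_000, len(years)))
    costs = rng.normal(4.0, 1.0, size=(200_000, len(years)))

    npv = ((revenue - costs) * (1 + target_irr) ** -years).sum(axis=1) - invest
    print(f"P(IRR >= {target_irr:.0%}) ~ {np.mean(npv >= 0):.1%}")
    ```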

  14. SARA (System Analysis and Risk Assessment): An NRC risk management tool

    SciTech Connect

    Sattison, M.B.; Robinson, R.C.

    1987-01-01

    The System Analysis and Risk Assessment (SARA) system is being developed at the Idaho National Engineering Laboratory under the direction and guidance of the Office of Nuclear Regulatory Research. The objective of this project is to provide a personal computer (PC)-based, user-friendly system for the computation and analysis of information on nuclear power plant risk characteristics. The latest released version of SARA (Version 2.0) contains the probabilistic risk assessment (PRA) results from four of the five plants included in the NUREG-1150 study. These plants are: Surry, Peach Bottom, Sequoyah, and Grand Gulf.

  15. Method of error analysis for phase-measuring algorithms applied to photoelasticity.

    PubMed

    Quiroga, J A; González-Cano, A

    1998-07-10

    We present a method of error analysis for phase-measuring algorithms applied to photoelasticity. We calculate the contributions to the measurement error of the different elements of a circular polariscope as perturbations of the Jones matrices associated with each element. The Jones matrix of the real polariscope can then be calculated as the sum of the nominal matrix and a series of contributions that depend on the errors associated with each element separately. We apply this method to the analysis of phase-measuring algorithms for the determination of isoclinics and isochromatics, including comparisons with real measurements. PMID:18285900
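
    A minimal Jones-calculus sketch of the perturbation idea: perturb one polariscope element's matrix and compare the resulting intensity with the nominal one. The element angles, the sample retardance, and the 0.05 rad plate error are generic assumptions, not values from the paper.

    ```python
    import numpy as np

    def retarder(delta, theta):
        """Jones matrix of a linear retarder: retardance delta, fast axis at theta."""
        R = np.array([[np.cos(theta), np.sin(theta)],
                      [-np.sin(theta), np.cos(theta)]])
        D = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
        return R.T @ D @ R  # R(-theta) D R(theta)

    analyzer_x = np.array([[1, 0], [0, 0]], dtype=complex)  # polarizer along x
    E_in = np.array([1.0, 1.0]) / np.sqrt(2)                # 45-degree input light

    def intensity(qwp_retardance):
        sample = retarder(1.0, np.pi / 6)   # photoelastic sample, 1 rad phase
        chain = analyzer_x @ retarder(qwp_retardance, np.pi / 4) @ sample
        E_out = chain @ E_in
        return (np.abs(E_out) ** 2).sum()

    nominal = intensity(np.pi / 2)           # ideal quarter-wave plate
    perturbed = intensity(np.pi / 2 + 0.05)  # plate with 0.05 rad retardance error
    print(f"intensity error from the imperfect plate: {perturbed - nominal:+.4f}")
    ```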

  16. Comparative Risk Analysis for Metropolitan Solid Waste Management Systems

    NASA Astrophysics Data System (ADS)

    Chang, Ni-Bin; Wang, S. F.

    1996-01-01

    Conventional solid waste management planning usually focuses on economic optimization, in which the related environmental impacts or risks are rarely considered. The purpose of this paper is to illustrate how optimization concepts and techniques can be applied to structure and solve risk management problems, such that the impacts of air pollution, leachate, traffic congestion, and noise increments can be regulated in the long-term planning of metropolitan solid waste management systems. Management alternatives are evaluated sequentially by adding several environmental risk control constraints stepwise, in an attempt to improve the management strategies and reduce the risk impacts in the long run. Statistics associated with those risk control mechanisms are presented as well. Siting, routing, and financial decision making in such solid waste management systems can also be achieved with respect to various resource limitations and disposal requirements.
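
    A toy sketch of the stepwise idea: minimize system cost, then tighten environmental risk caps and observe the cost of risk reduction. The two disposal options, the risk coefficients, and the caps are invented for illustration, not taken from the paper.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    cost = np.array([30.0, 55.0])        # $/ton: landfill, incinerator
    air_risk = np.array([0.2, 0.9])      # air-pollution risk units per ton
    leachate = np.array([0.8, 0.1])      # leachate risk units per ton
    demand = 1000.0                      # tons/day that must be processed

    for air_cap, leach_cap in [(900, 900), (600, 500), (700, 450)]:
        res = linprog(cost,
                      A_ub=np.vstack([air_risk, leachate]),
                      b_ub=[air_cap, leach_cap],
                      A_eq=[np.ones(2)], b_eq=[demand],
                      bounds=[(0, None), (0, None)])
        print(f"caps=({air_cap},{leach_cap}): cost=${res.fun:,.0f}, "
              f"mix(tons)={res.x.round(0)}")
    ```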

  17. Risk assessment methodology applied to counter IED research & development portfolio prioritization

    SciTech Connect

    Shevitz, Daniel W; O' Brien, David A; Zerkle, David K; Key, Brian P; Chavez, Gregory M

    2009-01-01

    In an effort to protect the United States from the ever-increasing threat of domestic terrorism, the Department of Homeland Security, Science and Technology Directorate (DHS S&T) has significantly increased research activities to counter the terrorist use of explosives. Moreover, DHS S&T has established a robust Counter-Improvised Explosive Device (C-IED) Program to Deter, Predict, Detect, Defeat, and Mitigate this imminent threat to the Homeland. The DHS S&T portfolio is complicated and changing. In order to provide the "best answer" for the available resources, DHS S&T would like a "risk based" process for making funding decisions. There is a definite need for a methodology to compare very different types of technologies on a common basis. A methodology was developed that allows users to evaluate a new "quad chart" and rank it against all other quad charts across S&T divisions. It couples a logic model with an evidential reasoning model, using an Excel spreadsheet containing weights for the subjective merits of different technologies, and produces an Excel spreadsheet containing the aggregate rankings of the different technologies. It uses Extensible Logic Modeling (ELM) for the logic models, combined with LANL software called INFTree for the evidential reasoning.

  18. Environmental risk assessment in GMO analysis.

    PubMed

    Pirondini, Andrea; Marmiroli, Nelson

    2008-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture to express traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton, and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage among countries is related to their different positions concerning the labelling of GMO products: based either on the principle of substantial equivalence or on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs into the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislative requirements (Directive 2001/18/EC). Criteria and measurable properties for assessing harm to human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity. PMID:19048472

  19. Environmental risk assessment in GMO analysis.

    PubMed

    Pirondini, Andrea; Marmiroli, Nelson

    2010-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture to express traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton, and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage among countries is related to their different positions concerning the labelling of GMO products: based either on the principle of substantial equivalence or on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs into the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislative requirements (Directive 2001/18/EC). Criteria and measurable properties for assessing harm to human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity. PMID:21384330

  20. A review of the technology and process on integrated circuits failure analysis applied in communications products

    NASA Astrophysics Data System (ADS)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    The failure analysis of integrated circuits plays a very important role in improving the reliability of communications products. This paper mainly introduces the failure analysis technologies and process for integrated circuits applied in communications products. Many technologies are available for failure analysis, including optical microscopy, infrared microscopy, acoustic microscopy, liquid crystal hot-spot detection, microanalysis, electrical measurement, microprobe techniques, chemical etching, and ion etching. Integrated circuit failure analysis depends on accurately confirming and analyzing the chip failure mode, finding the root cause of failure, summarizing the failure mechanism, and implementing improvement measures. Through failure analysis, the reliability of integrated circuits and the product yield can be improved.

  1. Probabilistic risk analysis toward cost-effective 3S (safety, safeguards, security) implementation

    SciTech Connect

    Suzuki, Mitsutoshi; Mochiji, Toshiro

    2014-09-30

    Probabilistic Risk Analysis (PRA) has been used in the safety field for several decades, and advanced nuclear countries have already adopted this methodology in their regulatory systems. However, PRA has not been developed for safeguards and security so far because of the inherent difficulty of modeling intentional and malicious acts. In this paper, probabilistic proliferation and risk analysis based on a random process is applied to a hypothetical reprocessing facility and to the physical protection system of a nuclear reactor, using the Markov model originally developed by the Proliferation Resistance and Physical Protection Working Group (PRPPWG) of the Generation IV International Forum (GIF). Through this attempt to quantify the security risk as a frequency, an integrated risk notion across the 3S disciplines, in pursuit of cost-effective installation of countermeasures, is discussed in a heuristic manner.
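
    A generic absorbing-Markov-chain sketch of the style of calculation such a model performs: transient states of an adversary pathway, with "detected" and "success" as absorbing states. The states and transition probabilities below are hypothetical, not PRPPWG values.

    ```python
    import numpy as np

    # Transient states: 0 divert material, 1 transport, 2 process
    Q = np.array([[0.0, 0.7, 0.0],     # transient-to-transient probabilities
                  [0.0, 0.0, 0.6],
                  [0.0, 0.0, 0.0]])
    R = np.array([[0.3, 0.0],          # transient-to-absorbing: [detected, success]
                  [0.4, 0.0],
                  [0.5, 0.5]])

    N = np.linalg.inv(np.eye(3) - Q)   # fundamental matrix
    B = N @ R                          # absorption probabilities from each state
    print(f"from state 0: P(detected)={B[0, 0]:.3f}, P(success)={B[0, 1]:.3f}")
    ```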

  2. Probabilistic risk analysis toward cost-effective 3S (safety, safeguards, security) implementation

    NASA Astrophysics Data System (ADS)

    Suzuki, Mitsutoshi; Mochiji, Toshiro

    2014-09-01

    Probabilistic Risk Analysis (PRA) has been used in the safety field for several decades, and advanced nuclear countries have already adopted this methodology in their regulatory systems. However, PRA has not been developed for safeguards and security so far because of the inherent difficulty of modeling intentional and malicious acts. In this paper, probabilistic proliferation and risk analysis based on a random process is applied to a hypothetical reprocessing facility and to the physical protection system of a nuclear reactor, using the Markov model originally developed by the Proliferation Resistance and Physical Protection Working Group (PRPPWG) of the Generation IV International Forum (GIF). Through this attempt to quantify the security risk as a frequency, an integrated risk notion across the 3S disciplines, in pursuit of cost-effective installation of countermeasures, is discussed in a heuristic manner.

  3. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods needed to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision-making environment sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  4. Economic Risk Analysis of Experimental Cropping Systems Using the SMART Risk Tool

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Recently, a variant of stochastic dominance called stochastic efficiency with respect to a function (SERF) has been developed and applied. Unlike traditional stochastic dominance approaches, SERF uses the concept of certainty equivalents (CEs) to rank a set of risk-efficient alternatives instead of ...
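
    The certainty-equivalent ranking that SERF performs can be sketched as follows. The negative-exponential utility, the risk-aversion range, and the two cropping systems' payoff distributions are all assumptions made for illustration, not the study's data; note how the preferred alternative can switch as risk aversion increases.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    systems = {"conventional": rng.normal(100, 15, 5000),   # $/acre draws
               "experimental": rng.normal(110, 35, 5000)}

    def certainty_equivalent(x, r):
        """CE under U(w) = -exp(-r w); r > 0 is absolute risk aversion."""
        return -np.log(np.mean(np.exp(-r * x))) / r

    for r in (0.001, 0.01, 0.05):
        ces = {k: certainty_equivalent(v, r) for k, v in systems.items()}
        best = max(ces, key=ces.get)
        print(f"r={r}: " + ", ".join(f"{k} CE={v:.1f}" for k, v in ces.items())
              + f" -> prefer {best}")
    ```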

  5. Risk Analysis for Unintentional Slide Deployment During Airline Operations.

    PubMed

    Ayra, Eduardo S; Insua, David Ríos; Castellanos, María Eugenia; Larbi, Lydia

    2015-09-01

    We present a risk analysis undertaken to mitigate problems in relation to the unintended deployment of slides under normal operations within a commercial airline. This type of incident entails relevant costs for the airline industry. After assessing the likelihood and severity of its consequences, we conclude that such risks need to be managed. We then evaluate the effectiveness of various countermeasures, describing and justifying the chosen ones. We also discuss several issues faced when implementing and communicating the proposed measures, thus fully illustrating the risk analysis process. PMID:26061899

  6. Land Use Adaptation Strategies Analysis in Landslide Risk Region

    NASA Astrophysics Data System (ADS)

    Lin, Yu-Ching; Chang, Chin-Hsin; Chen, Ying-Tung

    2013-04-01

    In order to respond to the impact of climate and environmental change on Taiwanese mountain regions, this study used the GTZ (2004) risk analysis guidelines to assess the landslide risk for 178 Taiwanese mountain towns. The study used 7 indicators to assess landslide risk: rainfall distribution, natural environment vulnerability (e.g., rainfall threshold criterion for debris flow, historical disaster frequency, landslide ratio, and road density), physical vulnerability (e.g., population density), and socio-economic vulnerability (e.g., population with higher education, death rate, and income). The landslide risk map was obtained by multiplying the 7 indicators together and ranking the product into 5 risk ranges; towns in ranges 4 to 5 are high landslide risk regions and have high priority for risk reduction. The study then examined the high landslide risk regions and analyzed the changes after Typhoon Morakot (2009). The spatial distribution showed that, after significant environmental damage, high landslide risk regions moved from central to southern Taiwan. The changing pattern of risk regions points to the necessity of updating the risk map periodically. Based on the landslide risk map and the land use investigation data provided by the National Land Surveying and Mapping Center in 2007, this study calculated the size of the land use area subject to landslide disaster risk. Based on these results, the study can be used to suggest appropriate land use adaptation strategies for reducing landslide risk under the impact of climate and environmental change.
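
    A schematic of the multiply-and-rank overlay step described above. The normalized indicator scores are random stand-ins for the 178 towns; the quantile-based binning into 5 classes is an assumption about how the ranking could be implemented.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_towns, n_indicators = 178, 7
    scores = rng.uniform(0.1, 1.0, size=(n_towns, n_indicators))  # normalized

    risk_product = scores.prod(axis=1)            # multiply indicators together
    edges = np.quantile(risk_product, [0.2, 0.4, 0.6, 0.8])
    risk_class = 1 + np.searchsorted(edges, risk_product)   # classes 1..5

    high_priority = np.where(risk_class >= 4)[0]
    print(f"{high_priority.size} towns in classes 4-5 (risk-reduction priority)")
    ```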

  7. Environmental risk analysis of hazardous material rail transportation.

    PubMed

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative environmental risk analysis of rail transportation of a group of light non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated cleanup cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials. PMID:24239259
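
    A back-of-envelope sketch of the structure of such a risk metric: expected annual cleanup cost summed over route segments, then normalized per car-mile. Every number below is a placeholder, not HMTECM output.

    ```python
    import numpy as np

    segment_miles = np.array([120.0, 300.0, 80.0])
    cars_per_year = 5000                              # tank-car shipments
    accident_rate = 1.5e-7                            # accidents per car-mile
    release_prob = 0.1                                # release given an accident
    cleanup_cost = np.array([2.0e6, 0.8e6, 4.5e6])    # $ per spill, by segment
                                                      # (soil/groundwater specific)
    car_miles = cars_per_year * segment_miles
    annual_risk = (car_miles * accident_rate * release_prob * cleanup_cost).sum()
    print(f"annual risk: ${annual_risk:,.0f}; "
          f"per car-mile: ${annual_risk / car_miles.sum():.4f}")
    ```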

  8. Applied Behavior Analysis: Its Impact on the Treatment of Mentally Retarded Emotionally Disturbed People.

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Coe, David A.

    1992-01-01

    This article reviews applications of the applied behavior analysis ideas of B. F. Skinner and others to persons with both mental retardation and emotional disturbance. The review examines implications of behavior analysis for operant conditioning and radical behaviorism, schedules of reinforcement, and emotion and mental illness. (DB)

  9. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  10. Causal Modeling--Path Analysis a New Trend in Research in Applied Linguistics

    ERIC Educational Resources Information Center

    Rastegar, Mina

    2006-01-01

    This article aims at discussing a new statistical trend in research in applied linguistics. This rather new statistical procedure is causal modeling--path analysis. The article demonstrates that causal modeling--path analysis is the best statistical option to use when the effects of a multitude of L2 learners' variables on language achievement are…

  11. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    ERIC Educational Resources Information Center

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  12. Sociosexuality Education for Persons with Autism Spectrum Disorders Using Principles of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Wolfe, Pamela S.; Condo, Bethany; Hardaway, Emily

    2009-01-01

    Applied behavior analysis (ABA) has emerged as one of the most effective empirically based strategies for instructing individuals with autism spectrum disorders (ASD). Four ABA-based strategies that have been found effective are video modeling, visual strategies, social script fading, and task analysis. Individuals with ASD often struggle with…

  13. Dynamic taxonomies applied to a web-based relational database for geo-hydrological risk mitigation

    NASA Astrophysics Data System (ADS)

    Sacco, G. M.; Nigrelli, G.; Bosio, A.; Chiarle, M.; Luino, F.

    2012-02-01

    In its 40 years of activity, the Research Institute for Geo-hydrological Protection of the Italian National Research Council has amassed a vast and varied collection of historical documentation on landslides, muddy-debris flows, and floods in northern Italy from 1600 to the present. Since 2008, the archive resources have been maintained through a relational database management system. The database is used for routine study and research purposes as well as for providing support during geo-hydrological emergencies, when data need to be quickly and accurately retrieved. Retrieval speed and accuracy are the main objectives of an implementation based on a dynamic taxonomies model. Dynamic taxonomies are a general knowledge management model for configuring complex, heterogeneous information bases that support exploratory searching. At each stage of the process, the user can explore or browse the database in a guided yet unconstrained way by selecting the alternatives suggested for further refining the search. Dynamic taxonomies have been successfully applied to such diverse and apparently unrelated domains as e-commerce and medical diagnosis. Here, we describe the application of dynamic taxonomies to our database and compare it to traditional relational database query methods. The dynamic taxonomy interface, essentially a point-and-click interface, is considerably faster and less error-prone than traditional form-based query interfaces that require the user to remember and type in the "right" search keywords. Finally, dynamic taxonomy users have confirmed that one of the principal benefits of this approach is the confidence of having considered all the relevant information. Dynamic taxonomies and relational databases work in synergy to provide fast and precise searching: one of the most important factors in timely response to emergencies.
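
    A toy sketch of the dynamic-taxonomy (faceted) interaction pattern, not the Institute's actual system: each selection narrows the record set, and the remaining facet values are recomputed so the user can only refine along non-empty paths. The records and facets are invented.

    ```python
    from collections import Counter

    records = [
        {"type": "flood", "region": "Piedmont", "century": "20th"},
        {"type": "debris flow", "region": "Piedmont", "century": "19th"},
        {"type": "landslide", "region": "Liguria", "century": "20th"},
        {"type": "flood", "region": "Liguria", "century": "17th"},
    ]

    def refine(items, facet, value):
        """One zoom step: keep matching records and recompute remaining facets."""
        kept = [r for r in items if r[facet] == value]
        counts = {f: Counter(r[f] for r in kept) for f in kept[0]} if kept else {}
        return kept, counts

    subset, facets = refine(records, "region", "Piedmont")
    print(len(subset), "records; next refinements:", dict(facets["type"]))
    ```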

  14. Risk Assessment of Infrastructure System of Systems with Precursor Analysis.

    PubMed

    Guo, Zhenyu; Haimes, Yacov Y

    2016-08-01

    Physical infrastructure systems are commonly composed of interconnected and interdependent subsystems, which in their essence constitute system of systems (S-o-S). System owners and policy researchers need tools to foresee potential emergent forced changes and to understand their impact so that effective risk management strategies can be developed. We develop a systemic framework for precursor analysis to support the design of an effective and efficient precursor monitoring and decision support system with the ability to (i) identify and prioritize indicators of evolving risks of system failure; and (ii) evaluate uncertainties in precursor analysis to support informed and rational decision making. This integrated precursor analysis framework is comprised of three processes: precursor identification, prioritization, and evaluation. We use an example of a highway bridge S-o-S to demonstrate the theories and methodologies of the framework. Bridge maintenance processes involve many interconnected and interdependent functional subsystems and decision-making entities and bridge failure can have broad social and economic consequences. The precursor analysis framework, which constitutes an essential part of risk analysis, examines the impact of various bridge inspection and maintenance scenarios. It enables policy researchers and analysts who are seeking a risk perspective on bridge infrastructure in a policy setting to develop more risk informed policies and create guidelines to efficiently allocate limited risk management resources and mitigate severe consequences resulting from bridge failures. PMID:27575259

  15. Leakage risk assessment of the In Salah CO2 storage project: Applying the Certification Framework in a dynamic context.

    SciTech Connect

    Oldenburg, C.M.; Jordan, P.D.; Nicot, J.-P.; Mazzoldi, A.; Gupta, A.K.; Bryant, S.L.

    2010-08-01

    The Certification Framework (CF) is a simple risk assessment approach for evaluating CO2 and brine leakage risk at geologic carbon sequestration (GCS) sites. In the In Salah CO2 storage project assessed here, five wells at Krechba produce natural gas from the Carboniferous C10.2 reservoir with 1.7-2% CO2 that is delivered to the Krechba gas processing plant, which also receives high-CO2 natural gas (~10% by mole fraction) from additional deeper gas reservoirs and fields to the south. The gas processing plant strips CO2 from the natural gas, and the CO2 is then injected through three long horizontal wells into the water leg of the Carboniferous gas reservoir at a depth of approximately 1,800 m. This injection process has been going on successfully since 2004. The stored CO2 has been monitored over the last five years by a Joint Industry Project (JIP), a collaboration of BP, Sonatrach, and Statoil with co-funding from US DOE and EU DG Research. Over the years the JIP has carried out extensive analyses of the Krechba system, including two risk assessment efforts, one before injection started, and one carried out by URS Corporation in September 2008. The long history of injection at Krechba, and the accompanying characterization, modeling, and performance data, provide a unique opportunity to test and evaluate risk assessment approaches. We apply the CF to the In Salah CO2 storage project at two different stages in the state of knowledge of the project: (1) at the pre-injection stage, using data available just prior to injection around mid-2004; and (2) after four years of injection (September 2008), to be comparable to the other risk assessments. The main risk drivers for the project are CO2 leakage into potable groundwater and into the natural gas cap. Both well leakage and fault/fracture leakage are likely under some conditions, but overall the risk is low due to ongoing mitigation and monitoring activities. Results of

  16. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming

    2008-01-01

    This paper describes a systems engineering approach to resource planning by integrating mathematical modeling and constrained optimization, empirical simulation, and theoretical analysis techniques to generate an optimal task plan in the presence of uncertainties.

  17. Selenium Exposure and Cancer Risk: an Updated Meta-analysis and Meta-regression

    PubMed Central

    Cai, Xianlei; Wang, Chen; Yu, Wanqi; Fan, Wenjie; Wang, Shan; Shen, Ning; Wu, Pengcheng; Li, Xiuyang; Wang, Fudi

    2016-01-01

    The objective of this study was to investigate the associations between selenium exposure and cancer risk. We identified 69 studies and applied meta-analysis, meta-regression, and dose-response analysis to the available evidence. The results indicated that high selenium exposure had a protective effect on cancer risk (pooled OR = 0.78; 95% CI: 0.73–0.83). The results of linear and nonlinear dose-response analyses indicated that high serum/plasma selenium and toenail selenium were effective for cancer prevention. However, we did not find a protective effect of selenium supplementation. High selenium exposure may have different effects on specific types of cancer: it decreased the risk of breast cancer, lung cancer, esophageal cancer, gastric cancer, and prostate cancer, but was not associated with colorectal cancer, bladder cancer, or skin cancer. PMID:26786590
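
    The arithmetic behind a pooled OR and its CI can be shown in a few lines: inverse-variance weighting of log odds ratios (fixed-effect shown for brevity; the paper's 69 studies are not reproduced here, the three studies below are invented).

    ```python
    import numpy as np

    or_i = np.array([0.70, 0.85, 0.78])        # per-study odds ratios
    ci_hi = np.array([0.95, 1.05, 0.92])       # upper 95% CI bounds

    log_or = np.log(or_i)
    se = (np.log(ci_hi) - log_or) / 1.96       # back out standard errors
    w = 1.0 / se**2                            # inverse-variance weights

    pooled = (w * log_or).sum() / w.sum()
    pooled_se = np.sqrt(1.0 / w.sum())
    lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
    print(f"pooled OR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```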

  18. Applying under-sampling techniques and cost-sensitive learning methods on risk assessment of breast cancer.

    PubMed

    Hsu, Jia-Lien; Hung, Ping-Cheng; Lin, Hung-Yen; Hsieh, Chung-Ho

    2015-04-01

    Breast cancer is one of the most common causes of cancer mortality. Early detection through mammography screening could significantly reduce mortality from breast cancer. However, most screening methods consume large amounts of resources. We propose a computational model, based solely on personal health information, for breast cancer risk assessment. Our model can serve as a pre-screening program in low-cost settings. In our study, the data set, consisting of 3976 records, was collected from Taipei City Hospital between 2008.1.1 and 2008.12.31. Based on this data set, we first applied sampling techniques and a dimension reduction method to preprocess the data. Then, we constructed various kinds of classifiers (including basic classifiers, ensemble methods, and cost-sensitive methods) to predict the risk. The cost-sensitive method with a random forest classifier was able to achieve a recall (sensitivity) of 100%. At a recall of 100%, the precision (positive predictive value, PPV) and specificity of the cost-sensitive method with a random forest classifier were 2.9% and 14.87%, respectively. In this study, we built a breast cancer risk assessment model using data mining techniques. Our model has the potential to serve as an assisting tool in breast cancer screening. PMID:25712814
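
    A sketch of the modelling approach described, with synthetic data standing in for the hospital records (which are not public). The class weight of 50 and the threshold sweep are assumptions; sklearn's class_weight argument is the cost-sensitivity mechanism, and lowering the decision threshold pushes recall toward 100% at the cost of precision, as in the paper.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import precision_score, recall_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=4000, weights=[0.95, 0.05], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    clf = RandomForestClassifier(
        n_estimators=300,
        class_weight={0: 1, 1: 50},   # missing a true case is far costlier
        random_state=0,
    ).fit(X_tr, y_tr)

    # Lower the decision threshold until recall (sensitivity) reaches 100%
    proba = clf.predict_proba(X_te)[:, 1]
    for threshold in np.linspace(0.5, 0.0, 51):
        y_hat = (proba >= threshold).astype(int)
        if recall_score(y_te, y_hat) == 1.0:
            break
    print(f"threshold={threshold:.2f}, recall=1.00, "
          f"precision={precision_score(y_te, y_hat):.3f}")
    ```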

  19. Confirmation of standard error analysis techniques applied to EXAFS using simulations

    SciTech Connect

    Booth, Corwin H; Hu, Yung-Jin

    2009-12-14

    Systematic uncertainties, such as those in calculated backscattering amplitudes, crystal glitches, etc., not only limit the ultimate accuracy of the EXAFS technique, but also affect the covariance matrix representation of real parameter errors in typical fitting routines. Despite major advances in EXAFS analysis and in understanding all potential uncertainties, these methods are not routinely applied by all EXAFS users. Consequently, reported parameter errors are not reliable in many EXAFS studies in the literature. This situation has made many EXAFS practitioners leery of conventional error analysis applied to EXAFS data. However, conventional error analysis, if properly applied, can teach us more about our data, and even about the power and limitations of the EXAFS technique. Here, we describe the proper application of conventional error analysis to r-space fitting of EXAFS data. Using simulations, we demonstrate the veracity of this analysis by, for instance, showing that the number of independent data points from Stern's rule is balanced by the degrees of freedom obtained from a χ2 statistical analysis. By applying such analysis to real data, we determine the quantitative effect of systematic errors. In short, this study is intended to remind the EXAFS community of the role of fundamental noise distributions in interpreting our final results.
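
    The bookkeeping referred to here can be made concrete with a few lines: Stern's rule for the number of independent data points and the resulting degrees of freedom of a fit. The fit ranges, parameter count, and chi-square value below are hypothetical.

    ```python
    import numpy as np

    dk, dR = 10.0, 3.0               # fitted k-range (1/Angstrom), R-range (Angstrom)
    n_idp = 2 * dk * dR / np.pi + 2  # Stern's rule
    n_params = 8
    nu = n_idp - n_params            # degrees of freedom

    chi2 = 25.0                      # hypothetical noise-scaled fit chi-square
    chi2_red = chi2 / nu
    print(f"N_idp={n_idp:.1f}, dof={nu:.1f}, reduced chi^2={chi2_red:.2f}")
    # parameter errors are commonly inflated by sqrt(chi2_red) when it exceeds 1
    ```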

  20. Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool

    SciTech Connect

    Unwin, Stephen D.; Seiple, Timothy E.

    2009-05-28

    This report describes the process and methodology associated with a rapid prototype tool for integrating project risk analysis and health & safety risk analysis for decontamination and decommissioning projects.

  1. Seawater intrusion risk analysis under climate change conditions for the Gaza Strip aquifer (Palestine)

    NASA Astrophysics Data System (ADS)

    Dentoni, Marta; Deidda, Roberto; Paniconi, Claudio; Marrocu, Marino; Lecca, Giuditta

    2014-05-01

    Seawater intrusion (SWI) has become a major threat to coastal freshwater resources, particularly in the Mediterranean basin, where this problem is exacerbated by the lack of appropriate groundwater resources management and with serious potential impacts from projected climate changes. A proper analysis and risk assessment that includes climate scenarios is essential for the design of water management measures to mitigate the environmental and socio-economic impacts of SWI. In this study a methodology for SWI risk analysis in coastal aquifers is developed and applied to the Gaza Strip coastal aquifer in Palestine. The method is based on the origin-pathway-target model, evaluating the final value of SWI risk by applying the overlay principle to the hazard map (representing the origin of SWI), the vulnerability map (representing the pathway of groundwater flow) and the elements map (representing the target of SWI). Results indicate the important role of groundwater simulation in SWI risk assessment and illustrate how mitigation measures can be developed according to predefined criteria to arrive at quantifiable expected benefits. Keywords: Climate change, coastal aquifer, seawater intrusion, risk analysis, simulation/optimization model. Acknowledgements. The study is partially funded by the project "Climate Induced Changes on the Hydrology of Mediterranean Basins (CLIMB)", FP7-ENV-2009-1, GA 244151.

  2. Risk analysis of tyramine concentration in food production

    NASA Astrophysics Data System (ADS)

    Doudová, L.; Buňka, F.; Michálek, J.; Sedlačík, M.; Buňková, L.

    2013-10-01

    This contribution focuses on risk analysis in food microbiology. The paper evaluates the effect of selected factors on tyramine production in bacterial strains of the Lactococcus genus identified as tyramine producers. Tyramine is a biogenic amine synthesized from the amino acid tyrosine. It can be found in certain foodstuffs (often in cheese) and can cause a pseudo-response in sensitive individuals. The above-mentioned bacteria are commonly used as starter cultures in the biotechnological process of cheese production. The levels of the factors were chosen with respect to the conditions that can occur in this technological process. To describe and compare tyramine production in the chosen microorganisms, generalized regression models were applied. Tyramine production was modelled by Gompertz curves according to the selected factors (lactose concentration of 0-1% w/v, NaCl 0-2% w/v, and aero/anaerobiosis) for 3 different types of bacterial cultivation. Moreover, estimates of model parameters were calculated and tested, and multiple comparisons were discussed. The aim of this paper is to find a combination of factors leading to a similar tyramine production level.
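
    A sketch of fitting a Gompertz curve to tyramine accumulation data, in the spirit of the modelling described; the Zwietering parameterization is assumed, and the time/concentration values are synthetic, not the study's measurements.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gompertz(t, A, mu, lam):
        """A: upper asymptote; mu: max production rate; lam: lag time."""
        return A * np.exp(-np.exp(mu * np.e / A * (lam - t) + 1))

    t = np.array([0, 6, 12, 18, 24, 36, 48, 72.])         # hours
    y = np.array([1, 3, 20, 110, 240, 380, 430, 450.])    # tyramine (mg/L)

    (A, mu, lam), _ = curve_fit(gompertz, t, y, p0=[450, 15, 10])
    print(f"asymptote={A:.0f} mg/L, max rate={mu:.1f} mg/L/h, lag={lam:.1f} h")
    ```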

  3. Statin use and risk of fracture: a meta-analysis

    PubMed Central

    Jin, Shaolin; Jiang, Jianping; Bai, Pengcheng; Zhang, Mei; Tong, Xujun; Wang, Hui; Lu, Youqun

    2015-01-01

    This meta-analysis investigates the association between statin use and fracture risk. Two reviewers independently searched six databases: PubMed, Cochrane Library, Ovid, Embase, China National Knowledge Infrastructure (CNKI), and Wanfang. Studies retrieved from the database searches were screened using stringent inclusion and exclusion criteria. A total of 17 studies, published between 2000 and 2014, were included in this meta-analysis. The results suggested that statin use was associated with a decreased risk of fracture (OR=0.80; 95% CI, 0.73-0.88; P < 0.00001). In the subgroup analysis by study design, statin use was significantly associated with a decreased risk of fracture in both case-control studies (OR=0.67; 95% CI, 0.55-0.87; P < 0.0001) and cohort studies (OR=0.86; 95% CI, 0.77-0.97; P=0.02). In the female subgroup analysis, statin users showed decreased fracture risk (OR=0.76; 95% CI, 0.63-0.92; P=0.005). In the subgroup analysis by duration of follow-up, studies with both long and short follow-up showed decreased risk of fracture (OR=0.67; 95% CI, 0.54-0.82; P=0.001 and OR=0.85; 95% CI, 0.74-0.96; P=0.01, respectively), as did studies with large and small sample sizes (OR=0.85; 95% CI, 0.77-0.94; P=0.002 and OR=0.65; 95% CI, 0.54-0.78; P < 0.0001, respectively). In conclusion, this meta-analysis suggests a significant association between statin use and decreased fracture risk. PMID:26221409

  4. Contract Negotiations Supported Through Risk Analysis

    NASA Astrophysics Data System (ADS)

    Rodrigues, Sérgio A.; Vaz, Marco A.; Souza, Jano M.

    Many clients view software as a commodity, so it is critical that IT sellers know how to build value into their offering to differentiate their service from all the others. Clients sometimes refuse to contract software development due to a lack of technical understanding or simply because they are afraid of IT contractual commitments. IT negotiators who recognize the importance of this issue, and the reason why it is a problem, will be able to work to reach the commercial terms they want. Therefore, this chapter aims to stimulate IT professionals to improve their negotiation skills and presents a computational tool to support managers in getting the best out of software negotiations through the identification of contract risks.

  5. Walking the line: Understanding pedestrian behaviour and risk at rail level crossings with cognitive work analysis.

    PubMed

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Stanton, Neville A

    2016-03-01

    Pedestrian fatalities at rail level crossings (RLXs) are a public safety concern for governments worldwide. There is little literature examining pedestrian behaviour at RLXs and no previous studies have adopted a formative approach to understanding behaviour in this context. In this article, cognitive work analysis is applied to understand the constraints that shape pedestrian behaviour at RLXs in Melbourne, Australia. The five phases of cognitive work analysis were developed using data gathered via document analysis, behavioural observation, walk-throughs and critical decision method interviews. The analysis demonstrates the complex nature of pedestrian decision making at RLXs and the findings are synthesised to provide a model illustrating the influences on pedestrian decision making in this context (i.e. time, effort and social pressures). Further, the CWA outputs are used to inform an analysis of the risks to safety associated with pedestrian behaviour at RLXs and the identification of potential interventions to reduce risk. PMID:26518501

  6. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    2009-01-18

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program (the Program). The analysis was performed by Princeton Energy Resources International, LLC, in support of the National Renewable Energy Laboratory on behalf of the Program. The main challenge in the analysis lies in translating R&D results into a quantitative reflection of technical risk for a key Program metric: the levelized cost of energy (LCOE).
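
    For orientation, the metric at the center of the analysis reduces to a short calculation. The capital cost, fixed charge rate, O&M cost, and capacity factor below are illustrative inputs, not Program values; R&D outcomes would enter through assumptions like these.

    ```python
    capex = 4000.0        # $/kW installed
    fcr = 0.10            # fixed charge rate (capital recovery), 1/yr
    opex = 120.0          # $/kW-yr operations and maintenance
    cap_factor = 0.90     # fraction of the year at rated output

    mwh_per_kw = 8760 * cap_factor / 1000.0           # annual energy per kW
    lcoe = (capex * fcr + opex) / mwh_per_kw          # $/MWh
    print(f"LCOE = {lcoe:.0f} $/MWh")
    ```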

  7. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches, and others. With the rising incidence of these natural hazards, protection measures have become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, public-sector decision makers face important decision problems. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can be obtained by applying the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be achieved by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people, and infrastructure. With a Geographical Information System adapted to run with a tool developed to manage risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floodings, debris flows and avalanches, to vulnerability assessment and risk analysis, and their integration into land use planning at the cantonal and municipal levels. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a

  8. State of the art in benefit-risk analysis: medicines.

    PubMed

    Luteijn, J M; White, B C; Gunnlaugsdóttir, H; Holm, F; Kalogeras, N; Leino, O; Magnússon, S H; Odekerken, G; Pohjola, M V; Tijhuis, M J; Tuomisto, J T; Ueland, Ø; McCarron, P A; Verhagen, H

    2012-01-01

    Benefit-risk assessment in medicine has been a valuable tool in the regulation of medicines since the 1960s. Benefit-risk assessment takes place in multiple stages during a medicine's life-cycle and can be conducted in a variety of ways, using methods ranging from qualitative to quantitative. Each benefit-risk assessment method is subject to its own specific strengths and limitations. Despite its widespread and long-standing use, benefit-risk assessment in medicine remains subject to debate, suffers from a number of limitations, and is still under development. This state-of-the-art review paper discusses the various aspects of and approaches to benefit-risk assessment in medicine in a chronological pathway. The review discusses all types of benefit-risk assessment a medicinal product undergoes during its lifecycle, from Phase I clinical trials to post-marketing surveillance and health technology assessment for inclusion in public formularies. The benefit-risk profile of a drug is dynamic and differs for different indications and patient groups. At the end of this review we conclude that benefit-risk analysis in medicine is a developed practice that is subject to continuous improvement and modernisation. Improvement not only in methodology, but also in cooperation between organizations, can improve benefit-risk assessment. PMID:21683115

  9. Germany wide seasonal flood risk analysis for agricultural crops

    NASA Astrophysics Data System (ADS)

    Klaus, Stefan; Kreibich, Heidi; Kuhlmann, Bernd; Merz, Bruno; Schröter, Kai

    2016-04-01

    In recent years, large-scale flood risk analysis and mapping have gained attention. Regional to national risk assessments are needed, for example, for national risk policy development, for large-scale disaster management planning, and in the (re-)insurance industry. Despite increasing requests for comprehensive risk assessments, some sectors have received little scientific attention; one of these is the agricultural sector. In contrast to other sectors, agricultural crop losses depend strongly on the season. Flood probability also shows seasonal variation. Thus, the temporal superposition of high flood susceptibility of crops and high flood probability plays an important role in agricultural flood risk. To investigate this interrelation and provide a large-scale overview of agricultural flood risk in Germany, an agricultural crop loss model is used for crop susceptibility analyses, and Germany-wide seasonal flood-frequency analyses are undertaken to derive seasonal flood patterns. As a result, a Germany-wide map of agricultural flood risk is presented, along with the crop type most at risk in each region. The risk maps may provide guidance for coordinated, federal-state-wide designation of retention areas.
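
    The seasonal superposition at the core of the analysis can be sketched as a monthly expected-damage sum. The monthly flood probabilities and the crop-loss curve (peaking near harvest) below are invented values for a single crop and location.

    ```python
    import numpy as np

    p_flood = np.array([.02, .02, .03, .04, .05, .08,   # monthly flood probability
                        .07, .05, .03, .02, .02, .02])
    crop_loss = np.array([0, 0, 10, 30, 60, 90,         # loss if flooded (EUR/ha),
                          100, 80, 40, 5, 0, 0.])       # peaks near harvest

    expected_damage = (p_flood * crop_loss).sum()
    peak_month = int(np.argmax(p_flood * crop_loss)) + 1
    print(f"expected annual damage: {expected_damage:.1f} EUR/ha; "
          f"riskiest month: {peak_month}")
    ```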

  10. Geotechnical risk analysis by flat dilatometer (DMT)

    NASA Astrophysics Data System (ADS)

    Amoroso, Sara; Monaco, Paola

    2015-04-01

    In recent decades there has been a massive migration from laboratory testing to in situ testing, to the point that, today, in situ testing is often the major part of a geotechnical investigation. The state of the art indicates that direct-push in situ tests, such as the Cone Penetration Test (CPT) and the Flat Dilatometer Test (DMT), are fast and convenient tests for routine site investigation. In most cases the DMT-estimated parameters, in particular the undrained shear strength su and the constrained modulus M, are used with the common design methods of geotechnical engineering for evaluating bearing capacity, settlements, etc. The paper focuses on the prediction of settlements of shallow foundations, probably the No. 1 application of the DMT, especially in sands, where undisturbed samples cannot be retrieved, and on the risk associated with their design. A compilation of documented case histories comparing DMT-predicted and observed settlements was collected by Monaco et al. (2006), indicating that, in general, the constrained modulus M can be considered a reasonable "operative modulus" (relevant to foundations in "working conditions") for settlement predictions based on the traditional linear elastic approach. Indeed, the use of a site investigation method, such as the DMT, that improves the accuracy of design parameters reduces risk, and the design can then center on the site's true soil variability without parasitic test variability. In this respect, Failmezger et al. (1999, 2015) suggested introducing the Beta probability distribution, which provides a realistic and useful description of variability for geotechnical design problems. The paper estimates Beta probability distributions at research sites where DMT tests and observed settlements are available. References Failmezger, R.A., Rom, D., Ziegler, S.R. (1999). "SPT? A better approach of characterizing residual soils using other in-situ tests", Behavioral Characteristics of Residual Soils, B
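
    A sketch of the settlement calculation discussed (one-dimensional, linear elastic, using the DMT constrained modulus M), with a Beta-distributed modulus multiplier to express parameter variability in the spirit of Failmezger et al. The layer geometry, stress increments, moduli, and Beta parameters are illustrative, not from the cited case histories.

    ```python
    import numpy as np
    from scipy.stats import beta

    dz = np.array([1.0, 1.0, 2.0])          # sublayer thicknesses (m)
    dsigma = np.array([80.0, 60.0, 40.0])   # stress increments (kPa)
    M = np.array([20e3, 15e3, 25e3])        # DMT constrained moduli (kPa)

    s = (dsigma / M * dz).sum() * 1000.0    # s = sum(dsigma/M * dz), in mm
    print(f"deterministic settlement: {s:.1f} mm")

    # Beta-distributed modulus multiplier on [0.5, 1.5] around the DMT estimate
    mult = beta(a=4, b=4, loc=0.5, scale=1.0).rvs(size=(50_000, 3),
                                                  random_state=1)
    s_mc = (dsigma / (M * mult) * dz).sum(axis=1) * 1000.0
    print(f"90th-percentile settlement: {np.quantile(s_mc, 0.9):.1f} mm")
    ```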

  11. Risk analysis and safety policy developments in the Netherlands.

    PubMed

    Bottelberghs, P H

    2000-01-01

    In the Netherlands, external safety policy has been developed and implemented since the early eighties on the basis of a risk-based approach involving quantitative criteria for the tolerability of risk. Good experience has been gained with the risk policy, which applies to some 4000 establishments in the Netherlands where hazardous substances are present. On the basis of this experience, legislation is now being prepared to give the risk tolerability criteria a full legal basis. This is aimed, in particular, at balancing risk control measures at the source, through the licensing system, with spatial planning instruments that protect, e.g., residential areas against major hazards. The revision of the Seveso directive (96/82/EC) leads to the implementation of an integrated form of safety reporting, evaluation and inspection. Practical tools were developed for this implementation, e.g., for facilitating the selection of establishments and for assessing risks from major hazard establishments to surface water. In the past few years, the application of risk-based safety policy has been extended to fields other than establishments, e.g., transport of hazardous chemicals and external safety of airports. PMID:10677654

  12. Analysis of some meteorological variables time series relevant in urban environments by applying the multifractal analysis

    NASA Astrophysics Data System (ADS)

    Pavon-Dominguez, Pablo; Ariza-Villaverde, Ana B.; Jimenez-Hornero, Francisco J.; Gutierrez de Rave, Eduardo

    2010-05-01

    surface wind direction and mean temperature, according to the accumulation of points at the extremes of the spectra's right tails. This confirms that knowledge of the relationships between the multifractal parameters helps to complete the information regarding the influence of rare values in the time series. With respect to the relationships between the parameters of the multifractal spectra and those calculated from the descriptive statistics of the meteorological variables considered here, a strong correlation was detected between the rare high values, represented by the extreme points in the spectra's left tails, and the leptokurtic shape of the frequency distributions. In addition, for the same rare high values, a significant negative correlation with the coefficients of variation was found. The spectra's left tails, corresponding to high values in the time series, exhibited greater amplitudes for those variable distributions that showed higher dispersion and positive coefficients of skewness. Multifractal analysis has shown itself to be a suitable and efficient approach to characterizing the most important meteorological variables affecting the urban environment, providing information that can be applied to increase knowledge of urban climate dynamics.

  13. Dynamic optimization of ISR sensors using a risk-based reward function applied to ground and space surveillance scenarios

    NASA Astrophysics Data System (ADS)

    DeSena, J. T.; Martin, S. R.; Clarke, J. C.; Dutrow, D. A.; Newman, A. J.

    2012-06-01

    As the number and diversity of sensing assets available for intelligence, surveillance and reconnaissance (ISR) operations continue to expand, the limited ability of human operators to effectively manage, control and exploit the ISR ensemble is exceeded, leading to reduced operational effectiveness. Automated support, both in the processing of voluminous sensor data and in sensor asset control, can relieve the burden on human operators and support operation of larger ISR ensembles. In dynamic environments it is essential to react quickly to current information to avoid stale, sub-optimal plans. Our approach is to apply the principles of feedback control to ISR operations, "closing the loop" from the sensor collections through automated processing to ISR asset control. Previous work by the authors demonstrated non-myopic multiple-platform trajectory control using a receding horizon controller in a closed feedback loop with a multiple hypothesis tracker, applied to multi-target search-and-track simulation scenarios in the ground and space domains. This paper presents extensions in both size and scope of the previous work, demonstrating closed-loop control, involving both platform routing and sensor pointing, of a multi-sensor, multi-platform ISR ensemble tasked with providing situational awareness and performing search, track and classification of multiple moving ground targets in irregular warfare scenarios. The closed-loop ISR system is fully realized using distributed, asynchronous components that communicate over a network. The closed-loop ISR system has been exercised via a networked simulation test bed against a scenario in the Afghanistan theater implemented using high-fidelity terrain and imagery data. In addition, the system has been applied to space surveillance scenarios requiring tracking of space objects, where current deliberative, manually intensive processes for managing sensor assets are insufficiently responsive. Simulation experiment results are presented

  14. Ecological risk assessment of the antibiotic enrofloxacin applied to Pangasius catfish farms in the Mekong Delta, Vietnam.

    PubMed

    Andrieu, Margot; Rico, Andreu; Phu, Tran Minh; Huong, Do Thi Thanh; Phuong, Nguyen Thanh; Van den Brink, Paul J

    2015-01-01

    Antibiotics applied in aquaculture production may be released into the environment and contribute to the deterioration of surrounding aquatic ecosystems. In the present study, we assessed the ecological risks posed by the use of the antibiotic enrofloxacin (ENR), and its main metabolite ciprofloxacin (CIP), in a Pangasius catfish farm in the Mekong Delta region, Vietnam. Water and sediment samples were collected in a stream receiving effluents from a Pangasius catfish farm that had applied ENR. The toxicity of ENR and CIP was assessed on three tropical aquatic species: the green alga Chlorella sp. (72-h growth inhibition test), the micro-invertebrate Moina macrocopa (48-h immobilization test), and the Nile tilapia (Oreochromis niloticus). The toxic effects on O. niloticus were evaluated by measuring the cholinesterase (ChE) and catalase (CAT) activities in the fish brain and muscles, respectively, considering feed exposure and water exposure separately. Ecological risks were assessed by comparing maximum exposure concentrations with predicted no-effect concentrations for cyanobacteria, green algae, invertebrates and fish derived from available toxicity data. The results of this study showed that maximum antibiotic concentrations in Pangasius catfish farm effluents were 0.68 μg L⁻¹ for ENR and 0.25 μg L⁻¹ for CIP (dissolved water concentrations). Antibiotics accumulated in sediments downstream of the effluent discharge point at concentrations up to 2590 μg kg⁻¹ d.w. and 592 μg kg⁻¹ d.w. for ENR and CIP, respectively. The calculated EC50 values for ENR and CIP were 111000 and 23000 μg L⁻¹ for Chlorella sp., and 69000 and 71000 μg L⁻¹ for M. macrocopa, respectively. Significant effects on the ChE and CAT enzymatic activities of O. niloticus were observed at 5 g kg⁻¹ feed and 400-50000 μg L⁻¹, for both antibiotics. The results of the ecological risk assessment performed in this study indicated only minor risks for cyanobacteria
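
    To make the final step concrete, the Python sketch below reproduces the risk-quotient arithmetic on the concentrations reported above; the assessment factor of 1000 applied to the lowest EC50 is an illustrative assumption, not the authors' exact derivation of the predicted no-effect concentrations.

    # Risk-quotient sketch for the ENR/CIP exposure data reported above.
    # The assessment factor of 1000 on the lowest acute EC50 is a common
    # default for deriving a PNEC, used here purely for illustration.
    mec = {"ENR": 0.68, "CIP": 0.25}   # measured effluent concentrations (ug/L)

    ec50 = {                            # acute EC50 values (ug/L) from the abstract
        "ENR": {"Chlorella sp.": 111_000, "M. macrocopa": 69_000},
        "CIP": {"Chlorella sp.": 23_000, "M. macrocopa": 71_000},
    }

    ASSESSMENT_FACTOR = 1000            # assumed safety factor on the lowest EC50

    for antibiotic, by_species in ec50.items():
        pnec = min(by_species.values()) / ASSESSMENT_FACTOR
        rq = mec[antibiotic] / pnec     # risk quotient; RQ >= 1 flags potential risk
        print(f"{antibiotic}: PNEC = {pnec:.1f} ug/L, RQ = {rq:.3f}")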

  15. Structural reliability analysis and seismic risk assessment

    SciTech Connect

    Hwang, H.; Reich, M.; Shinozuka, M.

    1984-01-01

    This paper presents a reliability analysis method for the safety evaluation of nuclear structures. By utilizing this method, it is possible to estimate the limit state probability over the lifetime of structures and to generate analytically the fragility curves for PRA studies. The earthquake ground acceleration, in this approach, is represented by a segment of a stationary Gaussian process with zero mean and a Kanai-Tajimi spectrum. The overall seismic hazard at a site, represented by a hazard curve, is also taken into consideration. Furthermore, the limit state of a structure is analytically defined and the corresponding limit state surface is then established. Finally, the fragility curve is generated and the limit state probability is evaluated. In this paper, using a realistic reinforced concrete containment as an example, results of the reliability analysis of the containment subjected to dead load, live load and earthquake ground acceleration are presented, and a fragility curve for PRA studies is also constructed.
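
    For readers unfamiliar with fragility curves, the short Python sketch below evaluates the common two-parameter lognormal form; the median capacity and dispersion are placeholder values, not the containment parameters derived in the paper, which obtains its curves from an analytical limit-state formulation.

    from math import erf, log, sqrt

    def lognormal_fragility(pga, median_capacity=0.9, beta=0.35):
        """P(limit state | PGA = pga), two-parameter lognormal form."""
        z = log(pga / median_capacity) / beta
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))

    for a in (0.2, 0.4, 0.6, 0.9, 1.2):
        print(f"PGA = {a:.1f} g: P(limit state) = {lognormal_fragility(a):.3f}")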

  16. State of the art in benefit-risk analysis: introduction.

    PubMed

    Verhagen, H; Tijhuis, M J; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken, G; Pohjola, M V; Tuomisto, J T; Ueland, Ø; White, B C; Holm, F

    2012-01-01

    Risk-taking is normal in everyday life if there are associated (perceived) benefits. Benefit-Risk Analysis (BRA) compares the risk of a situation to its related benefits and addresses the acceptability of the risk. Over the past years, BRA in relation to food and food ingredients has gained attention. Food, and even the same food ingredient, may confer both beneficial and adverse effects. Measures directed at food safety may lead to suboptimal or insufficient levels of ingredients from a benefit perspective. In BRA, benefits and risks of food (ingredients) are assessed in one go and may conditionally be expressed in one currency. This allows the comparison of adverse and beneficial effects to be both qualitative and quantitative. A BRA should help policy-makers to make more informed and balanced benefit-risk management decisions. Not allowing food benefits to occur in order to guarantee food safety is a risk management decision, much the same as accepting some risk in order to achieve more benefits. BRA in food and nutrition is making progress, but difficulties remain. The field may benefit from looking across its borders to learn from other research areas. The BEPRARIBEAN project (Best Practices for Risk-Benefit Analysis: experience from out of food into food; http://en.opasnet.org/w/Bepraribean) aims to do so by working together with the fields of Medicines, Food Microbiology, Environmental Health, Economics & Marketing-Finance and Consumer Perception. All perspectives are reviewed and subsequently integrated to identify opportunities for further development of BRA for food and food ingredients. Interesting issues that emerge are the varying degrees of risk that are deemed acceptable within the areas and the trend towards more open and participatory BRA processes. A set of 6 'state of the art' papers covering the above areas and a paper integrating the separate (re)views are published in this volume. PMID:21679738

  17. Credibility analysis of risk classes by generalized linear model

    NASA Astrophysics Data System (ADS)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, the generalized linear model (GLM) and credibility theory, which are frequently used in nonlife insurance pricing, are combined for credibility analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data of a Turkish insurance company, and the results for the credible risk classes are interpreted.
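
    The limited fluctuation side of the combination can be sketched compactly. Under a Poisson claim-frequency assumption, the full credibility standard and the square-root partial credibility rule take the textbook form below; the tolerance k and coverage probability p are conventional defaults, not necessarily the values used in the paper.

    from scipy.stats import norm

    def full_credibility_standard(k=0.05, p=0.90):
        z = norm.ppf((1 + p) / 2)      # two-sided standard normal quantile
        return (z / k) ** 2            # expected claims needed for full credibility

    def credibility_factor(n_claims, k=0.05, p=0.90):
        # Square-root rule for partial credibility, capped at 1
        return min(1.0, (n_claims / full_credibility_standard(k, p)) ** 0.5)

    n_full = full_credibility_standard()
    print(f"full credibility at about {n_full:.0f} expected claims")
    for n in (200, 500, 1100, 2000):
        print(f"n = {n:4d}: Z = {credibility_factor(n):.3f}")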

  18. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, processes and their characteristics, modeling approaches, as well as incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, consisting of ArcGIS as the central base complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be developed newly; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flows, rockfalls, landslides, avalanches and floods are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future, further processes and scales can be included and the instrument thus adapted to any study site.

  19. Uncertainty propagation analysis applied to volcanic ash dispersal at Mt. Etna by using a Lagrangian model

    NASA Astrophysics Data System (ADS)

    de'Michieli Vitturi, Mattia; Pardini, Federica; Spanu, Antonio; Neri, Augusto; Vittoria Salvetti, Maria

    2015-04-01

    Volcanic ash clouds represent a major hazard for populations living near volcanic centers, producing a risk for humans and a potential threat to crops, ground infrastructure, and aviation traffic. Lagrangian particle dispersal models are commonly used for tracking ash particles emitted from volcanic plumes and transported under the action of atmospheric wind fields. In this work, we present the results of an uncertainty propagation analysis applied to volcanic ash dispersal from weak plumes, with specific focus on the uncertainties related to the grain-size distribution of the mixture. To this aim, the Eulerian fully compressible mesoscale non-hydrostatic model WRF was used to generate the driving wind, representative of the atmospheric conditions occurring during the event of November 24, 2006 at Mt. Etna. Then, the Lagrangian particle model LPAC (de' Michieli Vitturi et al., JGR 2010) was used to simulate the transport of mass particles under the action of atmospheric conditions. The particle motion equations were derived by expressing the Lagrangian particle acceleration as the sum of the forces acting along its trajectory, with drag forces calculated as a function of particle diameter, density, shape and Reynolds number. The simulations were representative of weak plume events of Mt. Etna and aimed to quantify the effect on the dispersal process of the uncertainty in the particle sphericity and in the mean and variance of a log-normal distribution function describing the grain-size of ash particles released from the eruptive column. In order to analyze the sensitivity of particle dispersal to these uncertain parameters with a reasonable number of simulations, and therefore with affordable computational costs, response surfaces in the parameter space were built by using the generalized polynomial chaos technique. The uncertainty analysis made it possible to quantify the most probable values, as well as their pdf, of the number of particles as well as of the mean and
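
    The response-surface idea can be miniaturized as follows: fit a polynomial surrogate to a handful of model runs, then propagate the input uncertainty by sampling the cheap surrogate. In this Python sketch the "expensive model" is a placeholder function, and the degree-3 Legendre basis over inputs scaled to [-1, 1] is an illustrative choice, not the actual WRF/LPAC chain or the study's gPC setup.

    import numpy as np
    from numpy.polynomial import legendre

    rng = np.random.default_rng(0)

    def expensive_model(sphericity, grain_size):
        # Placeholder for a full Lagrangian dispersal run (purely illustrative)
        return np.exp(-2.0 * sphericity) * (1.0 + 0.5 * np.sin(3.0 * grain_size))

    def design_matrix(x, deg=3):
        # Tensor Legendre basis with total degree <= deg on inputs in [-1, 1]
        cols = []
        for i in range(deg + 1):
            for j in range(deg + 1 - i):
                ci = np.zeros(i + 1); ci[-1] = 1.0
                cj = np.zeros(j + 1); cj[-1] = 1.0
                cols.append(legendre.legval(x[:, 0], ci) * legendre.legval(x[:, 1], cj))
        return np.column_stack(cols)

    # A small "design of experiments": 40 model runs at random input points
    train = rng.uniform(-1.0, 1.0, size=(40, 2))
    y = expensive_model(train[:, 0], train[:, 1])
    coef, *_ = np.linalg.lstsq(design_matrix(train), y, rcond=None)

    # Cheap Monte Carlo on the surrogate to estimate output statistics
    samples = rng.uniform(-1.0, 1.0, size=(100_000, 2))
    pred = design_matrix(samples) @ coef
    print(f"surrogate output: mean = {pred.mean():.4f}, std = {pred.std():.4f}")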

  20. Risk Analysis and Decision Making FY 2013 Milestone Report

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward; Thompson, J.

    2013-06-01

    Risk analysis and decision making is one of the critical objectives of CCSI, which seeks to use information from science-based models with quantified uncertainty to inform decision makers who are making large capital investments. The goal of this task is to develop tools and capabilities to facilitate the development of risk models tailored for carbon capture technologies, quantify the uncertainty of model predictions, and estimate the technical and financial risks associated with the system. This effort aims to reduce costs by identifying smarter demonstrations, which could accelerate development and deployment of the technology by several years.

  1. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    PubMed

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs about these variables and their effects on decision-making, behaviour, and the aetiology of disordered gambling. Fifteen regular lay gamblers (non-problem/low-risk, moderate-risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data were compared to an expert 'map' of risk perception to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify the concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of the expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases. PMID:24402720

  2. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated using pilots' experience at the time when they were included into the standard documents. As a result, some of these standards may be overestimated while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is no published evidence, however, regarding the level of safety provided by the existing OLS standards. Moreover, the rationale used by ICAO to establish the existing OLS standards is not readily available in the standard documents. This study therefore collects actual flight path data from air traffic control radars and constructs a methodology to assess the probability of aircraft deviating from their intended/protected path. An extension of the developed methodology can be used to estimate OLS dimensions that provide an acceptable safety level for aircraft operations. This will be helpful for estimating safe and efficient standard dimensions of the OLS and for assessing the risk level posed by objects to aircraft operations around airports. In order to assess the existing standards and demonstrate the applications of the methodology, three case studies were conducted using aircraft data collected from Ottawa (CYOW), Calgary (CYYC) and Edmonton (CYEG) International Airports.
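
    One way to make the deviation-probability idea concrete is to fit a heavy-tailed distribution to observed lateral deviations and read off exceedance probabilities at candidate surface offsets. The Python sketch below does this with a generalized Pareto fit on synthetic stand-in data; the distribution choice, the offsets and the data are all assumptions for illustration, not the study's methodology.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    # Stand-in for radar-derived lateral deviations from the intended path (m)
    deviations_m = np.abs(rng.standard_t(df=4, size=5000)) * 40.0

    c, loc, scale = stats.genpareto.fit(deviations_m)
    for offset in (150.0, 300.0, 600.0):
        p = stats.genpareto.sf(offset, c, loc=loc, scale=scale)
        print(f"P(lateral deviation > {offset:.0f} m) ~ {p:.2e}")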

  3. Risk analysis. HIV / AIDS country profile: Mozambique.

    PubMed

    1996-12-01

    Mozambique's National STD/AIDS Control Program (NACP) estimates that, at present, about 8% of the population is infected with human immunodeficiency virus (HIV). The epidemic is expected to peak in 1997. By 2001, Mozambique is projected to have 1,650,000 HIV-positive adults 15-49 years of age, of whom 500,000 will have developed acquired immunodeficiency syndrome (AIDS), and 500,000 AIDS orphans. Incidence rates are highest in the country's central region, the transport corridors, and urban centers. The rapid spread of HIV has been facilitated by extreme poverty, the social upheaval and erosion of traditional norms created by years of political conflict and civil war, destruction of the primary health care infrastructure, growth of the commercial sex work trade, and labor migration to and from neighboring countries with high HIV prevalence. Moreover, about 10% of the adult population suffers from sexually transmitted diseases (STDs), including genital ulcers. NACP, created in 1988, is attempting to curb the further spread of HIV through education aimed at changing high-risk behaviors and condom distribution to prevent STD transmission. Theater performances and radio/television programs are used to reach the large illiterate population. The integration of sex education and STD/AIDS information in the curricula of primary and secondary schools and universities has been approved by the Ministry of Education. Several private companies have been persuaded to distribute condoms to their employees. Finally, the confidentiality of HIV patients has been guaranteed. In 1993, the total AIDS budget was US $1.67 million, 50% of which was provided by the European Union. The European Commission seeks to develop a national strategy for managing STDs within the primary health care system. PMID:12320532

  4. Predictive Validity of Pressure Ulcer Risk Assessment Tools for Elderly: A Meta-Analysis.

    PubMed

    Park, Seong-Hi; Lee, Young-Shin; Kwon, Young-Mi

    2016-04-01

    Preventing pressure ulcers is one of the most challenging goals for today's health care providers. Tools currently used to assess the risk of pressure ulcer development have rarely been evaluated for predictive accuracy, especially in older adults. The current study provides a systematic review and meta-analysis of 29 studies using three pressure ulcer risk assessment tools: the Braden, Norton, and Waterlow Scales. The overall predictive validities of pressure ulcer risk, in terms of pooled sensitivity and specificity, lay in a similar range with a moderate level of accuracy for all three scales, while heterogeneity showed more than 80% variability among studies. The studies applying the Braden Scale used five different cut-off points, the primary cause of heterogeneity. The results indicate that commonly used screening tools for pressure ulcer risk have limitations regarding validity and accuracy for use with older adults due to heterogeneity among studies. PMID:26337859

  5. Hazmat transport: a methodological framework for the risk analysis of marshalling yards.

    PubMed

    Cozzani, Valerio; Bonvicini, Sarah; Spadoni, Gigliola; Zanelli, Severino

    2007-08-17

    A methodological framework was outlined for the comprehensive risk assessment of marshalling yards in the context of quantified area risk analysis. Three accident typologies were considered for yards: (i) "in-transit-accident-induced" releases; (ii) "shunting-accident-induced" spills; and (iii) "non-accident-induced" leaks. A specific methodology was developed for the assessment of expected release frequencies and equivalent release diameters, based on the application of HazOp and Fault Tree techniques to reference schemes defined for the more common types of railcar vessels used for "hazmat" transportation. The approach was applied to the assessment of an extended case study. The results showed that "non-accident-induced" leaks in marshalling yards represent an important contribution to the overall risk associated with these zones. Furthermore, the results confirmed the considerable contribution of these fixed installations to the overall risk associated with "hazmat" transportation. PMID:17418942

  6. Approach to proliferation risk assessment based on multiple objective analysis framework

    SciTech Connect

    Andrianov, A.; Kuptsov, I.

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  7. Risk Analysis Methodology for Kistler's K-1 Reusable Launch Vehicle

    NASA Astrophysics Data System (ADS)

    Birkeland, Paul W.

    2002-01-01

    Missile risk analysis methodologies were originally developed in the 1940s as the military experimented with intercontinental ballistic missile (ICBM) technology. As the range of these missiles increased, it became apparent that some means of assessing the risk posed to neighboring populations was necessary to gauge the relative safety of a given test. There were many unknowns at the time, and technology was unpredictable at best. Risk analysis itself was in its infancy. Uncertainties in technology and methodology led to an ongoing bias toward conservative assumptions to adequately bound the problem. This methodology ultimately became the Casualty Expectation Analysis that is used to license Expendable Launch Vehicles (ELVs). A different risk analysis approach was adopted by the commercial aviation industry in the 1950s. At the time, commercial aviation technology was more firmly in hand than ICBM technology. Consequently commercial aviation risk analysis focused more closely on the hardware characteristics. Over the years, this approach has enabled the advantages of technological and safety advances in commercial aviation hardware to manifest themselves in greater capabilities and opportunities. The Boeing 777, for example, received approval for trans-oceanic operations "out of the box," where all previous aircraft were required, at the very least, to demonstrate operations over thousands of hours before being granted such approval. This "out of the box" approval is likely to become standard for all subsequent designs. In short, the commercial aircraft approach to risk analysis created a more flexible environment for industry evolution and growth. In contrast, the continued use of the Casualty Expectation Analysis by the launch industry is likely to hinder industry maturation. It likely will cause any safety and reliability gains incorporated into RLV design to be masked by the conservative assumptions made to "bound the problem." Consequently, for the launch

  8. Thorough approach to measurement uncertainty analysis applied to immersed heat exchanger testing

    SciTech Connect

    Farrington, R B; Wells, C V

    1986-04-01

    This paper discusses the value of an uncertainty analysis, discusses how to determine measurement uncertainty, and then details the sources of error in instrument calibration, data acquisition, and data reduction for a particular experiment. Methods are discussed to determine both the systematic (or bias) error in an experiment as well as to determine the random (or precision) error in the experiment. The detailed analysis is applied to two sets of conditions in measuring the effectiveness of an immersed coil heat exchanger. It shows the value of such analysis as well as an approach to reduce overall measurement uncertainty and to improve the experiment. This paper outlines how to perform an uncertainty analysis and then provides a detailed example of how to apply the methods discussed in the paper. The authors hope this paper will encourage researchers and others to become more concerned with their measurement processes and to report measurement uncertainty with all of their test results.
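
    The combination step described above is conventionally a root-sum-square. The Python sketch below follows the ASME-style pattern of combining elemental bias and precision terms and then forming an overall uncertainty with a coverage factor of about 2; the component values are placeholders, not the heat-exchanger figures from the paper.

    from math import sqrt

    def rss(components):
        """Root-sum-square combination of independent error components."""
        return sqrt(sum(c * c for c in components))

    # Illustrative component errors (percent of reading) for one measurement
    bias = [0.25, 0.10, 0.15]        # calibration, installation, data reduction
    precision = [0.20, 0.05]         # scatter from repeated observations

    B, S = rss(bias), rss(precision)
    U = sqrt(B**2 + (2.0 * S)**2)    # overall uncertainty, coverage factor ~2
    print(f"bias = {B:.3f}%, precision = {S:.3f}%, overall U = {U:.3f}%")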

  9. Hip fracture risk estimation based on principal component analysis of QCT atlas: a preliminary study

    NASA Astrophysics Data System (ADS)

    Li, Wenjun; Kornak, John; Harris, Tamara; Lu, Ying; Cheng, Xiaoguang; Lang, Thomas

    2009-02-01

    We aim to capture and apply 3-dimensional bone fragility features for fracture risk estimation. Using inter-subject image registration, we constructed a hip QCT atlas comprising 37 patients with hip fractures and 38 age-matched controls. In the hip atlas space, we performed principal component analysis to identify the principal components (eigen images) that showed association with hip fracture. To develop and test a hip fracture risk model based on the principal components, we randomly divided the 75 QCT scans into two groups, one serving as the training set and the other as the test set. We applied this model to estimate a fracture risk index for each test subject, and used the fracture risk indices to discriminate between the fracture patients and controls. To evaluate the fracture discrimination efficacy, we performed ROC analysis and calculated the AUC (area under the curve). When using the first group as the training group and the second as the test group, the AUC was 0.880, compared to conventional fracture risk estimation methods based on bone densitometry, which had AUC values ranging between 0.782 and 0.871. When using the second group as the training group, the AUC was 0.839, compared to densitometric methods with AUC values ranging between 0.767 and 0.807. Our results demonstrate that principal components derived from the hip QCT atlas are associated with hip fracture. Use of such features may provide new quantitative measures of interest for osteoporosis research.
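
    The pipeline (principal components as features, a split into training and test groups, ROC/AUC scoring) can be mimicked on synthetic data in a few lines of Python; the latent-mode data generator, component count and classifier below are illustrative assumptions, not the study's registered QCT volumes or exact model.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n, p, k = 75, 500, 5                 # 75 subjects, 500 stand-in voxel features
    Z = rng.normal(size=(n, k))          # latent structural modes of variation
    X = Z @ rng.normal(size=(k, p)) + rng.normal(scale=0.5, size=(n, p))
    y = (Z[:, 0] + rng.normal(scale=0.8, size=n) > 0).astype(int)  # fracture label

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

    pca = PCA(n_components=10).fit(X_tr)             # "eigen images" analogue
    clf = LogisticRegression(max_iter=1000).fit(pca.transform(X_tr), y_tr)

    risk_index = clf.predict_proba(pca.transform(X_te))[:, 1]
    print(f"AUC = {roc_auc_score(y_te, risk_index):.3f}")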

  10. 2007 Wholesale Power Rate Case Initial Proposal : Risk Analysis Study.

    SciTech Connect

    United States. Bonneville Power Administration.

    2005-11-01

    The Federal Columbia River Power System (FCRPS), operated on behalf of the ratepayers of the PNW by BPA and other Federal agencies, faces many uncertainties during the FY 2007-2009 rate period. Among these uncertainties, the largest revolve around hydro conditions, market prices and river operations for fish recovery. In order to provide a high probability of making its U.S. Treasury payments, BPA performs a Risk Analysis as part of its rate-making process. In this Risk Analysis, BPA identifies key risks, models their relationships, and then analyzes their impacts on net revenues (total revenues less expenses). BPA subsequently evaluates, in the ToolKit Model, the Treasury Payment Probability (TPP) resulting from the rates, risks, and risk mitigation measures described here and in the Wholesale Power Rate Development Study (WPRDS). If the TPP falls short of BPA's standard, additional risk mitigation revenues, such as PNRR and CRAC revenues, are incorporated in the modeling in ToolKit until the TPP standard is met. Increased wholesale market price volatility and six years of drought have significantly changed the profile of risk and uncertainty facing BPA and its stakeholders. These present new challenges for BPA in its effort to keep its power rates as low as possible while fully meeting its obligations to the U.S. Treasury. As a result, the risk that BPA faces of not receiving the level of secondary revenues that have been credited to power rates before those funds are received is greater. In addition to market price volatility, BPA also faces uncertainty around the financial impacts of operations for fish programs in FY 2006 and in the FY 2007-2009 rate period. A new Biological Opinion or possible court-ordered change to river operations in FY 2006 through FY 2009 may reduce BPA's net revenues included in the Initial Proposal. Finally, the FY 2007-2009 risk analysis includes new operational risks as well as a more comprehensive analysis of non-operating risks. Both the operational
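
    The TPP calculation can be caricatured as a Monte Carlo count of scenarios in which the Treasury payment is covered in every year of the rate period. The Python sketch below does exactly that with invented distributions and dollar figures; it stands in for, and is much simpler than, BPA's ToolKit modeling.

    import numpy as np

    rng = np.random.default_rng(6)
    n, years = 100_000, 3
    base_revenue = 3000.0                 # $M per year, placeholder
    treasury_payment = 900.0              # $M per year, placeholder

    hydro = rng.normal(0.0, 400.0, (n, years))     # hydro-condition swing ($M)
    price = rng.normal(0.0, 250.0, (n, years))     # market-price swing ($M)
    fish_ops = -rng.uniform(0.0, 300.0, (n, 1))    # fish-operations cost ($M/yr)

    net = base_revenue + hydro + price + fish_ops - 1800.0   # expenses placeholder
    reserves = 1200.0 + np.cumsum(net - treasury_payment, axis=1)
    tpp = (reserves.min(axis=1) >= 0.0).mean()     # payment made in all years
    print(f"Treasury Payment Probability ~ {tpp:.1%}")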

  11. Risk analysis of 222Rn gas received from East Anatolian Fault Zone in Turkey

    NASA Astrophysics Data System (ADS)

    Yilmaz, Mucahit; Kulahci, Fatih

    2016-06-01

    In this study, risk analysis and probability distribution methodologies are applied to 222Rn gas data from the Sürgü (Malatya) station located on the East Anatolian Fault Zone (EAFZ). The 222Rn data were recorded between 21.02.2007 and 06.06.2010. A total of 1151 222Rn measurements were used in the study. Changes in the concentration of 222Rn are modeled statistically.
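
    As an illustration of what "probability distribution methodologies" typically involve here, the Python sketch below fits a lognormal distribution to a radon series and estimates the probability of exceeding an anomaly threshold; the synthetic data, the lognormal choice and the threshold are assumptions, not the Sürgü record or the paper's exact analysis.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    radon = rng.lognormal(mean=3.0, sigma=0.5, size=1151)   # stand-in Bq/m^3 values

    shape, loc, scale = stats.lognorm.fit(radon, floc=0.0)
    threshold = 100.0                                        # assumed anomaly level
    p_exceed = stats.lognorm.sf(threshold, shape, loc, scale)
    print(f"P(222Rn > {threshold:.0f} Bq/m^3) = {p_exceed:.4f}")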

  12. Theoretically Motivated Interventions for Reducing Sexual Risk Taking in Adolescence: A Randomized Controlled Experiment Applying Fuzzy-trace Theory

    PubMed Central

    Reyna, Valerie F.; Mills, Britain A.

    2014-01-01

    Fuzzy-trace theory is a theory of memory, judgment, and decision-making, and their development. We applied advances in this theory to increase the efficacy and durability of a multicomponent intervention to promote risk reduction and avoidance of premature pregnancy and STIs. 734 adolescents from high schools and youth programs in three states (Arizona, Texas, and New York) were randomly assigned to one of three curriculum groups: RTR (Reducing the Risk), RTR+ (a modified version of RTR using fuzzy-trace theory), and a control group. We report effects of curriculum on self-reported behaviors and behavioral intentions plus psychosocial mediators of those effects, namely, attitudes and norms, motives to have sex or get pregnant, self-efficacy and behavioral control, and gist/verbatim constructs. Among 26 outcomes, 19 showed an effect of at least one curriculum relative to the control group: RTR+ produced improvements for 17 outcomes and RTR produced improvements for 12 outcomes. For RTR+, two differences (for perceived parental norms and global benefit perception) were confined to age, gender, or racial/ethnic subgroups. Effects of RTR+ on sexual initiation emerged six months after the intervention, when many adolescents became sexually active. Effects of RTR+ were greater than RTR for nine outcomes, and remained significantly greater than controls at one-year follow-up for 12 outcomes. Consistent with fuzzy-trace theory, results suggest that, by emphasizing gist representations, which are preserved over long time periods and are key memories used in decision-making, the enhanced intervention produced larger and more sustained effects on behavioral outcomes and psychosocial mediators of adolescent risk-taking. PMID:24773191

  13. Theoretically motivated interventions for reducing sexual risk taking in adolescence: a randomized controlled experiment applying fuzzy-trace theory.

    PubMed

    Reyna, Valerie F; Mills, Britain A

    2014-08-01

    Fuzzy-trace theory is a theory of memory, judgment, and decision making, and their development. We applied advances in this theory to increase the efficacy and durability of a multicomponent intervention to promote risk reduction and avoidance of premature pregnancy and sexually transmitted infections. Seven hundred and thirty-four adolescents from high schools and youth programs in 3 states (Arizona, Texas, and New York) were randomly assigned to 1 of 3 curriculum groups: RTR (Reducing the Risk), RTR+ (a modified version of RTR using fuzzy-trace theory), and a control group. We report effects of curriculum on self-reported behaviors and behavioral intentions plus psychosocial mediators of those effects: namely, attitudes and norms, motives to have sex or get pregnant, self-efficacy and behavioral control, and gist/verbatim constructs. Among 26 outcomes, 19 showed an effect of at least 1 curriculum relative to the control group: RTR+ produced improvements for 17 outcomes and RTR produced improvements for 12 outcomes. For RTR+, 2 differences (for perceived parental norms and global benefit perception) were confined to age, gender, or racial/ethnic subgroups. Effects of RTR+ on sexual initiation emerged 6 months after the intervention, when many adolescents became sexually active. Effects of RTR+ were greater than RTR for 9 outcomes, and remained significantly greater than controls at 1-year follow-up for 12 outcomes. Consistent with fuzzy-trace theory, results suggest that by emphasizing gist representations, which are preserved over long periods and are key memories used in decision making, the enhanced intervention produced larger and more sustained effects on behavioral outcomes and psychosocial mediators of adolescent risk taking. PMID:24773191

  14. Bayes Method Plant Aging Risk Analysis

    Energy Science and Technology Software Center (ESTSC)

    1992-03-13

    DORIAN is an integrated package for performing Bayesian aging analysis of reliability data, e.g., for identifying trends in component failure rates and/or outage durations as a function of time. The user must specify several alternative hypothesized aging models (i.e., possible trends) along with prior probabilities indicating the subjective probability that each trend is actually the correct one. DORIAN then uses component failure and/or repair data over time to update these prior probabilities and develop a posterior probability for each aging model, representing the probability that each model is the correct one in light of the observed data rather than a priori. Mean, median, and 5th and 95th percentile trends are also compiled from the posterior probabilities.
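
    The update DORIAN performs can be illustrated with a two-model toy in Python: score each hypothesized trend by its Poisson likelihood on yearly failure counts and renormalize the priors. The models, data and rates below are invented, not DORIAN's actual library of aging models.

    import numpy as np
    from scipy.stats import poisson

    years = np.arange(1, 9)
    failures = np.array([1, 0, 2, 1, 2, 3, 2, 4])   # observed counts per year

    models = {
        "no aging":     lambda t: np.full_like(t, 1.5, dtype=float),  # flat rate
        "linear aging": lambda t: 0.5 + 0.3 * t,                      # rising rate
    }
    prior = {"no aging": 0.5, "linear aging": 0.5}

    # Posterior ~ prior x likelihood, renormalized over the model set
    unnormalized = {name: prior[name] * poisson.pmf(failures, rate(years)).prod()
                    for name, rate in models.items()}
    total = sum(unnormalized.values())
    for name, value in unnormalized.items():
        print(f"P({name} | data) = {value / total:.3f}")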

  15. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process. PMID:21197601

  16. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie Project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system was developed with the aim to analyse the effect of risk reduction planning alternatives on reducing the risk now and in the future, and to support decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in the planning of risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These could be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes, check-dams, etc.), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives will lead to risk reduction under different future scenarios. The SDSS is developed based on open source software and follows open standards, for code as well as for data formats and service interfaces. The architecture of the system is modular. The various parts of the system are loosely coupled, extensible, standards-based for interoperability, flexible and web-based. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows spatial risk analysis to be carried out with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to

  17. Applied Behavior Analysis in Autism Spectrum Disorders: Recent Developments, Strengths, and Pitfalls

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Rieske, Robert; Tureck, Kimberly; Matson, Michael L.

    2012-01-01

    Autism has become one of the most heavily researched topics in the field of mental health and education. While genetics has been the most studied of all topics, applied behavior analysis (ABA) has also received a great deal of attention, and has arguably yielded the most promising results of any research area to date. The current paper provides a…

  18. Using Applied Behaviour Analysis as Standard Practice in a UK Special Needs School

    ERIC Educational Resources Information Center

    Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan

    2015-01-01

    This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…

  19. Applied Behavior Analysis in the Treatment of Severe Psychiatric Disorders: A Bibliography.

    ERIC Educational Resources Information Center

    Scotti, Joseph R.; And Others

    Clinical research in the area of severe psychiatric disorders constituted the major focus for the discipline of applied behavior analysis during the early 1960s. Recently, however, there appears to be a notable lack of a behavioral focus within many inpatient psychiatric settings and a relative dearth of published behavioral treatment studies with…

  20. A Self-Administered Parent Training Program Based upon the Principles of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Maguire, Heather M.

    2012-01-01

    Parents often respond to challenging behavior exhibited by their children in such a way that unintentionally strengthens it. Applied behavior analysis (ABA) is a research-based science that has been proven effective in remediating challenging behavior in children. Although many parents could benefit from using strategies from the field of ABA with…

  1. A Case Study in the Misrepresentation of Applied Behavior Analysis in Autism: The Gernsbacher Lectures

    PubMed Central

    Morris, Edward K

    2009-01-01

    I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life. (Tolstoy, 1894) This article presents a case study in the misrepresentation of applied behavior analysis for autism based on Morton Ann Gernsbacher's presentation of a lecture titled “The Science of Autism: Beyond the Myths and Misconceptions.” Her misrepresentations involve the characterization of applied behavior analysis, descriptions of practice guidelines, reviews of the treatment literature, presentations of the clinical trials research, and conclusions about those trials (e.g., children's improvements are due to development, not applied behavior analysis). The article also reviews applied behavior analysis' professional endorsements and research support, and addresses issues in professional conduct. It ends by noting the deleterious effects that misrepresenting any research on autism (e.g., biological, developmental, behavioral) have on our understanding and treating it in a transdisciplinary context. PMID:22478522

  2. Evolution of Applied Behavior Analysis in the Treatment of Individuals With Autism

    ERIC Educational Resources Information Center

    Wolery, Mark; Barton, Erin E.; Hine, Jeffrey F.

    2005-01-01

    Two issues of each volume of the Journal of Applied Behavior Analysis were reviewed to identify research reports focusing on individuals with autism. The identified articles were analyzed to describe the ages of individuals with autism, the settings in which the research occurred, the nature of the behaviors targeted for intervention, and the…

  3. Lovaas Model of Applied Behavior Analysis. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2010

    2010-01-01

    The "Lovaas Model of Applied Behavior Analysis" is a type of behavioral therapy that initially focuses on discrete trials: brief periods of one-on-one instruction, during which a teacher cues a behavior, prompts the appropriate response, and provides reinforcement to the child. Children in the program receive an average of 35 to 40 hours of…

  4. A National UK Census of Applied Behavior Analysis School Provision for Children with Autism

    ERIC Educational Resources Information Center

    Griffith, G. M.; Fletcher, R.; Hastings, R. P.

    2012-01-01

    Over more than a decade, specialist Applied Behavior Analysis (ABA) schools or classes for children with autism have developed in the UK and Ireland. However, very little is known internationally about how ABA is defined in practice in school settings, the characteristics of children supported in ABA school settings, and the staffing structures…

  5. Applied Behavior Analysis Programs for Autism: Sibling Psychosocial Adjustment during and Following Intervention Use

    ERIC Educational Resources Information Center

    Cebula, Katie R.

    2012-01-01

    Psychosocial adjustment in siblings of children with autism whose families were using a home-based, applied behavior analysis (ABA) program was compared to that of siblings in families who were not using any intensive autism intervention. Data gathered from parents, siblings and teachers indicated that siblings in ABA families experienced neither…

  6. Graphical and Numerical Descriptive Analysis: Exploratory Tools Applied to Vietnamese Data

    ERIC Educational Resources Information Center

    Haughton, Dominique; Phong, Nguyen

    2004-01-01

    This case study covers several exploratory data analysis ideas, the histogram and boxplot, kernel density estimates, the recently introduced bagplot--a two-dimensional extension of the boxplot--as well as the violin plot, which combines a boxplot with a density shape plot. We apply these ideas and demonstrate how to interpret the output from these…

  7. Japanese Encephalitis Risk and Contextual Risk Factors in Southwest China: A Bayesian Hierarchical Spatial and Spatiotemporal Analysis

    PubMed Central

    Zhao, Xing; Cao, Mingqin; Feng, Hai-Huan; Fan, Heng; Chen, Fei; Feng, Zijian; Li, Xiaosong; Zhou, Xiao-Hua

    2014-01-01

    It is valuable to study the spatiotemporal pattern of Japanese encephalitis (JE) and its association with contextual risk factors in southwest China, which is the most endemic area of the country. Using data from 2004 to 2009, we applied GIS mapping and spatial autocorrelation analysis to reported incidence data of JE in 438 counties in southwest China, finding that JE cases were not randomly distributed; a Bayesian hierarchical spatiotemporal model identified the east part of southwest China as a high-risk area. Meanwhile, the Bayesian hierarchical spatial model for 2006 demonstrated a statistically significant association between JE and agricultural and climatic variables, including the proportion of rural population, the pig-to-human ratio, the monthly precipitation and the monthly mean minimum and maximum temperatures. Particular emphasis was placed on the time-lagged effect of the climatic factors. The regression method and the Spearman correlation analysis both identified a two-month lag for precipitation, while the regression method found a one-month lag for temperature. The results show that the high-risk area in the east part of southwest China may be connected to the agricultural and climatic factors. Routine surveillance and the allocation of health resources should be given more attention in this area. Moreover, the meteorological variables might be considered as possible predictors of JE in southwest China. PMID:24739769
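
    The lag screening mentioned above reduces to correlating a shifted climate series against case counts. A minimal Python sketch, on synthetic stand-in series built so that the true lag is two months:

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(2)
    months = 72
    precip = rng.gamma(shape=2.0, scale=50.0, size=months)
    # Synthetic cases driven by precipitation two months earlier
    cases = np.roll(precip, 2) * 0.1 + rng.normal(scale=2.0, size=months)

    for lag in range(0, 4):
        rho, p = spearmanr(precip[: months - lag], cases[lag:])
        print(f"lag = {lag} months: rho = {rho:.2f}, p = {p:.3g}")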

  8. Neural network analysis for evaluating cancer risk in thyroid nodules with an indeterminate diagnosis at aspiration cytology: identification of a low-risk subgroup.

    PubMed

    Ippolito, Antonio M; De Laurentiis, Michelino; La Rosa, Giacomo L; Eleuteri, Antonio; Tagliaferri, Roberto; De Placido, Sabino; Vigneri, Riccardo; Belfiore, Antonino

    2004-12-01

    Thyroid nodules with a predominant follicular structure are often diagnosed as indeterminate at fine-needle aspiration biopsy (FNAB). We studied 453 patients with a thyroid nodule diagnosed as indeterminate at FNAB by using a feed-forward artificial neural network (ANN) analysis to integrate cytologic and clinical data, with the goal of subgrouping patients into high-risk and low-risk categories. Three hundred seventy-one patients were used to train the network and 82 patients were used to validate the model. The cytologic smears were blindly reviewed and classified into high-risk and low-risk subgroups on the basis of standard criteria. Neural network analysis subdivided the 371 lesions of the first series into a high-risk group (cancer rate of approximately 33% at histology) and a low-risk group (cancer rate of 3%). Only cytologic parameters contributed to this classification. Analysis of the receiver operating characteristic (ROC) curves demonstrated that the ANN model discriminated between benign and malignant nodules with higher sensitivity and specificity than standard cytologic criteria (p < 0.001). This performance did not degrade when the ANN predictions were applied to the validation series of 82 nodules. In conclusion, neural network analysis of cytologic data may be a useful tool to refine the risk of cancer in patients with lesions diagnosed as indeterminate by FNAB. PMID:15650360

  9. Risk analysis by FMEA as an element of analytical validation.

    PubMed

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-01

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D), and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated as Risk Priority Numbers (RPNs) = O × D × S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks. PMID:19640668
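
    The RPN bookkeeping is simple enough to show end to end. In the Python sketch below, the failure modes, scores and action threshold are invented examples, not those identified for the NIR procedure:

    # Score each failure mode on occurrence (O), detection (D) and severity (S),
    # rank by RPN = O * D * S, and flag modes above an action threshold.
    failure_modes = [
        # (description,                       O, D, S)
        ("wrong reference spectrum loaded",   3, 7, 9),
        ("sample holder misaligned",          5, 4, 6),
        ("operator mistypes batch number",    6, 8, 5),
        ("lamp intensity drift",              2, 5, 7),
    ]

    ACTION_THRESHOLD = 150   # assumed cutoff for corrective action

    ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
    for desc, o, d, s in ranked:
        rpn = o * d * s
        flag = "  <-- corrective action" if rpn >= ACTION_THRESHOLD else ""
        print(f"RPN = {rpn:3d}  {desc}{flag}")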

  10. State of the art in benefit-risk analysis: food and nutrition.

    PubMed

    Tijhuis, M J; de Jong, N; Pohjola, M V; Gunnlaugsdóttir, H; Hendriksen, M; Hoekstra, J; Holm, F; Kalogeras, N; Leino, O; van Leeuwen, F X R; Luteijn, J M; Magnússon, S H; Odekerken, G; Rompelberg, C; Tuomisto, J T; Ueland, Ø; White, B C; Verhagen, H

    2012-01-01

    Benefit-risk assessment in food and nutrition is relatively new. It weighs the beneficial and adverse effects that a food (component) may have, in order to facilitate more informed management decisions regarding public health issues. It is rooted in the recognition that good food and nutrition can improve health and that some risk may be acceptable if benefit is expected to outweigh it. This paper presents an overview of current concepts and practices in benefit-risk analysis for food and nutrition. It aims to facilitate scientists and policy makers in performing, interpreting and evaluating benefit-risk assessments. Historically, the assessments of risks and benefits have been separate processes. Risk assessment is mainly addressed by toxicology, as demanded by regulation. It traditionally assumes that a maximum safe dose can be determined from experimental studies (usually in animals) and that applying appropriate uncertainty factors then defines the 'safe' intake for human populations. There is a minor role for other research traditions in risk assessment, such as epidemiology, which quantifies associations between determinants and health effects in humans. These effects can be both adverse and beneficial. Benefit assessment is newly developing in regulatory terms, but has been the subject of research for a long time within nutrition and epidemiology. The exact scope is yet to be defined. Reductions in risk can be termed benefits, but also states rising above 'the average health' are explored as benefits. In nutrition, current interest is in 'optimal' intake; from a population perspective, but also from a more individualised perspective. In current approaches to combine benefit and risk assessment, benefit assessment mirrors the traditional risk assessment paradigm of hazard identification, hazard characterization, exposure assessment and risk characterization. Benefit-risk comparison can be qualitative and quantitative. In a quantitative comparison, benefits

  11. A meta-analysis on depression and subsequent cancer risk

    PubMed Central

    2007-01-01

    Background The authors tested the hypothesis that depression is a possible factor influencing the course of cancer by reviewing prospective epidemiological studies and calculating summary relative risks. Methods Studies were identified by computerized searches of Medline, Embase and PsycINFO, as well as manual searches of the reference lists of selected publications. Inclusion criteria were cohort design, population-based sample, structured measurement of depression, and outcome of cancer known for depressed and non-depressed subjects. Results Thirteen eligible studies were identified. Based on eight studies with complete crude data on overall cancer, our summary relative risk (95% confidence interval) was 1.19 (1.06–1.32). After adjustment for confounders we pooled a summary relative risk of 1.12 (0.99–1.26). No significant association was found between depression and subsequent breast cancer risk, based on seven heterogeneous studies, with or without adjustment for possible confounders. Subgroup analysis of studies with a follow-up of ten years or more, however, resulted in a statistically significant summary relative risk of 2.50 (1.06–5.91). No significant associations were found for lung, colon or prostate cancer. Conclusion This review suggests a tendency towards a small and marginally significant association between depression and subsequent overall cancer risk, and towards a stronger increase of breast cancer risk emerging many years after a previous depression. PMID:18053168
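
    The pooling behind such summary estimates is typically inverse-variance weighting of log relative risks. A minimal Python sketch with invented study-level inputs (not the thirteen reviewed studies):

    import math

    # (relative risk, 95% CI lower, 95% CI upper) per study -- illustrative
    studies = [(1.25, 0.95, 1.64), (1.10, 0.90, 1.35), (1.30, 1.02, 1.66)]

    weights, log_rrs = [], []
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE from the CI width
        weights.append(1.0 / se**2)                        # inverse-variance weight
        log_rrs.append(math.log(rr))

    pooled = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    lo, hi = (math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled))
    print(f"summary RR = {math.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")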

  12. Key Attributes of the SAPHIRE Risk and Reliability Analysis Software for Risk-Informed Probabilistic Applications

    SciTech Connect

    Curtis Smith; James Knudsen; Kellie Kvarfordt; Ted Wood

    2008-08-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis methods and solution methods build upon pioneering work done 30 to 40 years ago. We contrast this work with the current capabilities of the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  13. INDEPENDENT COMPONENT ANALYSIS (ICA) APPLIED TO LONG BUNCH BEAMS IN THE LOS ALAMOS PROTON STORAGE RING

    SciTech Connect

    Kolski, Jeffrey S.; Macek, Robert J.; McCrady, Rodney C.; Pang, Xiaoying

    2012-05-14

    Independent component analysis (ICA) is a powerful blind source separation (BSS) method. Compared to the typical BSS method, principal component analysis (PCA), which is the BSS foundation of the well-known model-independent analysis (MIA), ICA is more robust to noise, coupling, and nonlinearity. ICA of turn-by-turn beam position data has been used to measure the transverse betatron phase and amplitude functions, dispersion function, linear coupling, sextupole strength, and nonlinear beam dynamics. We apply ICA in a new way to slices along the bunch and discuss the source signals identified as betatron motion and longitudinal beam structure.
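
    The separation step itself is easy to demonstrate: mix two synthetic source signals, a betatron-like oscillation and a slow drift standing in for longitudinal structure, into several noisy "BPM" channels, and recover them with FastICA. The signals and mixing below are invented, not PSR data.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(3)
    turns = np.arange(2000)
    betatron = np.sin(2 * np.pi * 0.22 * turns)                 # tune ~0.22 oscillation
    drift = np.cumsum(rng.normal(scale=0.01, size=turns.size))  # slow structure
    S = np.column_stack([betatron, drift])

    A = rng.normal(size=(6, 2))                                 # mixing into 6 "BPMs"
    X = S @ A.T + rng.normal(scale=0.05, size=(turns.size, 6))  # noisy readings

    ica = FastICA(n_components=2, max_iter=1000, random_state=0)
    recovered = ica.fit_transform(X)                            # estimated sources
    print("recovered component variances:", recovered.var(axis=0))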

  14. Rocky Flats Plant Live-Fire Range Risk Analysis Report

    SciTech Connect

    Nicolosi, S.L.; Rodriguez, M.A.

    1994-04-01

    The objective of the Live-Fire Range Risk Analysis Report (RAR) is to provide an authorization basis for operation as required by DOE 5480.16. The existing Live-Fire Range does not have a safety analysis-related authorization basis. EG&G Rocky Flats, Inc. has worked with DOE and its representatives to develop a format and content description for development of an RAR for the Live-Fire Range. Development of the RAR is closely aligned with development of the design for a baffle system to control risks from errant projectiles. DOE 5480.16 requires either an RAR or a safety analysis report (SAR) for live-fire ranges. An RAR rather than a SAR was selected in order to gain flexibility to more closely address the safety analysis and conduct of operation needs for a live-fire range in a cost-effective manner.

  15. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    McVeigh, J.; Cohen, J.; Vorum, M.; Porro, G.; Nix, G.

    2007-03-01

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program ('the Program'). The analysis is a task by Princeton Energy Resources International, LLC (PERI), in support of the National Renewable Energy Laboratory (NREL) on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE). This requires both computational development (i.e., creating a spreadsheet-based analysis tool) and a synthesis of judgments by a panel of researchers and experts of the expected results of the Program's R&D.

  16. An improved method for risk evaluation in failure modes and effects analysis of CNC lathe

    NASA Astrophysics Data System (ADS)

    Rachieru, N.; Belu, N.; Anghel, D. C.

    2015-11-01

    Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by multiplying the crisp values of the risk factors: the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O and D. A new risk assessment system based on fuzzy set theory and fuzzy rule-base theory is applied to assess and rank risks associated with failure modes that could appear in the functioning of the Turn 55 Lathe CNC. Two case studies are presented to demonstrate the methodology thus developed. A parallel is drawn between the results obtained by the traditional method and by fuzzy logic for determining the RPNs. The results show that the proposed approach can reduce duplicated RPN numbers and produce a more accurate, reasonable risk assessment. As a result, the stability of the product and process can be assured.
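
    The fuzzy machinery referred to above can be sketched with triangular membership functions: fuzzify a crisp score, clip the output sets by (illustrative) rule activation levels, and defuzzify by centroid. The membership functions and clipping levels below are placeholders, not the paper's rule base for the Turn 55 lathe.

    import numpy as np

    def triangular(x, a, b, c):
        # Membership of x in the triangular fuzzy number (a, b, c)
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    sets = {"low": (0.0, 1.0, 4.0), "medium": (2.0, 5.0, 8.0), "high": (6.0, 9.0, 10.0)}

    # Fuzzify a crisp severity score of 7: degree of membership in each set
    for name, (a, b, c) in sets.items():
        print(f"S = 7 is '{name}' to degree {triangular(7.0, a, b, c):.2f}")

    # Aggregate clipped output sets (illustrative activation levels) and
    # defuzzify the result by its centroid
    x = np.linspace(0.0, 10.0, 1001)
    agg = np.maximum(np.minimum(triangular(x, *sets["medium"]), 0.33),
                     np.minimum(triangular(x, *sets["high"]), 0.67))
    centroid = float((x * agg).sum() / agg.sum())
    print(f"defuzzified risk score = {centroid:.2f}")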

  17. Characterization and evaluation of uncertainty in probabilistic risk analysis

    SciTech Connect

    Parry, G.W.; Winter, P.W.

    1981-01-01

    The sources of uncertainty in probabilistic risk analysis are discussed, using the event- and fault-tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which is, at present, unquantifiable using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events, and a short review is given with some discussion on the representation of ignorance.
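
    As a small worked example of the Bayesian treatment the paper argues for, the sketch below updates a Jeffreys Beta prior on a failure probability after many failure-free trials, a rare-event case where the classical point estimate (0/n = 0) is unhelpful; the counts and prior choice are illustrative.

      # Beta-Binomial update for a rare failure probability: with x
      # failures in n trials and a Jeffreys Beta(0.5, 0.5) prior, the
      # posterior is Beta(0.5 + x, 0.5 + n - x). Counts are illustrative.
      from scipy.stats import beta

      failures, trials = 0, 500
      post = beta(0.5 + failures, 0.5 + trials - failures)
      print(f"posterior mean  : {post.mean():.2e}")    # ~1e-3
      print(f"95% upper bound : {post.ppf(0.95):.2e}")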

  18. A free and open source QGIS plugin for flood risk analysis: FloodRisk

    NASA Astrophysics Data System (ADS)

    Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo

    2016-04-01

    An analysis of global statistics shows a substantial increase in flood damage over the past few decades. Moreover, flood risk is expected to continue to rise due to the combined effect of growing numbers of people and economic assets in risk-prone areas and the effects of climate change. In order to increase the resilience of European economies and societies, the improvement of risk assessment and management has been pursued in recent years. This has resulted in a wide range of flood analysis models of different complexities, with substantial differences in the underlying components needed for their implementation, as geographical, hydrological and social differences demand specific approaches in different countries. At present, there is an emerging need to promote the creation of open, transparent, reliable and extensible tools for comprehensive, context-specific and applicable flood risk analysis. In this context, the free and open-source Quantum GIS (QGIS) plugin "FloodRisk" is a good starting point to address this objective. The vision of the developers of this free and open source software (FOSS) is to combine the main features of state-of-the-art science, collaboration, transparency and interoperability in an initiative to assess and communicate flood risk worldwide and to assist authorities in improving the quality and fairness of flood risk management at multiple scales. Among the scientific community, this type of activity can be labelled as "participatory research", intended as adopting a set of techniques that "are interactive and collaborative" and reproducible, "providing a meaningful research experience that both promotes learning and generates knowledge and research data through a process of guided discovery" (Albano et al., 2015). Moreover, this FOSS geospatial approach can lower the financial barriers to understanding risks at national and sub-national levels through a spatio-temporal domain and can provide better and more complete

  19. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to quantitatively evaluate the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e., the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107
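
    The core quantitative step described, crossing fragility curves with PSHA output, can be sketched as a one-dimensional integration; the hazard-curve fit and the lognormal fragility parameters below are invented for illustration, not the paper's site or tank data.

      # Annual failure frequency for a hypothetical tank: integrate the
      # lognormal fragility curve against the slope of an invented PSHA
      # hazard curve. Median capacity, beta, and hazard fit are made up.
      import numpy as np
      from scipy.stats import lognorm
      from scipy.integrate import trapezoid

      pga = np.linspace(0.05, 2.0, 400)           # peak ground accel. (g)
      haz_exceed = 1e-4 * pga ** (-2.5)           # annual P(PGA > a), toy fit
      frag = lognorm.cdf(pga, s=0.45, scale=0.6)  # P(failure | PGA), toy tank

      # lambda_fail = -integral frag(a) * d(haz_exceed)/da da
      dhaz_da = np.gradient(haz_exceed, pga)
      lam = -trapezoid(frag * dhaz_da, pga)
      print(f"annual failure frequency ~ {lam:.2e} per year")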

  20. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  1. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly-generated EO-based data, and quantitatively characterises vulnerabilities. RASOR also adapts the newly-developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near-real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications at different case study sites are presented to illustrate the platform's potential.

  2. Safety risk analysis of an innovative environmental technology.

    PubMed

    Parnell, G S; Frimpon, M; Barnes, J; Kloeber, J M; Deckro, R E; Jackson, J A

    2001-02-01

    The authors describe a decision and risk analysis performed for the cleanup of a large Department of Energy mixed-waste subsurface disposal area governed by the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). In a previous study, the authors worked with the site decision makers, state regulators, and U.S. Environmental Protection Agency regional regulators to develop a CERCLA-based multiobjective decision analysis value model and used the model to perform a screening analysis of 28 remedial alternatives. The analysis results identified an innovative technology, in situ vitrification, with high effectiveness versus cost. Since this technology had not been used on this scale before, the major uncertainties were contaminant migration and pressure buildup; pressure buildup was a particular concern because of the potential risks to worker safety. With the help of environmental technology experts, changes to the remedial alternative were identified to mitigate the concerns about contaminant migration and pressure buildup. The analysis results showed that the probability of an event posing a risk to worker safety had been significantly reduced. Based on these results, site decision makers have refocused their test program to examine in situ vitrification and have continued the use of the CERCLA-based decision analysis methodology to analyze remedial alternatives. PMID:11332543

  3. An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.

    SciTech Connect

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

    This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
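
    A toy version of the idea, probabilistic branching attached to interacting objects rather than a fixed event ordering, is sketched below in a runway-incursion flavor; the actors, probabilities, and outcome logic are invented and far simpler than the OBEST aviation model itself.

      # Object-based scenario sampling: each actor branches
      # probabilistically, and Monte Carlo over their interactions
      # tallies scenario likelihoods. All numbers are hypothetical.
      import random
      from collections import Counter

      random.seed(42)

      class Aircraft:
          def act(self):
              # Branch: does the crew correctly read back the clearance?
              return "correct_readback" if random.random() < 0.98 else "misheard"

      class Controller:
          def act(self, aircraft_state):
              if aircraft_state == "misheard" and random.random() < 0.7:
                  return "catches_error"
              return "no_intervention"

      outcomes = Counter()
      for _ in range(100_000):
          a = Aircraft().act()
          c = Controller().act(a)
          key = "runway_incursion" if (a == "misheard" and
                                       c == "no_intervention") else "normal_ops"
          outcomes[key] += 1

      p = outcomes["runway_incursion"] / sum(outcomes.values())
      print(f"P(incursion) ~ {p:.4f}")   # ~0.02 * 0.3 = 0.006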

  4. Applying behavior analysis to clinical problems: review and analysis of habit reversal.

    PubMed Central

    Miltenberger, R G; Fuqua, R W; Woods, D W

    1998-01-01

    This article provides a review and analysis of habit reversal, a multicomponent procedure developed by Azrin and Nunn (1973, 1974) for the treatment of nervous habits, tics, and stuttering. The article starts with a discussion of the behaviors treated with habit reversal, behavioral covariation among habits, and functional analysis and assessment of habits. Research on habit reversal and simplified versions of the procedure is then described. Next the article discusses the limitations of habit reversal and the evidence for its generality. The article concludes with an analysis of the behavioral processes involved in habit reversal and suggestions for future research. PMID:9757583

  5. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis

    PubMed Central

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can make variant analysis better. There are existing tools like HotNet2 and dmGWAS that can provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image - ami-4510312f . PMID:27158457

  6. A Study in the Founding of Applied Behavior Analysis Through Its Publications

    PubMed Central

    Morris, Edward K.; Altus, Deborah E.; Smith, Nathaniel G.

    2013-01-01

    This article reports a study of the founding of applied behavior analysis through its publications. Our methods included hand searches of sources (e.g., journals, reference lists), search terms (i.e., early, applied, behavioral, research, literature), inclusion criteria (e.g., the field's applied dimension), and (d) challenges to their face and content validity. Our results were 36 articles published between 1959 and 1967 that we organized into 4 groups: 12 in 3 programs of research and 24 others. Our discussion addresses (a) limitations in our method (e.g., the completeness of our search), (b) challenges to the validity of our methods and results (e.g., convergent validity), and (c) priority claims about the field's founding. We conclude that the claims are irresolvable because identification of the founding publications depends significantly on methods and because the field's founding was an evolutionary process. We close with suggestions for future research. PMID:25729133

  7. Clean Air Act Title III accidental emission release risk management program, and how it applies to landfills

    SciTech Connect

    Hibbard, C.S.

    1999-07-01

    On June 20, 1996, EPA promulgated regulations pursuant to Title III of the Clean Air Act (CAA) Amendments of 1990 (Section 112(r)(7) of the CAA). The rule, contained in 40 CFR Part 68, is called Accidental Release Prevention Requirements: Risk Management Programs, and is intended to improve accident prevention and emergency response practices at facilities that store and/or use hazardous substances. Methane is a designated highly hazardous chemical (HHC) under the rule. The rule applies to facilities that have 10,000 pounds of methane or more in any process, equivalent to about 244,000 cubic feet of methane. The US EPA has interpreted this threshold quantity as applying to landfill gas within landfills. This paper presents an overview of the Accidental Release Prevention regulations and how landfills are affected by the requirements, and describes methodologies for calculating the threshold quantity of landfill gas in a landfill. Methane is present in landfill gas as a mixture. Because landfill gas can burn readily, down to concentrations of about five percent methane, the entire landfill gas mixture must be treated as the regulated substance and counts toward the 10,000-pound threshold. It is reasonable to assume that the entire landfill gas collection system, active or passive, is filled with landfill gas, so a calculation of the volume of the system gives the amount of landfill gas present in the process on the site. However, the US EPA has indicated that there are some instances in which pore space gas should be included in this calculation. This paper presents methods available to calculate the amount of pore space gas in a landfill and to determine how much of that gas might be available for an explosion. The paper also explains how to conduct the release assessment to determine the worst-case hazard zone around the landfill.
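
    The threshold conversion quoted above is easy to reproduce with the ideal gas law; the sketch below recovers the ~244,000 ft³ figure for pure methane (at 25 °C, 1 atm) and then, since the entire mixture counts toward the 10,000-lb threshold, computes the corresponding volume for a hypothetical 50/50 methane/CO2 landfill gas.

      # Ideal gas check of the rule's threshold volumes. Constants are
      # standard values; the 50/50 composition is a hypothetical example.
      R = 0.7302                        # atm*ft^3 / (lb-mol * degR)
      T = 25 * 9.0 / 5.0 + 32 + 459.67  # 25 degC in degrees Rankine
      M_CH4, M_CO2 = 16.04, 44.01       # lb / lb-mol

      # Volume of 10,000 lb of pure methane at 25 degC, 1 atm.
      v_ch4 = (10_000 / M_CH4) * R * T
      print(f"10,000 lb CH4 ~ {v_ch4:,.0f} ft^3")       # ~244,000 ft^3

      # For landfill gas the whole mixture counts toward the threshold;
      # assume a 50/50 CH4/CO2 mixture by mole for illustration.
      M_mix = 0.5 * M_CH4 + 0.5 * M_CO2
      v_mix = (10_000 / M_mix) * R * T
      print(f"10,000 lb of 50/50 landfill gas ~ {v_mix:,.0f} ft^3")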

  8. Application of a risk analysis method to different technologies for producing a monoclonal antibody employed in hepatitis B vaccine manufacturing.

    PubMed

    Milá, Lorely; Valdés, Rodolfo; Tamayo, Andrés; Padilla, Sigifredo; Ferro, Williams

    2012-03-01

    The CB.Hep-1 monoclonal antibody (mAb) is used in the manufacture of a recombinant hepatitis B vaccine that is included in a worldwide vaccination program against hepatitis B disease. This mAb serves as an immunoligand in one of the most efficient steps of the active pharmaceutical ingredient purification process. Quality Risk Management (QRM) provides an excellent framework for the use of risk management in pharmaceutical manufacturing and quality decision-making applications. Consequently, this study applied a prospective risk analysis methodology, Failure Mode and Effects Analysis (FMEA), as a QRM tool for analyzing different CB.Hep-1 mAb manufacturing technologies. FMEA was successfully used to assess risks associated with potential problems in CB.Hep-1 mAb manufacturing processes. The severity and occurrence analysis showed that very high severity risks accounted for 31.0-38.7% of all risks, and that the large majority of risks had a very low occurrence level (61.9-83.3%) in all assessed technologies. Finally, the additive Risk Priority Numbers, in descending order, were: transgenic plants (2636), ascites (2577), transgenic animals (2046) and hollow fiber bioreactors (1654), corroborating that in vitro technology should be the technology of choice for CB.Hep-1 mAb manufacturing in terms of risks and mAb molecule quality. PMID:22285820

  9. Analysis of automated highway system risks and uncertainties. Volume 5

    SciTech Connect

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.
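
    The protocol described, eliciting percentile judgments and turning them into distributions, can be sketched as follows; the factors, percentiles, and the toy cost/benefit index below are hypothetical, and the piecewise-linear inverse CDF is just one simple way to honor assessed percentiles.

      # Propagate expert 10/50/90 percentile judgments through a toy
      # cost/benefit index by Monte Carlo. All numbers are invented.
      import numpy as np

      rng = np.random.default_rng(1)

      def sample_from_percentiles(p10, p50, p90, n):
          # Piecewise-linear inverse CDF through the assessed percentiles,
          # extended to nominal 1%/99% endpoints.
          probs  = [0.01, 0.10, 0.50, 0.90, 0.99]
          values = [p10 - 0.5 * (p50 - p10), p10, p50, p90,
                    p90 + 0.5 * (p90 - p50)]
          return np.interp(rng.uniform(0.01, 0.99, n), probs, values)

      n = 100_000
      vehicle_cost  = sample_from_percentiles(2000, 3500, 6000, n)  # $/vehicle
      lane_capacity = sample_from_percentiles(4000, 6000, 7500, n)  # veh/h
      penetration   = sample_from_percentiles(0.05, 0.15, 0.35, n)  # fraction

      # Toy index: benefit scales with capacity * penetration, cost with
      # per-vehicle hardware.
      index = lane_capacity * penetration / vehicle_cost
      print("median index:", np.median(index))
      print("10-90% range:", np.percentile(index, [10, 90]))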

  10. [Clustering analysis applied to near-infrared spectroscopy analysis of Chinese traditional medicine].

    PubMed

    Liu, Mu-qing; Zhou, De-cheng; Xu, Xin-yuan; Sun, Yao-jie; Zhou, Xiao-li; Han, Lei

    2007-10-01

    This article discusses clustering analysis applied to near-infrared (NIR) spectroscopy of traditional Chinese medicines, providing a new method for their classification. The samples selected for the study, whose absorption spectra were measured in seconds by a multi-channel NIR spectrometer developed in the authors' laboratory, were safrole, eucalypt oil, laurel oil, turpentine, clove oil and three samples of costmary oil from different suppliers. The spectra in the range of 0.70-1.7 μm were measured with air as background, and the results indicated that they are quite distinct. A qualitative mathematical model was set up, and cluster analysis based on the spectra was carried out with different clustering methods for optimization, yielding a cluster correlation coefficient of 0.9742. This indicates that cluster analysis of this group of samples is practicable. The calculated classification of the 8 samples accorded well with their characteristics; in particular, the three samples of costmary oil fell into the closest grouping of the clustering analysis. PMID:18306778
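
    A minimal sketch of the general procedure, clustering spectra by similarity, is given below using synthetic Gaussian "spectra" and average-linkage clustering on correlation distances; the sample shapes and parameters are invented, not the measured oils.

      # Hierarchical clustering of synthetic spectra: three near-identical
      # samples (like the three costmary oils) should share a cluster.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(5)
      wl = np.linspace(0.7, 1.7, 200)          # wavelength, micrometers

      def spectrum(center, width):
          return np.exp(-((wl - center) / width) ** 2)

      samples = np.array(
          [spectrum(1.2, 0.1) + 0.01 * rng.standard_normal(wl.size)
           for _ in range(3)]
          + [spectrum(c, w) for c, w in [(0.9, 0.08), (1.0, 0.12),
                                         (1.4, 0.1), (1.5, 0.07), (1.1, 0.2)]])

      # Correlation distance, then agglomerative (average-linkage) clustering.
      dist = pdist(samples, metric="correlation")
      Z = linkage(dist, method="average")
      labels = fcluster(Z, t=4, criterion="maxclust")
      print(labels)   # the first three samples share one label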

  11. A multicriteria decision analysis model and risk assessment framework for carbon capture and storage.

    PubMed

    Humphries Choptiany, John Michael; Pelot, Ronald

    2014-09-01

    Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions. PMID:24772997
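
    The flavor of the model, utility curves plus Monte Carlo over uncertain criterion scores, can be sketched briefly; the criteria, weights, utility form, and score distributions below are hypothetical, not the article's elicited values.

      # Weighted-utility MCDA under uncertainty: sample criterion scores,
      # map through a concave utility, and compare alternatives by
      # expected utility. All inputs are invented.
      import numpy as np

      rng = np.random.default_rng(7)
      weights = {"cost": 0.4, "environment": 0.35, "social": 0.25}

      def utility(x):
          # Concave (risk-averse) utility on scores normalized to [0, 1].
          return 1.0 - np.exp(-3.0 * np.clip(x, 0.0, 1.0))

      # Two hypothetical storage sites: (mean, std-dev) per criterion.
      sites = {
          "site_A": {"cost": (0.7, 0.10), "environment": (0.5, 0.15),
                     "social": (0.6, 0.10)},
          "site_B": {"cost": (0.5, 0.05), "environment": (0.7, 0.10),
                     "social": (0.7, 0.15)},
      }

      n = 50_000
      for name, crits in sites.items():
          total = np.zeros(n)
          for crit, (mu, sd) in crits.items():
              total += weights[crit] * utility(rng.normal(mu, sd, n))
          print(f"{name}: E[U] = {total.mean():.3f}, P5-P95 = "
                f"{np.percentile(total, [5, 95]).round(3)}")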

  12. Association between cadmium exposure and renal cancer risk: a meta-analysis of observational studies

    PubMed Central

    Song, Ju kun; Luo, Hong; Yin, Xin hai; Huang, Guang lei; Luo, Si yang; Lin, Du ren; Yuan, Dong Bo; Zhang, Wei; Zhu, Jian guo

    2015-01-01

    Cadmium (Cd) is a widespread environmental pollutant and has been a recognized carcinogen for several decades. Many observational studies have reported that Cd exposure may be one cause of renal cancer, but these findings are inconsistent. We conducted a meta-analysis to evaluate the relationship between cadmium exposure and renal cancer risk. A comprehensive PubMed and Embase search was conducted to retrieve observational studies meeting our meta-analysis criteria. A combined odds ratio (OR) and corresponding 95% confidence interval (CI) were applied to assess the association between Cd exposure and renal cancer risk. The meta-analysis showed that high Cd exposure significantly increased renal cancer risk 1.47-fold (OR = 1.47; 95% CI = 1.27 to 1.71, for the highest versus lowest cadmium exposure category). The significant association remained consistent when stratified by geographic region and gender; however, mixed results were produced when stratified by sample size, study design, NOS score, adjustment for covariates, effect measure, and exposure type. Our results indicated that high Cd exposure was associated with increased renal cancer risk and that the association was stronger for occupational than for non-occupational exposure. This meta-analysis suggests that high Cd exposure may be a risk factor for renal cancer in occupational populations. PMID:26656678
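
    The pooled-OR machinery behind a result like 1.47 (1.27-1.71) is standard; below is a minimal DerSimonian-Laird random-effects sketch over made-up study ORs and CIs, not the meta-analysis's actual data.

      # DerSimonian-Laird random-effects pooling of log odds ratios.
      # The per-study ORs/CIs below are invented for illustration.
      import numpy as np

      or_ci = [(1.6, 1.1, 2.3), (1.3, 0.9, 1.9),
               (1.7, 1.2, 2.4), (1.2, 0.8, 1.8)]
      y = np.log([o for o, lo, hi in or_ci])                      # log OR
      se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96)
                     for _, lo, hi in or_ci])
      w = 1.0 / se**2                                             # FE weights

      # Between-study variance tau^2 (DerSimonian-Laird).
      y_fe = np.sum(w * y) / np.sum(w)
      Q = np.sum(w * (y - y_fe) ** 2)
      df = len(y) - 1
      tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

      # Random-effects pooled estimate and 95% CI.
      w_re = 1.0 / (se**2 + tau2)
      y_re = np.sum(w_re * y) / np.sum(w_re)
      se_re = np.sqrt(1.0 / np.sum(w_re))
      print(f"pooled OR = {np.exp(y_re):.2f} (95% CI "
            f"{np.exp(y_re - 1.96 * se_re):.2f}-"
            f"{np.exp(y_re + 1.96 * se_re):.2f})")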

  13. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss-of-crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the risk that debris from an explosion of the launch vehicle would strike the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris, and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.
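
    A compact way to see the response-surface step: fit a quadratic surface to a modest number of (abort time, destruct delay) model runs and use it as a cheap surrogate for sensitivity studies; the "model" below is an invented stand-in for the actual debris simulation.

      # Quadratic response surface fit by least squares to samples of a
      # hypothetical strike-probability model. All numbers are made up.
      import numpy as np

      rng = np.random.default_rng(3)

      def debris_model(abort_t, delay_t):
          # Stand-in for the expensive debris/trajectory simulation.
          return (0.02 + 0.004 * abort_t - 0.006 * delay_t
                  + 0.0008 * abort_t * delay_t)

      abort = rng.uniform(0, 120, 60)    # s after liftoff
      delay = rng.uniform(0, 5, 60)      # s between abort and destruct
      p = debris_model(abort, delay) + rng.normal(0, 5e-4, 60)

      X = np.column_stack([np.ones_like(abort), abort, delay,
                           abort**2, delay**2, abort * delay])
      coef, *_ = np.linalg.lstsq(X, p, rcond=None)

      def surrogate(a, d):
          # Cheap evaluation for sensitivity sweeps.
          return np.dot([1.0, a, d, a**2, d**2, a * d], coef)

      print(f"p_strike(60 s, 2 s) ~ {surrogate(60.0, 2.0):.4f}")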

  14. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
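
    The optimization at the heart of the study can be sketched in a few lines: choose height and crown width to minimize annualized construction cost plus expected annual damage, with overtopping driven by a water-level distribution and through-seepage by a width-dependent fragility term; all functional forms and numbers below are hypothetical.

      # Risk-based levee sizing: minimize annualized construction cost
      # plus expected annual damage. Hydrology, fragility, and cost
      # terms are invented for illustration.
      import numpy as np
      from scipy.optimize import minimize

      DAMAGE = 5e7   # $ damage if the levee fails

      def annual_total_cost(x):
          H, W = x
          # Annual max water level ~ Gumbel(3.0, 0.8); P(overtop) = P(level > H).
          p_overtop = 1.0 - np.exp(-np.exp(-(H - 3.0) / 0.8))
          # Through-seepage fragility falls off with crown width.
          p_seep = 0.05 * np.exp(-0.4 * W)
          p_fail = p_overtop + (1 - p_overtop) * p_seep
          construction = 2e5 * H ** 1.5 + 6e4 * W      # annualized $
          return construction + p_fail * DAMAGE

      res = minimize(annual_total_cost, x0=[4.0, 5.0],
                     bounds=[(1.0, 10.0), (1.0, 20.0)])
      H_opt, W_opt = res.x
      print(f"optimal height ~ {H_opt:.1f} m, crown width ~ {W_opt:.1f} m")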

  15. Lycopene and Risk of Prostate Cancer: A Systematic Review and Meta-Analysis.

    PubMed

    Chen, Ping; Zhang, Wenhao; Wang, Xiao; Zhao, Keke; Negi, Devendra Singh; Zhuo, Li; Qi, Mao; Wang, Xinghuan; Zhang, Xinhua

    2015-08-01

    Prostate cancer (PCa) is a common illness for aging males. Lycopene has been identified as an antioxidant agent with potential anticancer properties. Studies investigating the relation between lycopene and PCa risk have produced inconsistent results. This study aims to determine dietary lycopene consumption/circulating concentration and any potential dose-response associations with the risk of PCa. Eligible studies published in English up to April 10, 2014, were searched and identified from the PubMed, ScienceDirect Online and Wiley Online Library databases and by hand searching. STATA (version 12.0) was used for the dose-response meta-analysis. Random effects models were used to calculate pooled relative risks (RRs) and 95% confidence intervals (CIs) and to incorporate variation between studies. The linear and nonlinear dose-response relations were evaluated with data from categories of lycopene consumption/circulating concentrations. Twenty-six studies were included, with 17,517 cases of PCa reported among 563,299 participants. Although an inverse association between lycopene consumption and PCa risk was not found in all studies, there was a trend toward reduced incidence of PCa with higher lycopene intake (P = 0.078). Removal of one Chinese study in sensitivity analysis, or recalculation using data from only high-quality studies for subgroup analysis, indicated that higher lycopene consumption significantly lowered PCa risk. Furthermore, our dose-response meta-analysis demonstrated that higher lycopene consumption was linearly associated with a reduced risk of PCa, with a threshold between 9 and 21 mg/day. Consistently, higher circulating lycopene levels significantly reduced the risk of PCa. Interestingly, the circulating lycopene concentration between 2.17 and 85 μg/dL was linearly inversely associated with PCa risk, whereas there was no linear association >85 μg/dL. In addition, greater efficacy for the circulating lycopene concentration on
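
    A bare-bones version of the linear dose-response step is sketched below: an inverse-variance weighted, no-intercept regression of study log-RRs on dose relative to the reference category (the full Greenland-Longnecker method additionally handles within-study correlation, which this omits); the doses and RRs are invented.

      # Simplified linear dose-response pooling: slope of log RR per
      # mg/day relative to a reference category. Data are invented.
      import numpy as np

      dose_ref = 5.0   # reference category midpoint (mg/day)
      # (dose mg/day, RR, 95% CI low, high) versus the reference category
      rows = [(10, 0.95, 0.85, 1.06), (15, 0.90, 0.80, 1.01),
              (21, 0.84, 0.72, 0.98)]
      x = np.array([r[0] for r in rows]) - dose_ref
      y = np.log([r[1] for r in rows])
      se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96)
                     for _, _, lo, hi in rows])
      w = 1.0 / se**2

      # Weighted no-intercept slope: change in log RR per mg/day.
      beta = np.sum(w * x * y) / np.sum(w * x**2)
      se_beta = np.sqrt(1.0 / np.sum(w * x**2))
      print(f"RR per 10 mg/day = {np.exp(10 * beta):.2f} "
            f"(95% CI {np.exp(10 * (beta - 1.96 * se_beta)):.2f}-"
            f"{np.exp(10 * (beta + 1.96 * se_beta)):.2f})")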

  16. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support

    PubMed Central

    Anderson, Cynthia M.; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  17. Ongoing Analysis of Rocket Based Combined Cycle Engines by the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Ruf, Joseph; Holt, James B.; Canabal, Francisco

    1999-01-01

    This paper presents the status of analyses on three Rocket Based Combined Cycle configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes code for ejector mode fluid dynamics. The Draco engine analysis is a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.

  18. RADTRAN 5: A computer code for transportation risk analysis

    SciTech Connect

    Neuhauser, K. S.; Kanipe, F. L.

    1991-01-01

    RADTRAN 5 is a computer code developed at Sandia National Laboratories (SNL) in Albuquerque, NM, to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI Standard FORTRAN 77 and contains significant advances in the methodology for route-specific analysis first developed by SNL for RADTRAN 4 (Neuhauser and Kanipe, 1992). Like the previous RADTRAN codes, RADTRAN 5 contains two major modules, for incident-free and accident risk analysis, respectively. All commercially important transportation modes may be analyzed with RADTRAN 5: highway by combination truck; highway by light-duty vehicle; rail; barge; ocean-going ship; cargo air; and passenger air.

  19. Adversarial Risk Analysis for Urban Security Resource Allocation.

    PubMed

    Gil, César; Rios Insua, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis (ARA) provides a framework to deal with risks originating from intentional actions of adversaries. We show how ARA may be used to allocate security resources in the protection of urban spaces. We take into account the spatial structure and consider both proactive and reactive measures, aiming both to reduce criminality and to recover from it as well as possible, should it occur. We deal with the problem by deploying an ARA model over each spatial unit, coordinating the models through resource constraints, value aggregation, and proximity. We illustrate our approach with an example that uncovers several relevant policy issues. PMID:26927388
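
    A loose, stylized sketch of the defend-attack logic underlying ARA: the defender allocates a guard budget across spatial units to minimize expected loss against an attacker who targets the unit with the highest perceived payoff, with Monte Carlo over the defender's uncertainty about that perception; units, payoffs, and probabilities are all invented.

      # Toy defend-attack allocation: enumerate guard allocations and
      # evaluate expected loss against an uncertain attacker model.
      import numpy as np
      from itertools import product

      rng = np.random.default_rng(9)
      values = np.array([10.0, 6.0, 3.0])     # loss if crime succeeds, per unit
      budget = 4                              # guards to allocate

      def p_success(guards):
          return 0.6 * np.exp(-0.8 * guards)  # crime success prob vs guards

      best = None
      for alloc in product(range(budget + 1), repeat=3):
          if sum(alloc) != budget:
              continue
          losses = []
          for _ in range(2000):
              # Attacker's uncertain valuation of each unit.
              attacker_view = values * rng.lognormal(0.0, 0.3, 3)
              target = int(np.argmax(attacker_view * p_success(np.array(alloc))))
              losses.append(values[target] * p_success(alloc[target]))
          exp_loss = float(np.mean(losses))
          if best is None or exp_loss < best[0]:
              best = (exp_loss, alloc)

      print(f"best allocation {best[1]} with expected loss {best[0]:.2f}")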