Science.gov

Sample records for applying risk analysis

  1. A Hygrothermal Risk Analysis Applied to Residential Unvented Attics

    SciTech Connect

    Pallin, Simon B; Kehrer, Manfred

    2013-01-01

    A residential building constructed with an unvented attic is a common roof assembly in the United States. The expected hygrothermal performance and service life of the roof are difficult to estimate due to a number of varying parameters. Typical parameters expected to vary are the climate, direction, and slope of the roof, as well as the radiation properties of the surface material. Further influential parameters are indoor moisture excess, air leakage through the attic floor, and leakage from the air-handling unit and ventilation ducts. In addition, the type of building materials, such as the insulation material and closed- or open-cell spray polyurethane foam, will influence the future performance of the roof. Developing a simulation model of the roof assembly enables a risk and sensitivity analysis, in which the varying parameters most important to the hygrothermal performance can be determined. The model is designed to perform probabilistic simulations using mathematical and hygrothermal calculation tools. The varying input parameters can be chosen from existing measurements, simulations, or standards. An analysis is applied to determine the risk of consequences such as mold growth, rot, or increased energy demand of the HVAC unit. Furthermore, the future performance of the roof can be simulated in different climates to facilitate the design of an efficient and reliable roof construction with the most suitable technical solution and to determine the most appropriate building materials for a given climate.
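    The probabilistic approach this abstract describes can be sketched as a Monte Carlo loop: sample the varying inputs, run a hygrothermal model, and count how often a damage criterion is exceeded. The surrogate model, coefficients, and parameter ranges below are invented for illustration and are not taken from the paper:

```python
import random

def simulate_attic(sample):
    # Toy surrogate model: indoor moisture excess and attic-floor air
    # leakage raise attic relative humidity; a more absorptive (darker)
    # roof surface warms and dries the assembly.
    return (55.0
            + 8.0 * sample["moisture_excess"]
            + 12.0 * sample["air_leakage"]
            - 10.0 * sample["solar_absorptance"])

def mold_risk(n_runs=10_000, rh_threshold=80.0, seed=1):
    """Fraction of sampled parameter sets whose attic RH exceeds the
    threshold associated with mold growth."""
    random.seed(seed)
    exceedances = 0
    for _ in range(n_runs):
        sample = {
            "moisture_excess": random.uniform(0.0, 4.0),    # g/m^3 indoor excess
            "air_leakage": random.uniform(0.0, 1.0),        # normalised leak rate
            "solar_absorptance": random.uniform(0.4, 0.9),  # surface property
        }
        if simulate_attic(sample) > rh_threshold:
            exceedances += 1
    return exceedances / n_runs

print(f"P(RH > 80%) = {mold_risk():.3f}")
```

    The same loop structure extends to any consequence metric (rot index, HVAC energy) by swapping the damage criterion.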

  2. Downside Risk analysis applied to the Hedge Funds universe

    NASA Astrophysics Data System (ADS)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires appropriate risk metrics. The classic CAPM theory and its Sharpe Ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the simplicity of the CAPM, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures on the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
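    Two representative Downside Risk indicators are the downside deviation and the Sortino-style ratio built on it, both of which penalise only returns below the investor's goal. A minimal sketch with hypothetical monthly returns (not the Credit Suisse/Tremont data):

```python
import math

def downside_deviation(returns, target=0.0):
    """Root-mean-square shortfall below the investor's target return."""
    shortfalls = [min(r - target, 0.0) ** 2 for r in returns]
    return math.sqrt(sum(shortfalls) / len(returns))

def sortino_ratio(returns, target=0.0):
    """Sharpe-like ratio that penalises only below-target ('bad') returns."""
    mean_excess = sum(returns) / len(returns) - target
    dd = downside_deviation(returns, target)
    return mean_excess / dd if dd > 0 else float("inf")

# Hypothetical monthly returns of a fund.
monthly = [0.021, -0.013, 0.008, 0.034, -0.027, 0.015, 0.002, -0.005]
print(downside_deviation(monthly))  # dispersion of the losses only
print(sortino_ratio(monthly))
```

    Unlike the Sharpe Ratio, gains above the target do not inflate the risk term, which is the distinction the abstract emphasises.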

  3. [Risk Analysis applied to food safety in Brazil: prospects and challenges].

    PubMed

    Figueiredo, Ana Virgínia de Almeida; Miranda, Maria Spínola

    2011-04-01

    The scope of this case study is to discuss the ideas of the Brazilian Codex Alimentarius Committee (CCAB), coordinated by the National Institute of Metrology, Standardization and Industrial Quality (Inmetro), with respect to the Codex Alimentarius norm on Risk Analysis (RA) applied to Food Safety. The objectives of this investigation were to identify and analyze the opinions of CCAB members on RA and to register their proposals for the application of this norm in Brazil, highlighting the local limitations and potential detected. CCAB members were found to be in favor of the Codex Alimentarius initiative of instituting an RA norm to promote the health safety of foods that circulate on the international market. There was a consensus that the Brazilian government should incorporate RA as official policy to improve the country's system of food control and leverage Brazilian food exports. They acknowledge that Brazil has the technical-scientific capacity to apply this norm, though they stressed several political and institutional limitations. The members consider RA to be a valid initiative for tackling risks in food, due to its ability to improve the food safety control measures adopted by the government.

  4. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  5. Risk analysis and Monte Carlo simulation applied to the generation of drilling AFE estimates

    SciTech Connect

    Peterson, S.K.; Murtha, J.A.; Schneider, F.F.

    1995-06-01

    This paper presents a method for developing an authorization-for-expenditure (AFE)-generating model and illustrates the technique with a specific offshore field development case study. The model combines Monte Carlo simulation and statistical analysis of historical drilling data to generate more accurate, risked AFE estimates. In addition to the general method, two examples of making AFE time estimates for North Sea wells with the presented techniques are given.
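    The core of such an AFE model can be sketched as a Monte Carlo sum of per-phase durations multiplied by a day rate, with percentiles of the resulting cost distribution reported as the risked estimate. The phase list, triangular distributions, and day rate below are assumed for illustration, not the paper's fitted data:

```python
import random

# Hypothetical per-phase drilling duration distributions (days), as
# triangular(low, mode, high) stand-ins for fits to historical data.
PHASES = {
    "mobilisation": (2, 3, 6),
    "top_hole":     (4, 6, 12),
    "intermediate": (6, 9, 20),
    "reservoir":    (5, 8, 18),
    "completion":   (3, 5, 10),
}
DAY_RATE = 250_000  # USD/day, assumed spread rate

def simulate_afe(n_trials=20_000, seed=7):
    """Monte Carlo AFE: sample each phase, sum durations, convert to
    cost, and report P10/P50/P90 of the simulated distribution."""
    random.seed(seed)
    totals = []
    for _ in range(n_trials):
        days = sum(random.triangular(lo, hi, mode)
                   for lo, mode, hi in PHASES.values())
        totals.append(days * DAY_RATE)
    totals.sort()
    pct = lambda p: totals[int(p * n_trials)]
    return {"P10": pct(0.10), "P50": pct(0.50), "P90": pct(0.90)}

print(simulate_afe())
```

    Reporting a P10-P90 range rather than a single number is what makes the estimate "risked".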

  6. Applying geologic sensitivity analysis to environmental risk management: The financial implications

    SciTech Connect

    Rogers, D.T.

    1999-07-01

    The financial risks associated with environmental contamination can be staggering and are often difficult to identify and accurately assess. Geologic sensitivity analysis is gaining recognition as a significant and useful tool that can empower the user with crucial information concerning environmental risk management and brownfield redevelopment. It is particularly useful when (1) evaluating the potential risks associated with redevelopment of historical industrial facilities (brownfields) and (2) planning for future development, especially in areas of rapid development, because the number of potential contaminating sources often increases with economic development. An examination of the financial implications of geologic sensitivity analysis in southeastern Michigan, drawn from numerous case studies, indicates that the environmental cost of contamination may be 100 to 1,000 times greater at a geologically sensitive location compared to the least sensitive location. Geologic sensitivity analysis has demonstrated that near-surface geology may influence the environmental impact of a contaminated site to a greater extent than the amount and type of industrial development.

  7. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    SciTech Connect

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
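    The original Beta Factor method that the paper's model extends can be illustrated with a minimal sketch for a one-out-of-three redundant system: a component's unavailability is split into an independent part and a common-cause part that defeats all trains at once. The unavailability and beta values below are assumed for illustration; the paper's improved model is not reproduced here:

```python
def one_out_of_three_unavailability(q, beta):
    """Beta Factor model: per-train unavailability q splits into an
    independent part (1 - beta) * q and a common-cause part beta * q
    that fails all three redundant trains simultaneously."""
    q_independent = (1.0 - beta) * q
    q_common_cause = beta * q
    # A 1-out-of-3 system succeeds if any train works, so it fails only
    # when all three fail independently, or when the common cause strikes.
    return q_independent ** 3 + q_common_cause

q = 1e-2  # assumed per-train unavailability
for beta in (0.0, 0.05, 0.10):
    print(beta, one_out_of_three_unavailability(q, beta))
```

    Even a small beta dominates the result, since the common-cause term is linear in q while the independent term is cubic; this is why the Beta Factor method tends to be conservative for highly redundant systems, motivating refinements like the one in the paper.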

  8. A review of dendrogeomorphological research applied to flood risk analysis in Spain

    NASA Astrophysics Data System (ADS)

    Díez-Herrero, A.; Ballesteros, J. A.; Ruiz-Villanueva, V.; Bodoque, J. M.

    2013-08-01

    Over the last forty years, applying dendrogeomorphology to palaeoflood analysis has improved estimates of the frequency and magnitude of past floods worldwide. This paper reviews the main results obtained by applying dendrogeomorphology to flood research in several case studies in Central Spain. These dendrogeomorphological studies focused on the following topics: (1) anatomical analysis to understand the physiological response of trees to flood damage and improve sampling efficiency; (2) compiling robust flood chronologies in ungauged mountain streams; (3) determining flow depth and estimating flood discharge using two-dimensional hydraulic modelling, and comparing them with other palaeostage indicators; (4) calibrating hydraulic model parameters (i.e. Manning roughness); and (5) implementing stochastic-based, cost-benefit analysis to select optimal mitigation measures. The progress made in these areas is presented with suggestions for further research to improve the applicability of dendrogeochronology to palaeoflood studies. Further developments will include new methods for better identification of the causes of specific types of flood damage to trees (e.g. tilted trees) or stable isotope analysis of tree rings to identify the climatic conditions associated with periods of increasing flood magnitude or frequency.

  9. Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  10. Analysis of agreement between cardiac risk stratification protocols applied to participants of a center for cardiac rehabilitation

    PubMed Central

    Santos, Ana A. S.; Silva, Anne K. F.; Vanderlei, Franciele M.; Christofaro, Diego G. D.; Gonçalves, Aline F. L.; Vanderlei, Luiz C. M.

    2016-01-01

    Background: Cardiac risk stratification is related to the risk of the occurrence of events induced by exercise. Despite the existence of several protocols to calculate risk stratification, studies indicating that there is agreement between these protocols are still lacking. Objective: To evaluate the agreement between existing protocols on cardiac risk rating in cardiac patients. Method: The records of 50 patients from a cardiac rehabilitation program were analyzed, from which the following information was extracted: age, sex, weight, height, clinical diagnosis, medical history, risk factors, associated diseases, and the results from the most recent laboratory and complementary tests performed. This information was used for risk stratification of the patients according to the protocols of the American College of Sports Medicine, the Brazilian Society of Cardiology, the American Heart Association, the protocol designed by Frederic J. Pashkow, the American Association of Cardiovascular and Pulmonary Rehabilitation, the Société Française de Cardiologie, and the Sociedad Española de Cardiología. Descriptive statistics were used to characterize the sample, and the agreement between the protocols was calculated using the Kappa coefficient. Differences were considered at a significance level of 5%. Results: Of the 21 analyses of agreement, 12 were considered significant between the protocols used for risk classification, with nine classified as moderate and three as low. No agreements were classified as excellent. Different proportions were observed in each risk category, with significant differences between the protocols for all risk categories. Conclusion: The agreements between the protocols were considered low and moderate and the risk proportions differed between protocols. PMID:27556385
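    The Kappa coefficient used in this study measures agreement between two protocols' classifications, corrected for the agreement expected by chance. A minimal sketch of Cohen's kappa with hypothetical patient classifications:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters (protocols)."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    # Chance agreement: product of the marginal category frequencies.
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1.0 - expected)

# Hypothetical risk classifications of 10 patients by two protocols.
protocol_1 = ["low", "low", "mod", "high", "mod", "low", "high", "mod", "low", "mod"]
protocol_2 = ["low", "mod", "mod", "high", "mod", "low", "high", "high", "low", "mod"]
print(round(cohens_kappa(protocol_1, protocol_2), 3))  # → 0.697, "moderate"
```

    Values near 0 mean agreement no better than chance; common rules of thumb label roughly 0.4-0.6 as moderate, which matches the range reported in the study.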

  11. Municipal solid waste management health risk assessment from air emissions for China by applying life cycle analysis.

    PubMed

    Li, Hua; Nitivattananon, Vilas; Li, Peng

    2015-05-01

    This study quantifies and objectively evaluates the environmental health risks of three waste treatment options suggested by the national municipal solid waste management enhancing strategy (No. [2011] 9 of the State Council, promulgated on 19 April 2011), namely sanitary landfill, waste-to-energy incineration, and composting combined with a material recovery facility, through a case study in Zhangqiu City, China. It addresses potential chronic health risks from air emissions to residential receptors in the impacted area. It combines field survey, analogue survey, design documents, and life cycle inventory methods in defining the source strength of chemicals of potential concern. The life cycle inventory and air dispersion are modelled with integrated waste management (IWM)-2 and the Screening Air Dispersion Model, Version 3.0 (SCREEN3). The health risk assessment follows the United States Environmental Protection Agency guidance Risk Assessment Guidance for Superfund (RAGS), Volume I: Human Health Evaluation Manual (Part F, Supplemental Guidance for Inhalation Risk Assessment). The exposure concentration is based on long-term exposure to the maximum ground-level contaminant concentration in air under 'reasonable worst situation' emissions and is compared directly with the reference concentration and unit risk factor/cancer slope factor derived from the national air quality standard (for conventional pollutants) and toxicological studies (for specific pollutants). Results from this study suggest that composting with material recovery facility treatment may pose fewer negative health impacts than the other options; the sensitivity analysis shows that the landfill collection rate in the integrated waste management system has a great influence on the impact results. Further investigation is needed to validate or challenge the findings of this study.
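    The screening comparison described above (a long-term exposure concentration checked against a reference concentration for non-cancer effects, and multiplied by a unit risk factor for carcinogens) can be sketched as follows; all numeric values are hypothetical:

```python
def hazard_quotient(exposure_conc, reference_conc):
    """Non-cancer screening: HQ > 1 flags potential concern."""
    return exposure_conc / reference_conc

def inhalation_cancer_risk(exposure_conc, unit_risk_factor):
    """Lifetime excess cancer risk = chronic air concentration x URF."""
    return exposure_conc * unit_risk_factor

# Hypothetical screening values for one pollutant at the maximally
# exposed residence (concentrations in ug/m^3, URF in (ug/m^3)^-1).
ec, rfc, urf = 0.8, 2.0, 2.6e-5
print(hazard_quotient(ec, rfc))         # below the HQ = 1 screening level
print(inhalation_cancer_risk(ec, urf))  # compare with the 1e-6..1e-4 range
```

    In RAGS Part F terminology these are the hazard quotient and inhalation cancer risk; the study applies the same comparison per pollutant and per treatment option.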

  13. Why income inequality indexes do not apply to health risks.

    PubMed

    Cox, Louis Anthony

    2012-02-01

    Several recent papers have sought to apply inequality measures from economics, such as the Atkinson Index (AI) for inequality of income distributions, to compare the risk inequality of different mortality risk distributions in an effort to help promote efficiency and environmental justice in pollution-reducing interventions. Closer analysis suggests that such applications are neither logically coherent nor necessarily ethically desirable. Risk inequality comparisons should be based on axioms that apply to probabilistic risks, and should consider the multidimensional and time-varying nature of individual and community risks in order to increase efficiency and justice over time and generations. In light of the limitations of the AI applied to mortality risk distributions, it has not been demonstrated to have ethical or practical value in helping policymakers to identify air pollution management interventions that reduce (or minimize) risk and risk inequity.
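    For reference, the Atkinson Index the paper critiques can be computed as follows; the risk values are hypothetical, and the example only shows the mechanics of the index, not an endorsement of applying it to mortality risks:

```python
import math

def atkinson_index(values, epsilon=0.5):
    """Atkinson inequality index: 0 for perfect equality, approaching 1
    as the distribution grows more unequal; epsilon sets inequality aversion."""
    n = len(values)
    mean = sum(values) / n
    if epsilon == 1.0:
        geometric_mean = math.exp(sum(math.log(v) for v in values) / n)
        return 1.0 - geometric_mean / mean
    # Equally-distributed-equivalent (EDE) value for epsilon != 1.
    ede = (sum(v ** (1.0 - epsilon) for v in values) / n) ** (1.0 / (1.0 - epsilon))
    return 1.0 - ede / mean

equal_risks = [1e-5] * 4                  # identical annual mortality risks
unequal_risks = [4e-5, 1e-6, 1e-6, 1e-6]  # risk concentrated on one person
print(atkinson_index(equal_risks))    # ~0: no inequality
print(atkinson_index(unequal_risks))  # clearly positive
```

    The paper's argument is that the axioms behind this income-based construction do not transfer to probabilistic, time-varying mortality risks, even though the arithmetic applies to any list of nonnegative numbers.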

  14. Exploring Students at Risk for Reading Comprehension Difficulties in South Korea: The RTI Approach Applying Latent Class Growth Analysis

    ERIC Educational Resources Information Center

    Kim, Dongil; Kim, Woori; Koh, Hyejung; Lee, Jaeho; Shin, Jaehyun; Kim, Heeju

    2014-01-01

    The purpose of this study was to identify students at risk of reading comprehension difficulties by using the responsiveness to intervention (RTI) approach. The participants were 177 students in Grades 1-3 in three elementary schools in South Korea. The students received Tier 1 instruction of RTI from March to May 2011, and their performance was…

  15. Risk Analysis

    NASA Technical Reports Server (NTRS)

    Morring, Frank, Jr.

    2004-01-01

    A National Academies panel says the Hubble Space Telescope is too valuable for gambling on a long-shot robotic mission to extend its service life, and urges a shuttle servicing mission instead. Directly contradicting Administrator Sean O'Keefe, who killed a planned fifth shuttle servicing mission to the telescope on grounds it was too dangerous for a human crew in the post-Challenger environment, the expert committee found that upgrades to shuttle safety should actually make it less hazardous to fly to the telescope than it was before Columbia was lost. Risks of a telescope-servicing mission are only marginally greater than those of the planned missions to the International Space Station (ISS) O'Keefe has authorized, the panel found. After comparing those risks to the dangers inherent in trying to develop a complex space robot in the 39 months remaining in the Hubble's estimated service life, the panel opted for the human mission to save "one of the major achievements of the American space program," in the words of Louis J. Lanzerotti, its chairman.

  16. The Relative Importance of the Vadose Zone in Multimedia Risk Assessment Modeling Applied at a National Scale: An Analysis of Benzene Using 3MRA

    NASA Astrophysics Data System (ADS)

    Babendreier, J. E.

    2002-05-01

    Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidable one, particularly in regulatory settings applied on a national scale. Quantitative assessment of uncertainty and sensitivity within integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a systematic, comparative approach coupled with sufficient computational power. The Multimedia, Multipathway, and Multireceptor Risk Assessment Model (3MRA) is an important code being developed by the United States Environmental Protection Agency for use in site-scale risk assessment (e.g. hazardous waste management facilities). The model currently entails over 700 variables, 185 of which are explicitly stochastic. The 3MRA can start with a chemical concentration in a waste management unit (WMU). It estimates the release and transport of the chemical throughout the environment, and predicts associated exposure and risk. The 3MRA simulates multimedia (air, water, soil, sediments) pollutant fate and transport, multipathway exposure routes (food ingestion, water ingestion, soil ingestion, air inhalation, etc.), multireceptor exposures (resident, gardener, farmer, fisher, ecological habitats and populations), and resulting risk (human cancer and non-cancer effects, ecological population and community effects). The 3MRA collates the output for an overall national risk assessment, offering a probabilistic strategy as a basis for regulatory decisions. To facilitate execution of 3MRA for purposes of conducting uncertainty and sensitivity analysis, a PC-based supercomputer cluster was constructed. The design of SuperMUSE, a 125 GHz Windows-based supercomputer for Model Uncertainty and Sensitivity Evaluation, is described.
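    A common way to carry out the kind of sensitivity analysis described above is to rank-correlate each stochastic input with the model output across Monte Carlo runs. The toy model and input names (leach_rate, soil_depth) below are invented for illustration, not taken from 3MRA:

```python
import math
import random

def rank(xs):
    """Position of each value in the sorted order (no tie handling)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0] * len(xs)
    for position, i in enumerate(order):
        ranks[i] = position
    return ranks

def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / math.sqrt(vx * vy)

# Toy model: risk depends strongly on one input, weakly on the other.
random.seed(3)
n = 2000
leach_rate = [random.random() for _ in range(n)]
soil_depth = [random.random() for _ in range(n)]
risk = [3.0 * a + 0.2 * b + random.gauss(0.0, 0.1)
        for a, b in zip(leach_rate, soil_depth)]

for name, xs in (("leach_rate", leach_rate), ("soil_depth", soil_depth)):
    print(name, round(spearman(xs, risk), 2))
```

    Ranking inputs by |correlation| identifies which of the 185 stochastic variables deserve refinement; clusters like SuperMUSE exist because each of the thousands of runs behind such statistics is itself an expensive multimedia simulation.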

  17. Applying risk adjusted cost-effectiveness (RAC-E) analysis to hospitals: estimating the costs and consequences of variation in clinical practice.

    PubMed

    Karnon, Jonathan; Caffrey, Orla; Pham, Clarabelle; Grieve, Richard; Ben-Tovim, David; Hakendorf, Paul; Crotty, Maria

    2013-06-01

    Cost-effectiveness analysis is well established for pharmaceuticals and medical technologies but not for evaluating variations in clinical practice. This paper describes a novel methodology--risk adjusted cost-effectiveness (RAC-E)--that facilitates the comparative evaluation of applied clinical practice processes. In this application, risk adjustment is undertaken with a multivariate matching algorithm that balances the baseline characteristics of patients attending different settings (e.g., hospitals). Linked, routinely collected data are used to analyse patient-level costs and outcomes over a 2-year period, as well as to extrapolate costs and survival over patient lifetimes. The study reports the relative cost-effectiveness of alternative forms of clinical practice, including a full representation of the statistical uncertainty around the mean estimates. The methodology is illustrated by a case study that evaluates the relative cost-effectiveness of services for patients presenting with acute chest pain across the four main public hospitals in South Australia. The evaluation finds that services provided at two hospitals were dominated, and of the remaining services, the more effective hospital gained life years at a low mean additional cost and had an 80% probability of being the most cost-effective hospital at realistic cost-effectiveness thresholds. Potential determinants of the estimated variation in costs and effects were identified, although more detailed analyses to identify specific areas of variation in clinical practice are required to inform improvements at the less cost-effective institutions.

  18. Simulation Assisted Risk Assessment Applied to Launch Vehicle Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Go, Susie; Gee, Ken; Lawrence, Scott

    2008-01-01

    A simulation-based risk assessment approach is presented and is applied to the analysis of abort during the ascent phase of a space exploration mission. The approach utilizes groupings of launch vehicle failures, referred to as failure bins, which are mapped to corresponding failure environments. Physical models are used to characterize the failure environments in terms of the risk due to blast overpressure, resulting debris field, and the thermal radiation due to a fireball. The resulting risk to the crew is dynamically modeled by combining the likelihood of each failure, the severity of the failure environments as a function of initiator and time of the failure, the robustness of the crew module, and the warning time available due to early detection. The approach is shown to support the launch vehicle design process by characterizing the risk drivers and identifying regions where failure detection would significantly reduce the risk to the crew.
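    The way the approach combines per-bin failure likelihoods with conditional crew-loss probabilities can be sketched as a weighted sum; the bin names, probabilities, and the single detection-effectiveness factor below are hypothetical stand-ins for the paper's dynamic physical models:

```python
# Hypothetical ascent failure bins: probability of the initiating
# failure, and conditional probability of loss of crew (LOC) given the
# resulting blast / debris / fireball environment.
FAILURE_BINS = [
    # (name, P(failure during ascent), P(LOC | failure))
    ("propulsion_explosion", 1.0e-3, 0.30),
    ("loss_of_control",      5.0e-4, 0.15),
    ("stage_sep_failure",    2.0e-4, 0.05),
]

def loss_of_crew_probability(bins, detection_effectiveness=0.0):
    """Total P(LOC) = sum over bins of P(fail) * P(LOC | fail), with
    early failure detection scaling down the conditional loss term."""
    return sum(p_fail * p_loc * (1.0 - detection_effectiveness)
               for _, p_fail, p_loc in bins)

print(loss_of_crew_probability(FAILURE_BINS))       # no warning time
print(loss_of_crew_probability(FAILURE_BINS, 0.5))  # detection halves the loss term
```

    In the paper the conditional term varies with initiator, failure time, crew-module robustness, and warning time rather than being a single constant; the sketch only shows how the pieces combine into a crew-risk figure that can rank design options.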

  19. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  20. Arctic Risk Management (ARMNet) Network: Linking Risk Management Practitioners and Researchers Across the Arctic Regions of Canada and Alaska To Improve Risk, Emergency and Disaster Preparedness and Mitigation Through Comparative Analysis and Applied Research

    NASA Astrophysics Data System (ADS)

    Garland, A.

    2015-12-01

    The Arctic Risk Management Network (ARMNet) was conceived as a trans-disciplinary hub to encourage and facilitate greater cooperation, communication and exchange among American and Canadian academics and practitioners actively engaged in the research, management and mitigation of risks, emergencies and disasters in the Arctic regions. Its aim is to assist regional decision-makers through the sharing of applied research and best practices and to support greater inter-operability and bilateral collaboration through improved networking, joint exercises, workshops, teleconferences, radio programs, and virtual communications (e.g., webinars). Most importantly, ARMNet is a clearinghouse for all information related to the management of the frequent hazards of Arctic climate and geography in North America, including new and emerging challenges arising from climate change, increased maritime polar traffic and expanding economic development in the region. ARMNet is an outcome of the Arctic Observing Network (AON) for Long Term Observations, Governance, and Management Discussions, www.arcus.org/search-program. The AON goals continue with CRIOS (www.ariesnonprofit.com/ARIESprojects.php) and coastal erosion research (www.ariesnonprofit.com/webinarCoastalErosion.php) led by the North Slope Borough Risk Management Office with assistance from ARIES (Applied Research in Environmental Sciences Nonprofit, Inc.). The constituency for ARMNet will include all northern academics and researchers, Arctic-based corporations, First Responders (FRs), Emergency Management Offices (EMOs) and Risk Management Offices (RMOs), military, Coast Guard, northern police forces, Search and Rescue (SAR) associations, boroughs, territories and communities throughout the Arctic. This presentation will be of interest to all those engaged in Arctic affairs; it describes the genesis of ARMNet and presents the results of stakeholder meetings and webinars designed to guide the next stages of the project.

  1. Budget Risk & Prioritization Analysis Tool

    2010-12-31

    BRPAtool performs the following:
    • Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate to meet constrained budgets, based on multiple risk factors
    • Enables analysis of different budget scenarios
    • Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks
    • Real-time analysis
    • Enables managers to determine the multipliers and where funding is best applied
    • Promotes solid budget defense

  2. Quantitative risk analysis applied to innocuity and potency tests on the oil-adjuvanted vaccine against foot and mouth disease in Argentina.

    PubMed

    Cané, B G; Rodríguez Toledo, J; Falczuk, A; Leanes, L F; Manetti, J C; Maradei, E; Verde, P

    1995-12-01

    The authors describe the method used in Argentina for quantification of risk in controls of the potency and innocuity of foot and mouth disease vaccine. Quantitative risk analysis is a relatively new tool in the animal health field, and is in line with the principles of transparency and equivalency of the Sanitary and Phytosanitary Agreement of the Uruguay Round of the General Agreement on Tariffs and Trade (GATT: now World Trade Organisation [WTO]). The risk assessment is presented through a description of the steps involved in manufacturing the vaccine, and the controls performed by the manufacturer and by the National Health Animal Service (Servicio Nacional de Sanidad Animal: SENASA). The adverse situation is considered as the lack of potency or innocuity of the vaccine, and the risk is estimated using a combination of the Monte Carlo simulation and the application of a Bayesian model.
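    The combination of Monte Carlo simulation with a Bayesian model can be sketched with a beta-binomial estimate of the batch failure probability; the prior, the number of batches tested, and the failure count below are hypothetical, not Argentina's control data:

```python
import random

def posterior_failure_rate(batches_tested, failures, a=1.0, b=1.0):
    """A Beta(a, b) prior updated with binomial test outcomes gives a
    Beta(a + failures, b + passes) posterior for the probability that a
    vaccine batch lacks potency or innocuity."""
    return a + failures, b + (batches_tested - failures)

def simulate_risk(a_post, b_post, n_draws=50_000, seed=11):
    """Monte Carlo: draw the failure probability from its posterior and
    average, rather than relying on a single point estimate."""
    random.seed(seed)
    draws = [random.betavariate(a_post, b_post) for _ in range(n_draws)]
    return sum(draws) / n_draws

a_post, b_post = posterior_failure_rate(batches_tested=200, failures=1)
print(simulate_risk(a_post, b_post))  # near the posterior mean, 2/202
```

    Sampling from the posterior (instead of plugging in 1/200) propagates the uncertainty from the limited number of tested batches into the final risk figure, which is the transparency the SPS Agreement context calls for.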

  3. Conversation Analysis and Applied Linguistics.

    ERIC Educational Resources Information Center

    Schegloff, Emanuel A.; Koshik, Irene; Jacoby, Sally; Olsher, David

    2002-01-01

    Offers biographical guidance on several major areas of conversation-analytic work--turn-taking, repair, and word selection--and indicates past or potential points of contact with applied linguistics. Also discusses areas of applied linguistic work. (Author/VWL)

  4. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  5. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  6. Evaluation of Cardiovascular Risk Scores Applied to NASA's Astronaut Corps

    NASA Technical Reports Server (NTRS)

    Jain, I.; Charvat, J. M.; VanBaalen, M.; Lee, L.; Wear, M. L.

    2014-01-01

    In an effort to improve cardiovascular disease (CVD) risk prediction, this analysis evaluates and compares the applicability of multiple CVD risk scores to the NASA Astronaut Corps which is extremely healthy at selection.

  7. Risk/Stress Analysis.

    ERIC Educational Resources Information Center

    Schwerdtfeger, Don; Howell, Richard E.

    1986-01-01

    Identifies stress as a definite health hazard and risk factor involved in a variety of health situations. Proposes that stress identification efforts be considered in environmental analysis so that a more complete approach to risk assessment and management and health hazard prevention can occur. (ML)

  8. Is risk analysis scientific?

    PubMed

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks. The model covers five elements: evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; it relates these elements to the domains of experts and decisionmakers, and to fact-based and value-based domains. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part).

  9. Applying a weed risk assessment approach to GM crops.

    PubMed

    Keese, Paul K; Robold, Andrea V; Myers, Ruth C; Weisman, Sarah; Smith, Joe

    2014-12-01

    Current approaches to environmental risk assessment of genetically modified (GM) plants are modelled on chemical risk assessment methods, which have a strong focus on toxicity. There are additional types of harms posed by plants that have been extensively studied by weed scientists and incorporated into weed risk assessment methods. Weed risk assessment uses robust, validated methods that are widely applied to regulatory decision-making about potentially problematic plants. They are designed to encompass a broad variety of plant forms and traits in different environments, and can provide reliable conclusions even with limited data. The knowledge and experience that underpin weed risk assessment can be harnessed for environmental risk assessment of GM plants. A case study illustrates the application of the Australian post-border weed risk assessment approach to a representative GM plant. This approach is a valuable tool to identify potential risks from GM plants.

  10. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.
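    The encoding step the handbook emphasizes, turning an expert's subjective estimate into a usable probability distribution, can be sketched with a common three-point (low / most likely / high) elicitation mapped to a triangular distribution. This is a minimal illustration of the general technique, not the handbook's own procedure, and the cost figures are hypothetical:

```python
import random

def sample_triangular(low, mode, high, n, seed=0):
    """Encode a subjective three-point estimate (low, most likely, high)
    as a triangular distribution and draw Monte Carlo samples from it."""
    rng = random.Random(seed)
    # random.triangular takes (low, high, mode)
    return [rng.triangular(low, high, mode) for _ in range(n)]

# Hypothetical expert estimate of a task's cost in $M:
# low = 2, most likely = 5, high = 12
samples = sample_triangular(2.0, 5.0, 12.0, n=10_000)
mean_cost = sum(samples) / len(samples)
# Analytically, the triangular mean is (low + mode + high) / 3
```

A risk analyst would aggregate many such encoded distributions (one per cost or schedule element) in a program-level simulation.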

  11. Targeted assets risk analysis.

    PubMed

    Bouwsema, Barry

    2013-01-01

    Risk assessments utilising the consolidated risk assessment process as described by Public Safety Canada and the Centre for Security Science utilise the five threat categories of natural, human accidental, technological, human intentional and chemical, biological, radiological, nuclear or explosive (CBRNE). The categories of human intentional and CBRNE indicate intended actions against specific targets. It is therefore necessary to be able to identify which pieces of critical infrastructure represent the likely targets of individuals with malicious intent. Using the consolidated risk assessment process and the target capabilities list, coupled with the CARVER methodology and a security vulnerability analysis, it is possible to identify these targeted assets and their weaknesses. This process can help emergency managers to identify where resources should be allocated and funding spent. Targeted Assets Risk Analysis (TARA) presents a new opportunity to improve how risk is measured, monitored, managed and minimised through the four phases of emergency management, namely, prevention, preparation, response and recovery. To reduce risk throughout Canada, Defence Research and Development Canada is interested in researching the potential benefits of a comprehensive approach to risk assessment and management. The TARA provides a framework against which potential human intentional threats can be measured and quantified, thereby improving safety for all Canadians.

  13. How to ensure that the results of climate risk analysis make a difference? - Experience from applied research addressing the challenges of climate change

    NASA Astrophysics Data System (ADS)

    Schneiderbauer, Stefan; Zebisch, Marc; Becker, Daniel; Pedoth, Lydia; Renner, Kathrin; Kienberger, Stefan

    2016-04-01

    Changing climate conditions may have beneficial or adverse effects on the social-ecological systems we are living in. In any case, the possible effects result from complex and interlinked physical and social processes embedded in these systems. Traditional research addresses these bio-physical and societal issues separately. As a result, studies on risks related to climate change are, in general, still mono-disciplinary in nature, with a growing share of work following a multi-disciplinary approach. The quality and usefulness of the results of such research for policy or decision making in practice may further be limited by study designs that do not appropriately acknowledge the significance of integrating, or at least mixing, qualitative and quantitative information and knowledge. Finally, the acceptance of study results, particularly when they contain some kind of assessment, is often endangered by insufficient and/or late involvement of stakeholders and users. The above-mentioned limitations have often been brought up in the recent past. However, although a certain consensus has been reached in recent years on the need to tackle these issues, little progress has been made in terms of implementation within the context of (research) studies. This paper elaborates in detail on reasons that hamper the application of - interdisciplinary (i.e. natural and social science), - trans-disciplinary (i.e. co-production of knowledge) and - integrative (i.e. combining qualitative and quantitative approaches) work. It is based on the experience gained through a number of applied climate change vulnerability studies carried out within the context of various GIZ-financed development cooperation projects, a consultancy project for the German Environment Agency as well as the workshop series INQUIMUS, which tackles particularly the issues of mixing qualitative and quantitative research approaches. Potentials and constraints of possible attempts for

  14. The Andrews’ Principles of Risk, Need, and Responsivity as Applied in Drug Abuse Treatment Programs: Meta-Analysis of Crime and Drug Use Outcomes

    PubMed Central

    Prendergast, Michael L.; Pearson, Frank S.; Podus, Deborah; Hamilton, Zachary K.; Greenwell, Lisa

    2013-01-01

    Objectives The purpose of the present meta-analysis was to answer the question: Can the Andrews principles of risk, needs, and responsivity, originally developed for programs that treat offenders, be extended to programs that treat drug abusers? Methods Drawing from a dataset that included 243 independent comparisons, we conducted random-effects meta-regression and ANOVA-analog meta-analyses to test the Andrews principles by averaging crime and drug use outcomes over a diverse set of programs for drug abuse problems. Results For crime outcomes, in the meta-regressions the point estimates for each of the principles were substantial, consistent with previous studies of the Andrews principles. There was also a substantial point estimate for programs exhibiting a greater number of the principles. However, almost all of the 95% confidence intervals included the zero point. For drug use outcomes, in the meta-regressions the point estimates for each of the principles were approximately zero; however, the point estimate for programs exhibiting a greater number of the principles was somewhat positive. All of the estimates for the drug use principles had confidence intervals that included the zero point. Conclusions This study supports previous findings from primary research studies targeting the Andrews principles that those principles are effective in reducing crime outcomes, here in meta-analytic research focused on drug treatment programs. By contrast, programs that follow the principles appear to have very little effect on drug use outcomes. Primary research studies that experimentally test the Andrews principles in drug treatment programs are recommended. PMID:24058325

  15. DWPF risk analysis summary

    SciTech Connect

    Shedrow, C.B.

    1990-10-01

    This document contains selected risk analysis data from Chapter 9 (Safety Analysis) of the Defense Waste Processing Facility Safety Analysis Report (DWPF SAR) and draft Addendum 1 to the Waste Tank Farms SAR. Although these data may be revised prior to finalization of the draft SAR and the draft addendum, they are presently the best available information and were therefore used in preparing the risk analysis portion of the DWPF Environmental Analysis (DWPF EA). This information has been extracted from those draft documents and approved under separate cover so that it can be used as reference material for the DWPF EA when it is placed in the public reading rooms. 9 refs., 4 tabs.

  16. Risk analysis of exploration plays

    SciTech Connect

    Rose, P.R.

    1996-08-01

    The most difficult and crucial decision in petroleum exploration is not which prospect to drill, but rather, which new play to enter. Such a decision, whether ultimately profitable or not, commits the Organization to years of involvement, expenditures of $millions, and hundreds of man-years of effort. Even though uncertainties and risks are high, organizations commonly make the new-play decision in a disjointed, non-analytic, even superficial way. The economic consequences of a bad play choice can be disastrous. Using established principles of prospect risk analysis, modern petroleum exploration organizations routinely assign economic value to individual prospects, but they actually operate via exploration programs in plays and trends. Accordingly, the prospect is the economic unit of exploration, whereas the play is the operational unit. Plays can be successfully analyzed as full-cycle economic risk ventures, however, using many principles of prospect risk analysis. Economic measures such as Expected Present Value, DCFROR, etc. apply equally to plays or prospects. The predicted field-size distribution of the play is analogous to the forecast prospect reserves distribution. Economic truncation applies to both. Variance of play reserves is usually much greater than for prospect reserves. Geologic chance factors such as P{sub reservoir}, P{sub generation}, etc., must be distinguished as independent or shared among prospects in the play, so they should be defined so as to apply equally to the play and to its constituent prospects. They are analogous to multiple objectives on a prospect, and are handled differently in performing the risk analysis.
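    The economic truncation mentioned in the abstract can be illustrated by simulating a lognormal field-size distribution (a standard assumption in play analysis) and discarding discoveries below a minimum economic size. The distribution parameters and cutoff below are hypothetical, not taken from the paper:

```python
import random

def truncated_field_sizes(mu, sigma, min_economic, n=100_000, seed=1):
    """Simulate a lognormal field-size distribution and apply economic
    truncation: fields smaller than the minimum economic size are never
    developed, so they drop out of the expected-reserves calculation."""
    rng = random.Random(seed)
    sizes = [rng.lognormvariate(mu, sigma) for _ in range(n)]
    economic = [s for s in sizes if s >= min_economic]
    share_economic = len(economic) / n          # chance a discovery is economic
    mean_economic = sum(economic) / len(economic)
    return share_economic, mean_economic

# Hypothetical play: field sizes ~ lognormal(mu=3, sigma=1) MMbbl,
# with a 10 MMbbl minimum economic field size
share, mean_econ = truncated_field_sizes(mu=3.0, sigma=1.0, min_economic=10.0)
# Conditioning on economic size always raises the mean above the
# untruncated lognormal mean exp(mu + sigma**2 / 2)
```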

  18. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  19. Concept analysis of culture applied to nursing.

    PubMed

    Marzilli, Colleen

    2014-01-01

    Culture is an important concept, especially when applied to nursing. A concept analysis of culture is essential to understanding the meaning of the word. This article applies Rodgers' (2000) concept analysis template and provides a definition of the word culture as it applies to nursing practice. This article supplies examples of the concept of culture to aid the reader in understanding its application to nursing and includes a case study demonstrating components of culture that must be respected and included when providing health care.

  20. Applying risk perception theory to public health workforce preparedness training.

    PubMed

    Barnett, Daniel J; Balicer, Ran D; Blodgett, David W; Everly, George S; Omer, Saad B; Parker, Cindy L; Links, Jonathan M

    2005-11-01

    Since 9/11, public health has seen a progressive culture change toward a 24/7 emergency response organizational model. This transition entails new expectations for public health workers, including (1) a readiness and willingness to report to duty in emergencies and (2) an ability to effectively communicate risk to an anxious public about terrorism or naturally occurring disasters. To date, however, research on readiness education for health department workers has focused little attention upon the risk perceptions that may influence their willingness to report to duty during disasters, as well as their ability to provide effective emergency risk communication to the public. Here, we apply risk perception factors to explore the potential barriers and remedies to effective public health workforce emergency response.

  1. Risks, scientific uncertainty and the approach of applying precautionary principle.

    PubMed

    Lo, Chang-fa

    2009-03-01

    The paper intends to clarify the nature and aspects of risks and scientific uncertainty and to elaborate the approach of applying the precautionary principle for the purpose of handling risk arising from scientific uncertainty. It explains the relations between risks and the application of the precautionary principle at international and domestic levels. Both in situations where an international treaty has admitted the precautionary principle and in situations where no international treaty admits the principle or enumerates the conditions for taking measures, the precautionary principle has a role to play. The paper proposes a decision-making tool, containing questions to be asked, to help policymakers apply the principle. It also proposes a "weighing and balancing" procedure to help them decide the contents of the measure to cope with the potential risk and to avoid excessive measures.

  2. Competing Uses of Underground Systems Related to Energy Supply: Applying Single- and Multiphase Simulations for Site Characterization and Risk-Analysis

    NASA Astrophysics Data System (ADS)

    Kissinger, A.; Walter, L.; Darcis, M.; Flemisch, B.; Class, H.

    2012-04-01

    Global climate change, shortage of resources and the resulting turn towards renewable sources of energy lead to a growing demand for the utilization of subsurface systems. Among these competing uses are Carbon Capture and Storage (CCS), geothermal energy, nuclear waste disposal, "renewable" methane or hydrogen storage as well as the ongoing production of fossil resources like oil, gas, and coal. Besides competing among themselves, these technologies may also create conflicts with essential public interests like water supply. For example, the injection of CO2 into the underground causes an increase in pressure reaching far beyond the actual radius of influence of the CO2 plume, potentially leading to large amounts of displaced salt water. Finding suitable sites is a demanding task for several reasons. Natural systems, as opposed to technical systems, are always characterized by heterogeneity. Therefore, parameter uncertainty impedes reliable predictions of the capacity and safety of a site. State-of-the-art numerical simulations combined with stochastic approaches need to be used to obtain a more reliable assessment of the involved risks and the radii of influence of the different processes. These simulations may include the modeling of single- and multiphase non-isothermal flow, geo-chemical and geo-mechanical processes in order to describe all relevant physical processes adequately. Stochastic approaches aim to estimate a bandwidth of the key output parameters based on uncertain input parameters. Risks for these different underground uses can then be made comparable with each other. Along with the importance and urgency of the competing processes, this may lead to a more profound basis for a decision. Communicating risks to stakeholders and a concerned public is crucial for the success of finding a suitable site for CCS (or other subsurface utilization). We present and discuss first steps towards an approach for addressing the issue of competitive

  3. Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.

    PubMed

    Reyes Santos, Joost; Haimes, Yacov Y

    2004-06-01

    The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz a 1992 Nobel Prize in Economics. A typical approach in measuring a portfolio's expected return is based on the historical returns of the assets included in a portfolio. On the other hand, portfolio risk is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility is a major measure of risk owing to its simplicity and validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that on October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attack of September 11, 2001 that led to a four-day suspension of trading on the New York Stock Exchange (NYSE) are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of an extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of the possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software tool that implements a genetic algorithm.
Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model
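    The extreme-risk measure f(4) described above, a conditional expectation over a lower-tail partition of the return distribution, can be approximated from sampled returns. This is a minimal sketch of the general idea only; the tail probability and the return distribution below are illustrative assumptions, not values from the article:

```python
import random

def lower_tail_expectation(returns, tail_prob=0.05):
    """Approximate a PMRM-style extreme-risk measure: the conditional
    expectation of returns falling in the lower tail of their distribution."""
    ordered = sorted(returns)
    k = max(1, int(len(ordered) * tail_prob))  # number of tail observations
    tail = ordered[:k]
    return sum(tail) / len(tail)

# Hypothetical annual portfolio returns: mean 8%, volatility 20%
rng = random.Random(42)
returns = [rng.gauss(0.08, 0.2) for _ in range(50_000)]
f4 = lower_tail_expectation(returns, tail_prob=0.05)
# f4 lies well below the mean return, capturing extreme downside risk
```

Optimizing expected return against this tail measure (rather than against volatility) is the two-objective tradeoff the article's formulation sets up.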

  5. Quantitative Microbial Risk Assessment Tutorial: Pour Point Analysis of Land-applied Microbial Loadings and Comparison of Simulated and Gaging Station Results

    EPA Science Inventory

    This tutorial demonstrates a pour point analysis:
    • Initiates execution of the SDMPB.
    • Navigates the SDMPB.
    • Chooses a pour point within a watershed, delineates the sub-area that contributes to that pour point, and collects data for it.
    • Considers land applicat...

  6. Defining applied behavior analysis: An historical analogy

    PubMed Central

    Deitz, Samuel M.

    1982-01-01

    This article examines two criteria for a definition of applied behavior analysis. The criteria are derived from a 19th century attempt to establish medicine as a scientific field. The first criterion, experimental determinism, specifies the methodological boundaries of an experimental science. The second criterion, philosophic doubt, clarifies the tentative nature of facts and theories derived from those facts. Practices which will advance the science of behavior are commented upon within each criterion. To conclude, the problems of a 19th century form of empiricism in medicine are related to current practices in applied behavior analysis. PMID:22478557

  7. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  8. Applied mathematics analysis of the multibody systems

    NASA Astrophysics Data System (ADS)

    Sahin, H.; Kar, A. K.; Tacgin, E.

    2012-08-01

    A methodology is developed for the analysis of multibody systems and applied to a vehicle as a case study. The previous study emphasized the derivation of the multibody dynamics equations of motion for a bogie [2]. In this work, we have developed a guideline for analyzing the dynamical behavior of multibody systems, mainly for validation and verification of the realistic mathematical model and partly for the design of alternative optimum vehicle parameters.

  9. Applied behavior analysis and statistical process control?

    PubMed Central

    Hopkins, B L

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156

  10. Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  11. Risk Analysis Virtual ENvironment

    SciTech Connect

    2014-02-10

    RAVEN has 3 major functionalities: 1. Provides a Graphical User Interface for the pre- and post-processing of the RELAP-7 input and output. 2. Provides the capability to model nuclear power plant control logic for the RELAP-7 code and dynamic control of the accident scenario evolution. This capability is based on a software structure that realizes a direct connection between the RELAP-7 solver engine (MOOSE) and a Python environment where the variables describing the plant status are accessible in a scripting environment. RAVEN supports the generation of the probabilistic scenario control by supplying a wide range of probability and cumulative distribution functions and their inverse functions. 3. Provides a general environment to perform probabilistic risk analysis for RELAP-7, RELAP-5 and any generic MOOSE-based application. The probabilistic analysis is performed by sampling the input space of the coupled code parameters and is enhanced by modern artificial intelligence algorithms that accelerate the identification of the areas of major risk (in the input parameter space). This environment also provides a graphical visualization capability to analyze the outcomes. Among other approaches, the classical Monte Carlo and Latin Hypercube sampling algorithms are available. To accelerate the convergence of the sampling methodologies, Support Vector Machines, Bayesian regression, and stochastic collocation polynomial chaos are implemented. The same methodologies described here could be used to solve optimization and uncertainty propagation problems using the RAVEN framework.
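    The Latin Hypercube sampling named in the abstract can be sketched generically; this shows only the stratification idea and makes no use of RAVEN's actual interfaces. Each input dimension is divided into equal-probability strata, and each stratum is sampled exactly once:

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Generate a Latin Hypercube sample on [0, 1)^n_dims: each dimension
    is split into n_samples equal strata, each sampled exactly once."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)  # random pairing of strata with sample indices
        # One uniform draw inside each assigned stratum
        columns.append([(s + rng.random()) / n_samples for s in strata])
    # Assemble rows: sample i takes the i-th entry of every dimension's column
    return [[columns[d][i] for d in range(n_dims)] for i in range(n_samples)]

pts = latin_hypercube(10, 2)
# Projected onto either axis, exactly one point falls in each
# interval [k/10, (k+1)/10), unlike plain Monte Carlo sampling
```

Compared with plain Monte Carlo, this stratification covers the input space more evenly for the same number of code runs, which is why it is a standard choice when each sample requires an expensive simulation.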

  13. Caldwell University's Department of Applied Behavior Analysis.

    PubMed

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis. PMID:27606194

  14. Applying RESRAD-CHEM for chemical risk assessment

    SciTech Connect

    Cheng, J.J.; Yu, C.

    1995-07-01

    RESRAD-CHEM is a multiple pathway analysis computer code to evaluate chemically contaminated sites; it was developed at Argonne National Laboratory for the US Department of Energy. The code is designed to predict human health risks from exposure to hazardous chemicals and to derive cleanup criteria for chemically contaminated soils. It consists of environmental fate and transport models and is capable of predicting chemical concentrations over time in different environmental media. The methodology used in RESRAD-CHEM for exposure assessment and risk characterization follows the US Environmental Protection Agency's guidance on Human Health Evaluation for Superfund. A user-friendly interface is incorporated for entering data, operating the code, and displaying results. RESRAD-CHEM is easy to use and is a powerful tool to assess chemical risk from environmental exposure.

  15. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  16. Security risk assessment: applying the concepts of fuzzy logic.

    PubMed

    Bajpai, Shailendra; Sachdeva, Anish; Gupta, J P

    2010-01-15

    Chemical process industries (CPI) handling hazardous chemicals in bulk can be attractive targets for deliberate adversarial actions by terrorists, criminals, and disgruntled employees. It is therefore imperative to have a comprehensive security risk management programme, including effective security risk assessment techniques. In earlier work, it was shown that security risk assessment can be done by conducting threat and vulnerability analysis or by developing a Security Risk Factor Table (SRFT). HAZOP-type vulnerability assessment sheets can be developed that are scenario based. In the SRFT model, important security risk bearing factors such as location, ownership, visibility, inventory, etc., have been used. In this paper, the earlier SRFT model has been modified using the concepts of fuzzy logic. In the modified SRFT model, two linguistic fuzzy scales (three-point and four-point) are devised based on trapezoidal fuzzy numbers. The human subjectivity of the different experts associated with the previous SRFT model is tackled by mapping their scores to the newly devised fuzzy scales. Finally, the fuzzy score thus obtained is defuzzified to get the results. A test case of a refinery is used to explain the method, and the results are compared with the earlier work.
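
The defuzzification step can be sketched with the centroid method, one common choice; the trapezoidal scale values and expert ratings below are illustrative, not those of the modified SRFT model.

```python
def trapezoid_centroid(a, b, c, d):
    """Centroid (x coordinate) of a trapezoidal fuzzy number (a, b, c, d)
    with support [a, d] and core [b, c], via the standard closed form."""
    num = (d**2 + c**2 + c*d) - (a**2 + b**2 + a*b)
    den = 3.0 * ((d + c) - (a + b))
    return num / den

# Hypothetical three-point linguistic scale for one risk factor
scale = {
    "low":    (0.0, 0.0, 0.2, 0.4),
    "medium": (0.3, 0.45, 0.55, 0.7),
    "high":   (0.6, 0.8, 1.0, 1.0),
}

# Map several experts' linguistic scores to crisp values and average
ratings = ["medium", "high", "medium"]
crisp = [trapezoid_centroid(*scale[r]) for r in ratings]
risk_score = sum(crisp) / len(crisp)
```

The centroid collapses each linguistic rating to a single number on the factor scale, so differing expert judgments can be combined arithmetically.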

  17. What happened to analysis in applied behavior analysis?

    PubMed

    Pierce, W D; Epling, W F

    1980-01-01

    This paper addresses the current help-oriented focus of researchers in applied behavior analysis. Evidence from a recent volume of JABA suggests that analytic behavior is at low levels in applied analysis while cure-help behavior is at high strength. This low proportion of scientific behavior is apparently related to cure-help contingencies set by institutions and agencies of help and by the editorial policies of JABA itself. These contingencies have favored the flight to real people and a concern with client gains, evaluation, and outcome strategies rather than the analysis of contingencies of reinforcement controlling human behavior. In this regard, the paper documents the current separation of applied behavior analysis from the experimental analysis of behavior. There is limited use of basic principles in applied analysis today and almost no reference to current research in the experimental analysis of behavior involving concurrent operants and adjunctive behavior. This divorce of applied behavior research from the experimental analysis of behavior will militate against progress toward a powerful technology of behavior. In order to encourage a return to analysis in applied research, there is a need to reconsider the objectives of applied behavior analysis. The original purpose of behavioral technology is examined, and a redefinition of the concept of "social importance" is presented that can direct applied researchers toward an analytic focus. At the same time, a change in the publication policies of applied journals such as JABA toward analytic research, and the design of new educational contingencies for students, will ensure the survival of analysis in applied behavior analysis. PMID:22478471

  18. Applied Pharmaceutical Analysis India 2014 conference report.

    PubMed

    Kole, Prashant; Barot, Deepak; Kotecha, Jignesh; Raina, Vijay; Rao, Mukkavilli; Yadav, Manish

    2014-01-01

    Applied Pharmaceutical Analysis (APA) India, 23-26 February 2014, Ahmedabad, India. The fifth Applied Pharmaceutical Analysis (APA) India meeting was held in February 2014 at the Hyatt, Ahmedabad, India. With the theme of 'The Science of Measurement: Current status and Future trends in Bioanalysis, Biotransformation and Drug Discovery Platforms', the conference was attended by over 160 delegates. The agenda comprised advanced and relevant research topics in the key areas of bioanalysis and drug metabolism. APA India 2014 provided a unique platform for networking and professional links among participants, innovators, and policy-makers. As part of the global research community, APA India continues to grow and receive considerable attention from the drug discovery and development community of India.

  19. Tropospheric Delay Raytracing Applied in VLBI Analysis

    NASA Astrophysics Data System (ADS)

    MacMillan, D. S.; Eriksson, D.; Gipson, J. M.

    2013-12-01

    Tropospheric delay modeling error continues to be one of the largest sources of error in VLBI analysis. For standard operational solutions, we use the VMF1 elevation-dependent mapping functions derived from ECMWF data. These mapping functions assume that tropospheric delay at a site is azimuthally symmetric. As this assumption does not reflect reality, we have determined the raytrace delay along the signal path through the troposphere for each VLBI quasar observation. We determined the troposphere refractivity fields from the pressure, temperature, specific humidity and geopotential height fields of the NASA GSFC GEOS-5 numerical weather model. We discuss results from analysis of the CONT11 R&D and the weekly operational R1+R4 experiment sessions. When applied in VLBI analysis, baseline length repeatabilities were better for 66-72% of baselines with raytraced delays than with VMF1 mapping functions. Vertical repeatabilities were better for 65% of sites.

  20. Positive Behavior Support and Applied Behavior Analysis

    PubMed Central

    Johnston, J.M; Foxx, Richard M; Jacobson, John W; Green, Gina; Mulick, James A

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We also consider the features of PBS that have facilitated its broad dissemination and how ABA might benefit from emulating certain practices of the PBS movement. PMID:22478452

  1. Wavelet analysis applied to the IRAS cirrus

    NASA Technical Reports Server (NTRS)

    Langer, William D.; Wilson, Robert W.; Anderson, Charles H.

    1994-01-01

    The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.
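
A Laplacian pyramid decomposition of this kind can be sketched in a few lines. This minimal version uses a 2x2 box filter for the REDUCE step and nearest-neighbour expansion, rather than the Burt-Adelson Gaussian kernel typically used in practice.

```python
import numpy as np

def blur_downsample(img):
    """2x2 box blur and decimation: one REDUCE step of a simple pyramid."""
    h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2
    img = img[:h, :w]
    return 0.25 * (img[0::2, 0::2] + img[1::2, 0::2]
                   + img[0::2, 1::2] + img[1::2, 1::2])

def upsample(img, shape):
    """Nearest-neighbour EXPAND back to the finer grid."""
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)[:shape[0], :shape[1]]

def laplacian_pyramid(img, levels):
    """Band-pass levels (fine to coarse) plus the final low-pass residual."""
    bands = []
    for _ in range(levels):
        low = blur_downsample(img)
        bands.append(img - upsample(low, img.shape))  # detail at this scale
        img = low
    bands.append(img)  # coarsest low-pass residual
    return bands
```

Each band isolates structure at one spatial scale, and summing the expanded levels back up recovers the original map, which is what makes the decomposition useful for scale-by-scale feature extraction.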

  2. Sneak analysis applied to process systems

    NASA Astrophysics Data System (ADS)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid-1970s as a means of identifying such conditions in electric circuits, in which field it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  3. Multidimensional Risk Analysis: MRISK

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond; Brown, Douglas; O'Shea, Sarah Beth; Reith, William; Rabulan, Jennifer; Melrose, Graeme

    2015-01-01

    Multidimensional Risk (MRISK) calculates the combined multidimensional score using Mahalanobis distance. MRISK accounts for covariance between consequence dimensions, de-conflicting their interdependencies and providing a clearer depiction of risks. Additionally, in the event the dimensions are not correlated, Mahalanobis distance reduces to Euclidean distance normalized by the variance and, therefore, represents the most flexible and optimal method to combine dimensions. MRISK is currently being used in NASA's Environmentally Responsible Aviation (ERA) project to assess risk and prioritize scarce resources.
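
The combined score can be illustrated directly from its definition. In this sketch the consequence vector is measured from the zero-consequence origin, and the covariance values are invented, not NASA's.

```python
import numpy as np

def mrisk_score(consequences, cov):
    """Combined score: Mahalanobis length of the consequence vector.

    consequences: scores on each consequence dimension (e.g. cost,
    schedule), measured from zero consequence.
    cov: covariance matrix between the consequence dimensions.
    """
    c = np.asarray(consequences, dtype=float)
    return float(np.sqrt(c @ np.linalg.solve(np.asarray(cov, dtype=float), c)))

# Correlated cost and schedule consequences (illustrative numbers)
cov = np.array([[1.0, 0.6],
                [0.6, 1.0]])
correlated = mrisk_score([2.0, 2.0], cov)

# With the off-diagonal terms zeroed, the score reduces to Euclidean
# distance normalised by the per-dimension variance
diag = np.diag(np.diag(cov))
uncorrelated = mrisk_score([2.0, 2.0], diag)
```

With positive correlation the two dimensions partly restate the same consequence, so the combined score is lower than the variance-normalised Euclidean value, which is the de-conflicting effect described above.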

  4. Applied spectrophotometry: analysis of a biochemical mixture.

    PubMed

    Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) as opposed to determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience and also to fill a hole that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three-week laboratory experience designed so that students learn to: connect laboratory practice with theory, apply the Beer-Lambert-Bouguer Law to biochemical analyses, demonstrate the utility and limitations of example quantitative colorimetric assays, demonstrate the utility and limitations of UV analyses for biomolecules, develop strategies for analysis of a solution of unknown biomolecular composition, use digital micropipettors to make accurate and precise measurements, and apply graphing software. PMID:23625877
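
The law and its extension to mixtures can be sketched briefly: absorbances of independent absorbers add, so measuring at as many wavelengths as there are species yields a solvable linear system. The extinction coefficients and absorbances below are illustrative numbers, not values from the laboratory exercise.

```python
import numpy as np

# Beer-Lambert-Bouguer law: A = epsilon * l * c
def concentration(absorbance, epsilon, path_cm=1.0):
    """Concentration of a single absorber (units follow epsilon)."""
    return absorbance / (epsilon * path_cm)

# Mixture: A_i = sum_j eps[i][j] * l * c_j at each wavelength i.
# Illustrative molar extinction coefficients (M^-1 cm^-1) for two
# hypothetical species measured at two wavelengths:
eps = np.array([[6600.0, 1500.0],    # wavelength 1
                [3400.0, 5700.0]])   # wavelength 2
measured = np.array([0.75, 0.62])    # absorbances, 1 cm path
c = np.linalg.solve(eps, measured)   # concentration of each species
```

This is the quantitative core of the mixture problem the students face: one absorbance reading cannot separate two species, but two readings at wavelengths where the species absorb differently can.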

  6. Multiattribute risk analysis in nuclear emergency management.

    PubMed

    Hämäläinen, R P; Lindstedt, M R; Sinkko, K

    2000-08-01

    Radiation protection authorities have seen a potential for applying multiattribute risk analysis in nuclear emergency management and planning to deal with conflicting objectives, different parties involved, and uncertainties. This type of approach is expected to help in the following areas: to ensure that all relevant attributes are considered in decision making; to enhance communication between the concerned parties, including the public; and to provide a method for explicitly including risk analysis in the process. A multiattribute utility theory analysis was used to select a strategy for protecting the population after a simulated nuclear accident. The value-focused approach and the use of a neutral facilitator were identified as being useful. PMID:11051070
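
A multiattribute utility ranking of protection strategies can be sketched with the common additive form, which applies when the attributes are preferentially independent. The attributes, weights, and single-attribute utilities below are invented for illustration and are not those of the simulated exercise.

```python
# Additive multiattribute utility: U(strategy) = sum_k w_k * u_k(x_k)

weights = {"averted_dose": 0.5, "cost": 0.3, "social_disruption": 0.2}

strategies = {
    # single-attribute utilities u_k already scaled to [0, 1]
    "sheltering":  {"averted_dose": 0.4, "cost": 0.9, "social_disruption": 0.8},
    "evacuation":  {"averted_dose": 0.9, "cost": 0.2, "social_disruption": 0.1},
    "iodine_only": {"averted_dose": 0.3, "cost": 0.95, "social_disruption": 0.9},
}

def utility(attrs):
    """Weighted sum of the single-attribute utilities."""
    return sum(weights[k] * attrs[k] for k in weights)

ranked = sorted(strategies, key=lambda s: utility(strategies[s]), reverse=True)
```

The weights make the value trade-offs explicit, which is what supports communication between the concerned parties: changing the weight on social disruption, for example, can change the ranking.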

  7. Reliability analysis applied to structural tests

    NASA Technical Reports Server (NTRS)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.
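
The safe-life side of such a calculation can be sketched under a simple assumption of lognormally distributed fatigue life. The fitted parameters and target risk below are invented, and the paper's crack-growth treatment is considerably more elaborate.

```python
import math

def lognormal_cdf(t, median_life, sigma_log):
    """P(fatigue life <= t) for a lognormal life distribution fitted
    to test data (median life and log standard deviation)."""
    z = (math.log(t) - math.log(median_life)) / sigma_log
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def safe_life(median_life, sigma_log, target_risk):
    """Largest service life whose failure probability stays below the
    prescribed level (simple bisection on the monotone CDF)."""
    lo, hi = 1e-9, median_life
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if lognormal_cdf(mid, median_life, sigma_log) < target_risk:
            lo = mid
        else:
            hi = mid
    return lo

# Tests gave a median life of 60,000 flight hours with sigma_log = 0.4
# (illustrative numbers); accept a 1-in-1000 probability of failure
life = safe_life(60000.0, 0.4, 1e-3)
```

The same machinery inverts naturally: for a fail-safe structure one would instead fix the inspection interval and read off the risk accumulated between inspections.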

  8. Object Oriented Risk Analysis Workshop

    NASA Astrophysics Data System (ADS)

    Pons, M. Güell I.; Jaboyedoff, M.

    2009-04-01

    In the framework of the RISET Project (Interfaculty Network of Support to Education and Technology), an educational tool for introducing risk analysis has been developed. This workshop carries a group of students, in a role-play game, through a step-by-step process of risk identification and quantification. The aim is to assess risk in a characteristic alpine village with regard to natural hazards (rockfall, snow avalanche, flooding…), and the analysis is oriented to affected objects such as buildings and infrastructure. The workshop contains the following steps: 1. planning of the study and definition of stakeholders; 2. hazard identification; 3. risk analysis; 4. risk assessment; 5. proposition of mitigation measures; 6. risk management and cost-benefit analysis. During the process, information related to past events and useful concepts is provided in order to prompt discussion and decision making. The risk matrix and other graphical tools give a visual representation of the risk level and help to prioritize countermeasures. At the end of the workshop, groups can compare their results and print out a summary report. This approach provides a rapid and comprehensible risk evaluation. The workshop is accessible from the internet and will be used for educational purposes at bachelor and master level, as well as by external persons dealing with risk analysis.

  9. The Components of Microbiological Risk Analysis

    PubMed Central

    Bentley, Stefano; Giacometti, Federica; Serraino, Andrea

    2015-01-01

    The paper describes the process of risk analysis in a food safety perspective. The steps of risk analysis defined as a process consisting of three interconnected components (risk assessment, risk management, and risk communication) are analysed. The different components of the risk assessment, risk management and risk communication are further described. PMID:27800384

  10. Risk/benefit analysis

    SciTech Connect

    Crouch, E.A.C.; Wilson, R.

    1982-01-01

    The Reagan administration is intent on rolling back regulations it considers unwise in order to give new life to American industry, but those regulations were instituted to protect individuals against long-term hazards. The authors believe these hazards must be assessed before a regulation is modified, suspended, or implemented. They point out the problems inherent in defining, perceiving, and estimating risk. Throughout, they combine theoretical discussions with actual case studies covering the risks associated with nuclear power plants, saccharin use, mass chest radiography, and others. They believe that risk assessment should be distinct from decision making, with the risk assessor supplying clear and objective information about hazards and the probability of damage, as well as pointing out the uncertainties, to policy makers. 149 references, 29 figures, 8 tables.

  11. On differentiation in applied behavior analysis

    PubMed Central

    Fawcett, Stephen B.

    1985-01-01

    Distinct types of activity in the field of applied behavior analysis are noted and discussed. Four metaphorical types of activity are considered: prospecting, farming, building, and guiding. Prospecting consists of time-limited exploration of a variety of behaviors, populations, or settings. Farming consists of producing new behaviors in the same setting using independent variables provided by the researchers or normally available in the setting. Building consists of combining procedural elements to create new programs or systems or to rehabilitate aspects of existing programs. Guiding involves pointing out connections between the principles of human behavior and the problems, populations, settings, and procedures with which researchers are (or could be) working. Advantages of each sphere are noted, and benefits of this division of labor to the field as a whole are discussed. PMID:22478631

  12. Artificial intelligence technologies applied to terrain analysis

    SciTech Connect

    Wright, J.C.; Powell, D.R.

    1990-01-01

    The US Army Training and Doctrine Command is currently developing, in cooperation with Los Alamos National Laboratory, a corps-level combat simulation to support military analytical studies. This model emphasizes high-resolution modeling of the command and control processes, with particular attention to architectural considerations that enable extension of the model. A planned future extension is the inclusion of a computer-based planning capability for command echelons that can be dynamically invoked during the execution of the model. Command and control is the process through which the activities of military forces are directed, coordinated, and controlled to achieve the stated mission. To perform command and control, the commander must understand the mission, perform terrain analysis, and understand his own situation and capabilities as well as the enemy situation and his probable actions. To support computer-based planning, data structures must be available to support the computer's ability to "understand" the mission, terrain, own capabilities, and enemy situation. The availability of digitized terrain makes it feasible to apply artificial intelligence technologies to emulate the terrain analysis process, producing data structures for use in planning. The work done thus far to support the understanding of terrain is the topic of this paper. 13 refs., 5 figs., 6 tabs.

  13. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that, based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  14. Analysis of the interaction between experimental and applied behavior analysis.

    PubMed

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J

    2014-01-01

    To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis.

  15. Applying Four Different Risk Models in Local Ore Selection

    SciTech Connect

    Richmond, Andrew

    2002-12-15

    Given the uncertainty in grade at a mine location, a financially risk-averse decision-maker may prefer to incorporate this uncertainty into the ore selection process. A FORTRAN program, risksel, is presented to calculate local risk-adjusted optimal ore selections using a negative exponential utility function and three dominance models: mean-variance, mean-downside risk, and stochastic dominance. All four methods are demonstrated in a grade control environment. In the case study, the optimal selections vary with the magnitude of financial risk that a decision-maker is prepared to accept. Except for the stochastic dominance method, the risk models reassign material from higher cost to lower cost processing options as the aversion to financial risk increases. The stochastic dominance model was usually unable to determine the optimal local selection.
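
Risk-adjusted selection with a negative exponential utility function can be sketched as follows. The simulated profit outcomes are invented, and this is a minimal illustration of the idea, not the risksel program itself.

```python
import math

def expected_utility(profits, risk_aversion):
    """Mean negative-exponential utility u(p) = 1 - exp(-r * p) over
    equally likely simulated profit outcomes."""
    return sum(1.0 - math.exp(-risk_aversion * p) for p in profits) / len(profits)

# Simulated profits ($/t) of sending one block to the mill vs. the
# waste dump, reflecting grade uncertainty (illustrative values)
outcomes = {
    "mill":  [12.0, -4.0, 7.0, 15.0, -2.0],  # pays off if grade is high
    "waste": [0.0, 0.0, 0.0, 0.0, 0.0],      # certain zero
}

def select(risk_aversion):
    """Destination with the highest expected utility."""
    return max(outcomes, key=lambda d: expected_utility(outcomes[d], risk_aversion))
```

A nearly risk-neutral decision-maker sends the block to the mill (its expected profit is positive), while a strongly risk-averse one prefers the certain zero of the waste dump, mirroring how the optimal selection varies with the accepted level of financial risk.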

  16. Natural hazard management high education: laboratory of hydrologic and hydraulic risk management and applied geomorphology

    NASA Astrophysics Data System (ADS)

    Giosa, L.; Margiotta, M. R.; Sdao, F.; Sole, A.; Albano, R.; Cappa, G.; Giammatteo, C.; Pagliuca, R.; Piccolo, G.; Statuto, D.

    2009-04-01

    The Environmental Engineering Faculty of the University of Basilicata offers a higher-level course for students in the field of natural hazards. The curriculum provides expertise in the prediction, prevention, and management of earthquake risk, hydrologic-hydraulic risk, and geomorphological risk. These skills will contribute to the training of specialists who, in addition to a thorough knowledge of the genesis and phenomenology of natural risks, know how to interpret, evaluate, and monitor the dynamics of the environment and the territory. In addition to basic training in mathematics and physics, the course of study provides specific lessons on the seismic and structural dynamics of land, environmental and computational hydraulics, hydrology, and applied hydrogeology. In particular, the course organizes two connected examination subjects: Laboratory of hydrologic and hydraulic risk management, and Applied geomorphology. These courses involve the formulation and resolution of natural hazard problems through the study of a real natural disaster. In the last year, the project work concerned the collapse of two fluorspar decantation basins serving mines in the Stava Valley, northern Italy, on 19 July 1985. During the course, data and event information were collected, a guided tour of the disaster sites was organized, and finally mathematical models were applied to simulate the disaster and the results were analysed. The student work was presented in a public workshop.

  17. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  18. Risk and value analysis of SETI.

    PubMed

    Billingham, J

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  19. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  20. Applying risk assessment principles to a batch distillation column

    SciTech Connect

    Woodward, J.L.; Moosemiller, M.D.

    1996-12-31

    Some distillation columns in the chemical industry are operated in batch mode with a fairly short operating cycle. At the end of each cycle the columns are cooled and recharged. During the cooling cycle, air will be drawn into the column by the action of a vacuum relief valve. Consequently, for a finite portion of the operating cycle a flammable mixture will exist in the column. Here we evaluate the risk posed by such an operation to see whether a mitigation measure is justified. We develop a fault tree and estimate the frequency of ignition by all possible ignition sources. By comparing the risk reduction attainable by installing a lightning protection system with that attainable by using an inert blanketing system, the lightning protection system is found to be the preferred solution: it provides about the same risk reduction at a lower overall cost. 2 refs., 3 figs., 4 tabs.
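The fault-tree arithmetic this abstract relies on, combining independent ignition sources through an OR gate and then AND-ing with the fraction of the cycle during which a flammable mixture is present, can be sketched as follows. The probabilities used here are illustrative placeholders, not figures from the paper:

```python
def or_gate(probs):
    """P(at least one event occurs), assuming independent events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """P(all events occur), assuming independent events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical per-cycle ignition-source probabilities (not from the paper)
ignition = or_gate([1e-4, 5e-5, 2e-5])   # e.g. lightning, static, hot surface
flammable_fraction = 0.1                  # share of cycle with flammable mixture
p_event = and_gate([ignition, flammable_fraction])
```

A mitigation option is then compared by recomputing `p_event` with the corresponding source probability reduced, and weighing the risk reduction against its cost.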

  1. Applying the lessons of high risk industries to health care.

    PubMed

    Hudson, P

    2003-12-01

    High risk industries such as commercial aviation and the oil and gas industry have achieved exemplary safety performance. This paper reviews how they have managed to do that. The primary reasons are the positive attitudes towards safety and the operation of effective formal safety management systems. The safety culture provides an important explanation of why such organisations perform well. An evolutionary model of safety culture is provided in which there is a range of cultures from the pathological through the reactive to the calculative. Later, the proactive culture can evolve towards the generative organisation, an alternative description of the high reliability organisation. The current status of health care is reviewed, arguing that it has a much higher level of accidents and has a reactive culture, lagging behind both high risk industries studied in both attitude and systematic management of patient risks. PMID:14645741

  2. Applying the lessons of high risk industries to health care

    PubMed Central

    Hudson, P

    2003-01-01

    High risk industries such as commercial aviation and the oil and gas industry have achieved exemplary safety performance. This paper reviews how they have managed to do that. The primary reasons are the positive attitudes towards safety and the operation of effective formal safety management systems. The safety culture provides an important explanation of why such organisations perform well. An evolutionary model of safety culture is provided in which there is a range of cultures from the pathological through the reactive to the calculative. Later, the proactive culture can evolve towards the generative organisation, an alternative description of the high reliability organisation. The current status of health care is reviewed, arguing that it has a much higher level of accidents and has a reactive culture, lagging behind both high risk industries studied in both attitude and systematic management of patient risks. PMID:14645741

  3. Initial Decision and Risk Analysis

    SciTech Connect

    Engel, David W.

    2012-02-29

    Decision and Risk Analysis capabilities will be developed for industry consideration and possible adoption within Year 1. These tools will provide a methodology for merging qualitative ranking of technology maturity and acknowledged risk contributors with quantitative metrics that drive investment decision processes. Methods and tools will be initially introduced as applications to the A650.1 case study, but modular spreadsheets and analysis routines will be offered to industry collaborators as soon as possible to stimulate user feedback and co-development opportunities.

  4. Introduction: Conversation Analysis in Applied Linguistics

    ERIC Educational Resources Information Center

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  5. Experiences of Uav Surveys Applied to Environmental Risk Management

    NASA Astrophysics Data System (ADS)

    Caprioli, M.; Trizzino, R.; Mazzone, F.; Scarano, M.

    2016-06-01

    In this paper the results of some surveys carried out in an area of the Apulian territory affected by serious environmental hazard are presented. Unmanned Aerial Vehicles (UAVs) are emerging as a key engineering tool for future environmental survey tasks. UAVs are increasingly seen as an attractive low-cost alternative or supplement to aerial and terrestrial photogrammetry due to their low cost, flexibility, availability and readiness for duty. In addition, UAVs can be operated in hazardous or temporarily inaccessible locations, which makes them very suitable for the assessment and management of environmental risk conditions. In order to verify the reliability of these technologies, a UAV survey and a LIDAR survey have been carried out along about 1 km of coast in the Salento peninsula, near the towns of San Foca, Torre dell'Orso and Sant'Andrea (Lecce, Southern Italy). This area is affected by serious environmental risks due to the presence of dangerous rocky cliffs, named "falesie". The UAV platform was equipped with a photogrammetric measurement system that allowed us to obtain a mobile mapping of the fractured fronts of the dangerous rocky cliffs. The UAV image data have been processed using dedicated software (Agisoft PhotoScan). The point clouds obtained from both the UAV and LIDAR surveys have been processed using the CloudCompare software, with the aim of testing the UAV results against the LIDAR ones. The total error obtained was of centimeter order, which is a very satisfactory result. The environmental information has been arranged in an ArcGIS platform in order to assess the risk levels. The possibility to repeat the survey at time intervals, more or less close together depending on the measured levels of risk, and to compare the outputs allows following the trend of the dangerous phenomena.
In conclusion, for inaccessible locations of dangerous rocky bodies the UAV survey coupled with GIS methodology proved to be a key engineering tool for the management of environmental

  6. Cancer Risk Assessment: Should New Science be Applied? Workgroup summary

    SciTech Connect

    Richard J. Bull; Antone L. Brooks

    2002-12-15

    OAK-B135. A symposium discussed the implications of certain phenomena observed in radiation biology for cancer risk assessment in general. In July of 2002 a workshop was convened that explored some of the intercellular phenomena that appear to condition responses to carcinogen exposure. Effects that result from communication between cells that appear either to increase the sphere of damage or to modify the sensitivity of cells to further damage were of particular interest. Much of the discussion focused on the effects of ionizing radiation that were transmitted from cells directly hit to cells not receiving direct exposure to radiation (bystander cells). In cell culture, increased rates of mutation, chromosomal aberration, apoptosis, genomic instability, and decreased clonogenic survival have all been observed in cells that have experienced no direct radiation. In addition, there is evidence that low doses of radiation or certain chemicals give rise to adaptive responses in which the treated cells develop resistance to the effects of high doses given in subsequent exposures. Data were presented at the workshop indicating that low dose exposure of animals to radiation and some chemicals frequently reduces the spontaneous rate of mutation in vitro and tumor responses in vivo. Finally, it was concluded that considerable improvement in understanding of how genetic variation may modify the impact of these phenomena is necessary before the risk implications can be fully appreciated. The workshop participants discussed the substantive challenge that these data present with respect to simple linear methodologies that are currently used in cancer risk assessment and attempted to identify broad strategies by which these phenomena may start to be used to refine cancer risk assessment methods in the future.

  7. Applying personal genetic data to injury risk assessment in athletes.

    PubMed

    Goodlin, Gabrielle T; Roos, Andrew K; Roos, Thomas R; Hawkins, Claire; Beache, Sydney; Baur, Stephen; Kim, Stuart K

    2014-01-01

    Recent studies have identified genetic markers associated with risk for certain sports-related injuries and performance-related conditions, with the hope that these markers could be used by individual athletes to personalize their training and diet regimens. We found that we could greatly expand the knowledge base of sports genetic information by using published data originally found in health and disease studies. For example, the results from large genome-wide association studies for low bone mineral density in elderly women can be re-purposed for low bone mineral density in young endurance athletes. In total, we found 124 single-nucleotide polymorphisms associated with: anterior cruciate ligament tear, Achilles tendon injury, low bone mineral density and stress fracture, osteoarthritis, vitamin/mineral deficiencies, and sickle cell trait. Of these single nucleotide polymorphisms, 91% have not previously been used in sports genetics. We conducted a pilot program on fourteen triathletes using this expanded knowledge base of genetic variants associated with sports injury. These athletes were genotyped and educated about how their individual genetic make-up affected their personal risk profile during an hour-long personal consultation. Overall, participants were favorable of the program, found it informative, and most acted upon their genetic results. This pilot program shows that recent genetic research provides valuable information to help reduce sports injuries and to optimize nutrition. There are many genetic studies for health and disease that can be mined to provide useful information to athletes about their individual risk for relevant injuries.

  8. Applying Personal Genetic Data to Injury Risk Assessment in Athletes

    PubMed Central

    Goodlin, Gabrielle T.; Roos, Andrew K.; Roos, Thomas R.; Hawkins, Claire; Beache, Sydney; Baur, Stephen; Kim, Stuart K.

    2015-01-01

    Recent studies have identified genetic markers associated with risk for certain sports-related injuries and performance-related conditions, with the hope that these markers could be used by individual athletes to personalize their training and diet regimens. We found that we could greatly expand the knowledge base of sports genetic information by using published data originally found in health and disease studies. For example, the results from large genome-wide association studies for low bone mineral density in elderly women can be re-purposed for low bone mineral density in young endurance athletes. In total, we found 124 single-nucleotide polymorphisms associated with: anterior cruciate ligament tear, Achilles tendon injury, low bone mineral density and stress fracture, osteoarthritis, vitamin/mineral deficiencies, and sickle cell trait. Of these single nucleotide polymorphisms, 91% have not previously been used in sports genetics. We conducted a pilot program on fourteen triathletes using this expanded knowledge base of genetic variants associated with sports injury. These athletes were genotyped and educated about how their individual genetic make-up affected their personal risk profile during an hour-long personal consultation. Overall, participants were favorable of the program, found it informative, and most acted upon their genetic results. This pilot program shows that recent genetic research provides valuable information to help reduce sports injuries and to optimize nutrition. There are many genetic studies for health and disease that can be mined to provide useful information to athletes about their individual risk for relevant injuries. PMID:25919592

  9. Applying the Gender Lens to Risk Factors and Outcome after Adult Cardiac Surgery

    PubMed Central

    Eifert, Sandra; Guethoff, Sonja; Kaczmarek, Ingo; Beiras-Fernandez, Andres; Seeland, Ute; Gulbins, Helmut; Seeburger, Jörg; Deutsch, Oliver; Jungwirth, Bettina; Katsari, Elpiniki; Dohmen, Pascal; Pfannmueller, Bettina; Hultgren, Rebecka; Schade, Ina; Kublickiene, Karolina; Mohr, Friedrich W.; Gansera, Brigitte

    2014-01-01

    Summary Background Applying the gender lens to risk factors and outcome after adult cardiac surgery is of major clinical interest, as the inclusion of sex and gender in research design and analysis may guarantee more comprehensive cardiovascular science and may consecutively result in a more effective surgical treatment as well as cost savings in cardiac surgery. Methods We have reviewed classical cardiovascular risk factors (diabetes, arterial hypertension, hyperlipidemia, smoking) according to a gender-based approach. Furthermore, we have examined comorbidities such as depression, renal insufficiency, and hormonal influences in regard to gender. Gender-sensitive economic aspects have been evaluated, surgical outcome has been analyzed, and cardiovascular research has been considered from a gender perspective. Results The influence of typical risk factors and outcome after cardiac surgery has been evaluated from a gender perspective, and the gender-specific distribution of these risk factors is reported on. The named comorbidities are listed. Economic aspects demonstrated a gender gap. Outcome after coronary and valvular surgeries as well as after heart transplantation are displayed in this regard. Results after postoperative use of intra-aortic balloon pump are shown. Gender-related aspects of clinical and biomedical cardiosurgical research are reported. Conclusions Female gender has become an independent risk factor of survival after the majority of cardiosurgical procedures. Severely impaired left ventricular ejection fraction independently predicts survival in men, whereas age does in females. PMID:26288584

  10. Statistical Uncertainty Analysis Applied to Criticality Calculation

    SciTech Connect

    Hartini, Entin; Andiwijayakusuma, Dinan; Susmikanti, Mike; Nursinta, A. W.

    2010-06-22

    In this paper, we present an uncertainty methodology based on a statistical approach for assessing uncertainties in criticality prediction using the Monte Carlo method, due to uncertainties in the isotopic composition of the fuel. The methodology has been applied to criticality calculations with MCNP5, with additional stochastic input of the isotopic fuel composition. The stochastic inputs were generated using the Latin hypercube sampling method, based on the probability density function of each nuclide composition. The automatic passing of the stochastic inputs to MCNP and the repeated criticality calculations are made possible by using a Python script to link MCNP and our Latin hypercube sampling code.
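Latin hypercube sampling as described above, one stratified draw per equal-probability interval of each input's distribution, shuffled to decorrelate dimensions, can be sketched in pure Python. The nuclide fractions below are hypothetical illustrations, not values from the study:

```python
import random
from statistics import NormalDist

def latin_hypercube(n_samples, dists, seed=42):
    """Latin hypercube sample: exactly one draw falls in each of the
    n_samples equal-probability strata of every input distribution."""
    rng = random.Random(seed)
    columns = []
    for dist in dists:
        # one uniform draw inside each of the n_samples strata of [0, 1)
        u = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(u)  # break correlation between dimensions
        columns.append([dist.inv_cdf(p) for p in u])
    return list(zip(*columns))  # rows = samples, columns = input dimensions

# Hypothetical nuclide weight fractions (mean, standard deviation)
dists = [NormalDist(0.0072, 0.0001),  # e.g. a U-235 fraction
         NormalDist(0.9928, 0.0001)]  # e.g. a U-238 fraction
samples = latin_hypercube(100, dists)
```

In a workflow like the one described, a driver script would write each sampled row into a solver input deck and collect the resulting criticality estimates.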

  11. Applying texture analysis to materials engineering problems

    NASA Astrophysics Data System (ADS)

    Knorr, D. B.; Weiland, H.; Szpunar, J. A.

    1994-09-01

    Textures in materials have been studied extensively since the 1930s following the pioneering work of Wassermann.1,2 The modern era of texture measurement started in 1949 with the development of the x-ray pole figure technique for texture measurement by Schultz.3 Finally, modern texture analysis was initiated with the publication by Bunge4 and Roe5 of a mathematical method of pole figure inversion, which is now used to calculate the orientation distribution function (ODF). This article cannot summarize such an extensive body of work, but it does endeavor to provide the background necessary to understand texture analysis; it also illustrates several applications of texture.

  12. Cost Utility Analysis Applied to Library Budgeting.

    ERIC Educational Resources Information Center

    Stitleman, Leonard

    Cost Utility Analysis (CUA) is, basically, an administrative tool to be used in situations where making a choice among meaningful programs is necessary. It does not replace the administrator, but can provide a significant source of data for the decision maker. CUA can be a guide to the selection of an optimal program in terms of available funds,…

  13. Applied Spectrophotometry: Analysis of a Biochemical Mixture

    ERIC Educational Resources Information Center

    Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the…
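As a concrete illustration of the law the abstract invokes, A = ε·l·c extends to a mixture as a linear system: measuring absorbance at two wavelengths lets one solve for two component concentrations. The extinction coefficients below are made-up placeholders, purely for illustration:

```python
# Beer-Lambert-Bouguer law for a two-component mixture, measured at two
# wavelengths: A(λk) = ε1(λk)·l·c1 + ε2(λk)·l·c2  for k = 1, 2.

def mixture_concentrations(A1, A2, e11, e12, e21, e22, path_cm=1.0):
    """Solve the 2x2 linear system for concentrations c1, c2 (Cramer's rule).

    e_kj is the extinction coefficient of component j at wavelength k,
    in L·mol⁻¹·cm⁻¹; path_cm is the cuvette path length.
    """
    det = (e11 * e22 - e12 * e21) * path_cm
    c1 = (A1 * e22 - A2 * e12) / det
    c2 = (e11 * A2 - e21 * A1) / det
    return c1, c2
```

With more components (or noisy data at many wavelengths), the same idea becomes a least-squares fit rather than an exact solve.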

  14. Science, Skepticism, and Applied Behavior Analysis

    PubMed Central

    Normand, Matthew P

    2008-01-01

    Pseudoscientific claims concerning medical and psychological treatments of all varieties are commonplace. As behavior analysts, a sound skeptical approach to our science and practice is essential. The present paper offers an overview of science and skepticism and discusses the relationship of skepticism to behavior analysis, with an emphasis on the types of issues concerning behavior analysts in practice. PMID:22477687

  15. Putting problem formulation at the forefront of GMO risk analysis.

    PubMed

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups. PMID:23160540

  17. Thermal analysis applied to irradiated propolis

    NASA Astrophysics Data System (ADS)

    Matsuda, Andrea Harumi; Machado, Luci Brocardo; del Mastro, Nélida Lucia

    2002-03-01

    Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behavior. Thermal analysis is a chemical analysis that gives information about changes on heating, of great importance for technological applications. Ground propolis samples were 60Co gamma irradiated with 0 and 10 kGy. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600°C. Similarly, through differential scanning calorimetry, a coincidence of the melting points of irradiated and unirradiated samples was found. The results suggest that the irradiation process does not interfere with the thermal properties of propolis when irradiated up to 10 kGy.

  18. Sensitivity Analysis Using Risk Measures.

    PubMed

    Tsanakas, Andreas; Millossovich, Pietro

    2016-01-01

    In a quantitative model with uncertain inputs, the uncertainty of the output can be summarized by a risk measure. We propose a sensitivity analysis method based on derivatives of the output risk measure, in the direction of model inputs. This produces a global sensitivity measure, explicitly linking sensitivity and uncertainty analyses. We focus on the case of distortion risk measures, defined as weighted averages of output percentiles, and prove a representation of the sensitivity measure that can be evaluated on a Monte Carlo sample, as a weighted average of gradients over the input space. When the analytical model is unknown or hard to work with, nonparametric techniques are used for gradient estimation. This process is demonstrated through the example of a nonlinear insurance loss model. Furthermore, the proposed framework is extended in order to measure sensitivity to constant model parameters, uncertain statistical parameters, and random factors driving dependence between model inputs.
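A minimal sketch of the setting: Expected Shortfall is a distortion risk measure (output percentiles weighted uniformly on the tail), and a crude finite-difference bump gives a sensitivity to one input. This is a simplification for illustration, not the gradient-representation estimator the paper proves; the loss model and parameters are hypothetical:

```python
import random

def expected_shortfall(losses, alpha=0.9):
    """Distortion risk measure: the average of the worst (1 - alpha)
    share of losses, i.e. percentiles weighted uniformly on the tail."""
    tail = sorted(losses)[int(alpha * len(losses)):]
    return sum(tail) / len(tail)

def es_sensitivity(model, inputs, idx, eps=1e-3, alpha=0.9):
    """Finite-difference sensitivity of ES to scaling input `idx` up by
    a factor (1 + eps) -- a sketch, not the paper's estimator."""
    bumped = [[x * (1 + eps) if j == idx else x for j, x in enumerate(row)]
              for row in inputs]
    base = expected_shortfall([model(r) for r in inputs], alpha)
    return (expected_shortfall([model(r) for r in bumped], alpha) - base) / eps

# Toy nonlinear "insurance loss" model on two Gaussian risk factors
rng = random.Random(1)
inputs = [[rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)] for _ in range(2000)]
loss = lambda r: r[0] + 0.5 * r[1] ** 2        # hypothetical loss function
s0 = es_sensitivity(loss, inputs, 0)           # sensitivity to factor 0
```

The advantage of the paper's approach over this bump-and-recompute sketch is that all input sensitivities come from a single Monte Carlo sample, as a weighted average of gradients.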

  19. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  20. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that being a visco-elastic substrate, skin effectively reduces the resolution of measurement of dental detail. Conclusions indicate caution in individualization statements.

  1. Applying programmatic risk assessment to nuclear materials stabilization R and D planning

    SciTech Connect

    Kenley, C.R.; Brown-van Hoozer, S.A.

    1997-10-01

    A systems engineering approach to programmatic risk assessment, derived from the aerospace industry, was applied to various stabilization technologies to assess their relative maturity and availability for use in stabilizing nuclear materials. The assessment provided valuable information for trading off available technologies and identified the at-risk technologies that will require close tracking by the Department of Energy (DOE) to mitigate programmatic risks.

  2. Multivariate analysis applied to tomato hybrid production.

    PubMed

    Balasch, S; Nuez, F; Palomares, G; Cuartero, J

    1984-11-01

    Twenty characters were measured on 60 tomato varieties cultivated in the open air and in a polyethylene plastic-house. Data were analyzed by means of principal components, factorial discriminant methods, Mahalanobis D(2) distances and principal coordinate techniques. The factorial discriminant and Mahalanobis D(2) distance methods, both of which require collecting data plant by plant, lead to conclusions similar to those of the principal components method, which only requires taking data by plots. The characters that make up the principal components in both environments studied are the same, although the relative importance of each of them varies within the principal components. By combining the information supplied by multivariate analysis with the inheritance mode of the characters, crossings among cultivars can be planned that will produce heterotic hybrids showing characters within previously established limits.
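As a sketch of the principal components step named above (the other multivariate techniques are beyond a short example), the leading component can be found by power iteration on the sample covariance matrix; the toy data here are illustrative, not the tomato measurements:

```python
import random

def principal_component(rows, iters=200):
    """Leading principal component via power iteration on the sample
    covariance matrix -- a minimal sketch of the method."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    centered = [[r[j] - means[j] for j in range(d)] for r in rows]
    cov = [[sum(a[i] * a[j] for a in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    rng = random.Random(0)
    v = [rng.random() for _ in range(d)]       # random start vector
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]              # renormalize each step
    return v
```

Projecting each plot's character vector onto the leading components then gives the low-dimensional scores on which varieties can be compared across environments.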

  3. Toward applied behavior analysis of life aloft

    NASA Technical Reports Server (NTRS)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches render them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived function account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  4. Toward applied behavior analysis of life aloft.

    PubMed

    Brady, J V

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches render them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived function account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  6. Semantic annotation of Web data applied to risk in food.

    PubMed

    Hignette, Gaëlle; Buche, Patrice; Couvert, Olivier; Dibie-Barthélemy, Juliette; Doussot, David; Haemmerlé, Ollivier; Mettler, Eric; Soler, Lydie

    2008-11-30

    A preliminary step to risk in food assessment is the gathering of experimental data. In the framework of the Sym'Previus project (http://www.symprevius.org), a complete data integration system has been designed, grouping data provided by industrial partners and data extracted from papers published in the main scientific journals of the domain. Those data have been classified by means of a predefined vocabulary, called ontology. Our aim is to complement the database with data extracted from the Web. In the framework of the WebContent project (www.webcontent.fr), we have designed a semi-automatic acquisition tool, called @WEB, which retrieves scientific documents from the Web. During the @WEB process, data tables are extracted from the documents and then annotated with the ontology. We focus on the data tables as they contain, in general, a synthesis of data published in the documents. In this paper, we explain how the columns of the data tables are automatically annotated with data types of the ontology and how the relations represented by the table are recognised. We also give the results of our experimentation to assess the quality of such an annotation.

  7. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  8. Applying Genomic Analysis to Newborn Screening

    PubMed Central

    Solomon, B.D.; Pineda-Alvarez, D.E.; Bear, K.A.; Mullikin, J.C.; Evans, J.P.

    2012-01-01

    Large-scale genomic analysis such as whole-exome and whole-genome sequencing is becoming increasingly prevalent in the research arena. Clinically, many potential uses of this technology have been proposed. One such application is the extension or augmentation of newborn screening. In order to explore this application, we examined data from 3 children with normal newborn screens who underwent whole-exome sequencing as part of research participation. We analyzed sequence information for 151 selected genes associated with conditions ascertained by newborn screening. We compared findings with publicly available databases and results from over 500 individuals who underwent whole-exome sequencing at the same facility. Novel variants were confirmed through bidirectional dideoxynucleotide sequencing. High-density microarrays (Illumina Omni1-Quad) were also performed to detect potential copy number variations affecting these genes. We detected an average of 87 genetic variants per individual. After excluding artifacts, 96% of the variants were found to be reported in public databases and have no evidence of pathogenicity. No variants were identified that would predict disease in the tested individuals, which is in accordance with their normal newborn screens. However, we identified 6 previously reported variants and 2 novel variants that, according to published literature, could result in affected offspring if the reproductive partner were also a mutation carrier; other specific molecular findings highlight additional means by which genomic testing could augment newborn screening. PMID:23112750
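    The filtering step the abstract describes, excluding variants already reported in public databases with no evidence of pathogenicity, can be sketched as follows (the gene names, variant notations, and the database contents here are hypothetical placeholders, not the study's data):

    ```python
    # Illustrative sketch: keep only variant calls that are absent from a
    # reference set of database-reported, non-pathogenic variants.

    def novel_variants(calls, known_benign):
        """Filter out calls whose (gene, change) pair is already known."""
        return [v for v in calls if (v["gene"], v["change"]) not in known_benign]

    known = {("PAH", "c.1222C>T")}  # hypothetical database entry
    calls = [
        {"gene": "PAH", "change": "c.1222C>T"},
        {"gene": "GALT", "change": "c.100A>G"},
    ]
    print(novel_variants(calls, known))  # only the GALT variant remains
    ```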

  9. Digital photoelastic analysis applied to implant dentistry

    NASA Astrophysics Data System (ADS)

    Ramesh, K.; Hariprasad, M. P.; Bhuvanewari, S.

    2016-12-01

    Development of improved designs of implant systems in dentistry has necessitated the study of stress fields in the implant regions of the mandible/maxilla for a better understanding of the biomechanics involved. Photoelasticity has been used for various studies related to dental implants in view of its whole-field visualization of maximum shear stress in the form of isochromatic contours. The potential of digital photoelasticity has not been fully exploited in the field of implant dentistry. In this paper, the fringe field in the vicinity of the connected implants (All-On-Four® concept) is analyzed using recent advances in digital photoelasticity. Initially, a novel 3-D photoelastic model making procedure, to closely mimic all the anatomical features of the human mandible, is proposed. By choosing an appropriate orientation of the model with respect to the light path, the essential region of interest was analysed while keeping the model under live loading conditions. The need for a sophisticated software module to carefully identify the model domain is brought out. For data extraction, a five-step method is used and isochromatics are evaluated by twelve fringe photoelasticity. In addition to the isochromatic fringe field, whole-field isoclinic data is also obtained for the first time in implant dentistry, which could provide important information for improving the structural stability of the implant systems. Analysis is carried out for the implant in the molar as well as the incisor region. In addition, the interaction effects of a loaded molar implant on the incisor area are also studied.
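    The conversion from isochromatic fringe order to stress that underlies such an analysis is the standard stress-optic law; a minimal sketch (the fringe value and model thickness below are made-up example numbers, not values from the paper):

    ```python
    # Stress-optic law: the principal stress difference at a point is
    # proportional to the isochromatic fringe order N observed there.

    def principal_stress_difference(N, f_sigma, h):
        """sigma1 - sigma2 = N * f_sigma / h.

        N       -- fringe order (dimensionless)
        f_sigma -- material fringe value, e.g. N/mm per fringe
        h       -- model thickness, e.g. mm
        """
        return N * f_sigma / h

    # Hypothetical example: fringe order 2.5, f_sigma = 11 N/mm, h = 6 mm.
    print(principal_stress_difference(N=2.5, f_sigma=11.0, h=6.0))
    ```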

  10. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    NASA Technical Reports Server (NTRS)

    Motley, Albert E., III

    2000-01-01

    One of the key elements to successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases' many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  11. Tsunamis: Global Exposure and Local Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.

    2014-12-01

    The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic methods (PTHA) are used. The resulting hazard levels for both methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed population lives in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damages and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.
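    A 500-year return period, as used in the hazard analysis above, translates into an exceedance probability over a given exposure window under the usual Poisson assumption (the 50-year window below is an arbitrary example):

    ```python
    import math

    def exceedance_probability(return_period_years, exposure_years):
        """Poisson probability of at least one exceedance during the window:
        P = 1 - exp(-t / T), where T is the return period."""
        return 1.0 - math.exp(-exposure_years / return_period_years)

    # A 500-year tsunami hazard level over a 50-year exposure window:
    p = exceedance_probability(500, 50)
    print(f"{p:.3f}")  # 0.095
    ```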

  12. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.
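    The triage of scenarios into the three classes mentioned above can be sketched schematically (the decision fields and thresholds below are invented placeholders, not the paper's actual selection criteria):

    ```python
    # Illustrative routing of a hazard scenario to one of three outcomes:
    # further quantitative modeling, mitigation via controls/procedures,
    # or lowest priority for further analysis.

    def triage(scenario):
        """scenario: dict with hypothetical 'severity' and 'modelable' fields."""
        if scenario["severity"] == "high" and scenario["modelable"]:
            return "quantitative modeling"
        if scenario["severity"] in ("high", "medium"):
            return "controls/procedures"
        return "low priority"

    print(triage({"severity": "high", "modelable": True}))  # quantitative modeling
    ```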

  13. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
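    The core computation such a program performs can be sketched in a few lines: the annual rate of exceeding a ground motion level sums, over discretized magnitudes and distances, the source activity rate times the attenuation model's exceedance probability. This is an illustrative sketch, not McGuire's program; the attenuation coefficients below are invented:

    ```python
    import math

    def p_exceed(a, m, r, coeffs=(0.5, 1.0, 1.0), sigma=0.6):
        """P(A > a | magnitude m, distance r) under a hypothetical lognormal
        attenuation model: ln A ~ Normal(c0 + c1*m - c2*ln r, sigma)."""
        c0, c1, c2 = coeffs
        ln_mean = c0 + c1 * m - c2 * math.log(r)
        z = (math.log(a) - ln_mean) / sigma
        return 0.5 * math.erfc(z / math.sqrt(2))  # standard normal tail

    def annual_exceedance_rate(a, nu, mags, dists, fm, fr):
        """nu: source activity rate; fm, fr: discrete magnitude/distance pmfs."""
        return nu * sum(
            fm[i] * fr[j] * p_exceed(a, m, r)
            for i, m in enumerate(mags)
            for j, r in enumerate(dists)
        )
    ```

    Higher ground motion levels should always yield lower exceedance rates, which gives a quick sanity check on any such implementation.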

  14. Robust regression applied to fractal/multifractal analysis.

    NASA Astrophysics Data System (ADS)

    Portilla, F.; Valencia, J. L.; Tarquis, A. M.; Saa-Requejo, A.

    2012-04-01

    Fractal and multifractal concepts have grown increasingly popular in soil analysis in recent years, along with the development of fractal models. One of the common steps is to calculate the slope of a linear fit, usually by the least squares method. This would not normally pose a problem; however, in many situations with experimental data the researcher has to select the range of scales at which to work, neglecting the remaining points to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods. With this method we do not have to assume that an outlier is simply an extreme observation drawn from the tail of a normal distribution, so such points do not compromise the validity of the regression results. In this work we have evaluated the capacity of robust regression to select the points in the experimental data, trying to avoid subjective choices. Based on this analysis we have developed a new working methodology that comprises two basic steps: evaluation of the improvement in linear fit when consecutive points are eliminated, based on the p-value of R, thereby considering the implications of reducing the number of points; and evaluation of the significance of the difference between the slope fitted with the two extreme points and the slope fitted with all available points. We compare the results of applying this methodology with those of the commonly used least squares approach. The data selected for these comparisons come from an experimental soil roughness transect and from simulations based on the midpoint displacement method with added trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology. Acknowledgements: funding provided by CEIGRAM (Research Centre for the Management of Agricultural and Environmental Risks) and by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no
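    The motivation for robust regression in this setting can be demonstrated with synthetic data (not the authors' soil transects): one outlier at the end of a scaling range drags the least squares slope, while the Theil-Sen estimator, the median of all pairwise slopes, is barely affected.

    ```python
    from statistics import median

    def ols_slope(xs, ys):
        """Ordinary least squares slope."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        return num / den

    def theil_sen_slope(xs, ys):
        """Robust slope: median over all pairwise slopes."""
        slopes = [(ys[j] - ys[i]) / (xs[j] - xs[i])
                  for i in range(len(xs)) for j in range(i + 1, len(xs))]
        return median(slopes)

    xs = list(range(10))
    ys = [2.0 * x for x in xs]   # true slope 2
    ys[9] = 40.0                 # one outlier at the end of the range
    print(ols_slope(xs, ys), theil_sen_slope(xs, ys))  # 3.2 2.0
    ```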

  15. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can cause the HRSG to stop operating and, furthermore, can threaten the safety of employees. The Risk Based Inspection (RBI) guideline API 581 by the American Petroleum Institute has been used for risk analysis of HRSG 1. Using this methodology, the risk caused by unexpected failure, as a function of the probability and consequence of failure, can be estimated. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of the API 581 standard places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The damage mechanism prominent throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
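    Semi-quantitative ratings like "4C" combine a probability-of-failure category (1-5) with a consequence-of-failure category (A-E) on a risk matrix. The sketch below uses a simple additive index with invented boundaries chosen to reproduce the study's levels; API 581's actual matrix is tabulated, not additive:

    ```python
    # Hedged sketch of a semi-quantitative risk matrix in the spirit of RBI.

    def risk_level(pof, cof):
        """pof: 1 (lowest) .. 5 (highest); cof: 'A' (lowest) .. 'E' (highest)."""
        score = pof + (ord(cof) - ord("A"))  # combined index, 1..9
        if score <= 2:
            return "low"
        if score <= 5:
            return "medium"
        if score <= 7:
            return "medium-high"
        return "high"

    print(risk_level(4, "C"))  # medium-high (the HP superheater's 4C rating)
    print(risk_level(3, "C"))  # medium (the HP economizer's 3C rating)
    ```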

  16. Modeling environmental and human health risks of veterinary medicinal products applied in pond aquaculture.

    PubMed

    Rico, Andreu; Geng, Yue; Focks, Andreas; Van den Brink, Paul J

    2013-04-01

    A model called ERA-AQUA was developed to assess the risks posed by the use of veterinary medicinal products (VMPs) applied in aquaculture ponds for the targeted produce, surrounding aquatic ecosystems, consumers, and trade of the aquaculture produce. The model calculates risks by following a risk quotient approach, calculating predicted exposure concentrations (exposure assessment) and predicted no-effect concentrations (effect assessment) for the endpoint under study. The exposure assessment is performed by combining information on the environmental characteristics of the aquaculture pond, characteristics of the cultured species, aquaculture management practices, and physicochemical properties of the compound under study. The model predicts concentrations of VMPs in the pond water, pond sediment, cultured species, and watercourse receiving pond effluent discharges by mass balance equations. The effect assessment is performed by combining (eco)toxicological information and food safety threshold concentrations for the studied compound. In the present study, the scientific background, strengths, and limitations of the ERA-AQUA model are presented together with a sensitivity analysis and an example showing its potential applications.
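    The risk quotient approach the abstract names is simple enough to state directly: a risk is flagged when the predicted exposure concentration reaches the predicted no-effect concentration. The concentrations below are made-up illustrative numbers, not ERA-AQUA output:

    ```python
    # Minimal risk-quotient sketch: RQ = PEC / PNEC, with RQ >= 1 indicating risk.

    def risk_quotient(pec, pnec):
        """pec: predicted exposure conc.; pnec: predicted no-effect conc."""
        return pec / pnec

    pec_water = 12.0   # hypothetical ug/L in the receiving watercourse
    pnec_water = 4.0   # hypothetical ug/L
    rq = risk_quotient(pec_water, pnec_water)
    print(rq, "risk" if rq >= 1 else "no appreciable risk")  # 3.0 risk
    ```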

  17. [The management of risks by the global risk analysis].

    PubMed

    Desroches, A

    2013-05-01

    After a reminder of the fundamental concepts of risk management, the author describes the global risk analysis (AGR), the name given by the author to an updated APR method which, after several changes to the initial process, aims to cover a broader scope of analysis and management of both structural and business risks of any kind, throughout the system development life cycle, from the feasibility study to dismantling.

  18. Negative reinforcement in applied behavior analysis: an emerging technology.

    PubMed

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas suggests the emergence of an applied technology on negative reinforcement.

  19. Negative Reinforcement in Applied Behavior Analysis: An Emerging Technology.

    ERIC Educational Resources Information Center

    Iwata, Brian A.

    1987-01-01

    The article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. Current research suggests the emergence of an applied technology on negative reinforcement.…

  20. Animal Research in the "Journal of Applied Behavior Analysis"

    ERIC Educational Resources Information Center

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  1. B. F. Skinner's contributions to applied behavior analysis

    PubMed Central

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew from his science for application, his descriptions of possible applications, and his own applications to nonhuman and human behavior. Second, we found that he explicitly or implicitly addressed all seven dimensions of applied behavior analysis. These contributions and the dimensions notwithstanding, he neither incorporated the field's scientific (e.g., analytic) and social dimensions (e.g., applied) into any program of published research such that he was its originator, nor did he systematically integrate, advance, and promote the dimensions so as to have been its founder. As the founder of behavior analysis, however, he was the father of applied behavior analysis. PMID:22478444

  2. RAMS (Risk Analysis - Modular System) methodology

    SciTech Connect

    Stenner, R.D.; Strenge, D.L.; Buck, J.W.

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and "what if" questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.
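    The roll-up idea mentioned at the end can be illustrated with a toy aggregation (the contaminant names and risk figures are invented; simply summing risk contributions is itself a simplification that is only reasonable for small, independent contributions):

    ```python
    # Illustrative roll-up: detailed risk estimates keyed by
    # (contaminant, pathway) aggregate along one dimension to a
    # high-level figure for strategy decisions.

    from collections import defaultdict

    detailed = {  # hypothetical detailed risk contributions
        ("Cs-137", "groundwater"): 0.8,
        ("Cs-137", "air"): 0.1,
        ("Sr-90", "groundwater"): 0.4,
    }

    def roll_up(risks, by):
        """Aggregate detailed estimates along key dimension `by` (0 or 1)."""
        out = defaultdict(float)
        for key, r in risks.items():
            out[key[by]] += r
        return dict(out)

    print(roll_up(detailed, by=0))  # per-contaminant totals
    ```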

  3. Starlink corn: a risk analysis.

    PubMed Central

    Bucchini, Luca; Goldman, Lynn R

    2002-01-01

    Modern biotechnology has dramatically increased our ability to alter the agronomic traits of plants. Among the novel traits that biotechnology has made available, an important group includes Bacillus thuringiensis-derived insect resistance. This technology has been applied to potatoes, cotton, and corn. Benefits of Bt crops, and biotechnology generally, can be realized only if risks are assessed and managed properly. The case of Starlink corn, a plant modified with a gene that encodes the Bt protein Cry9c, was a severe test of U.S. regulatory agencies. The U.S. Environmental Protection Agency had restricted its use to animal feed due to concern about the potential for allergenicity. However, Starlink corn was later found throughout the human food supply, resulting in food recalls by the Food and Drug Administration and significant disruption of the food supply. Here we examine the regulatory history of Starlink, the assessment framework employed by the U.S. government, assumptions and information gaps, and the key elements of government efforts to manage the product. We explore the impacts on regulations, science, and society and conclude that only significant advances in our understanding of food allergies and improvements in monitoring and enforcement will avoid similar events in the future. Specifically, we need to develop a stronger fundamental basis for predicting allergic sensitization and reactions if novel proteins are to be introduced in this fashion. Mechanisms are needed to assure that worker and community aeroallergen risks are considered. Requirements are needed for the development of valid assays so that enforcement and post market surveillance activities can be conducted. PMID:11781159

  4. The possibilities of applying a risk-oriented approach to the NPP reliability and safety enhancement problem

    NASA Astrophysics Data System (ADS)

    Komarov, Yu. A.

    2014-10-01

    An analysis and some generalizations of approaches to risk assessment are presented. The interconnection between different interpretations of the "risk" notion is shown, and the possibility of applying fuzzy set theory to risk assessments is demonstrated. A generalized formulation of the risk assessment notion is proposed for applying risk-oriented approaches to the problem of enhancing reliability and safety in nuclear power engineering. The solution of problems using the developed risk-oriented approaches, aimed at achieving more reliable and safe operation of NPPs, is described. The results of studies aimed at determining the need (advisability) to modernize or replace NPP elements and systems are presented, together with the results obtained from elaborating the methodical principles of introducing a repair concept based on the equipment's technical state. The possibility of reducing the scope of tests and altering the NPP systems maintenance strategy is substantiated using the risk-oriented approach. A probabilistic model for estimating the validity of boric acid concentration measurements is developed.

  5. Applying programmatic risk assessment to nuclear materials stabilization R and D planning

    SciTech Connect

    Brown-Van Hoozer, S.A.; Kenley, C.R.

    1997-10-01

    A systems engineering approach to programmatic risk assessment, derived from the aerospace industry, was applied to various stabilization technologies to assess their relative maturity and availability for use in stabilizing nuclear materials. The assessment provided valuable information for trading off available technologies and identified the at-risk technologies that will require close tracking by the Department of Energy (DOE) to mitigate programmatic risks. This paper presents the programmatic risk assessment methodology developed for the 1995 R and D Plan and updated for the 1996 R and D Plan. Results of the 1996 assessment also are presented (DOE/ID-10561, 1996).

  6. The Significance of Regional Analysis in Applied Geography.

    ERIC Educational Resources Information Center

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  7. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  8. Negative reinforcement in applied behavior analysis: an emerging technology.

    PubMed Central

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas suggests the emergence of an applied technology on negative reinforcement. PMID:3323157

  9. Seismic risk assessment as applied to the Zion Nuclear Generating Station

    SciTech Connect

    Wells, J.

    1984-08-01

    To assist the US Nuclear Regulatory Commission (NRC) in its licensing and evaluation role, the NRC funded the Seismic Safety Margins Research Program (SSMRP) at Lawrence Livermore National Laboratory (LLNL) with the goal of developing tools and data bases to evaluate the risk of earthquake caused radioactive release from a commercial nuclear power plant. This paper describes the SSMRP risk assessment methodology and the results generated by applying this methodology to the Zion Nuclear Generating Station. In addition to describing the failure probabilities and risk values, the effects of assumptions about plant configuration, plant operation, and dependence will be given.

  10. Analytic concepts for assessing risk as applied to human space flight

    SciTech Connect

    Garrick, B.J.

    1997-04-30

    Quantitative risk assessment (QRA) principles provide an effective framework for quantifying individual elements of risk, including the risk to astronauts and spacecraft of the radiation environment of space flight. The concept of QRA is based on a structured set of scenarios that could lead to different damage states initiated by either hardware failure, human error, or external events. In the context of a spacecraft risk assessment, radiation may be considered as an external event and analyzed in the same basic way as any other contributor to risk. It is possible to turn up the microscope on any particular contributor to risk and ask more detailed questions than might be necessary to simply assess safety. The methods of QRA allow for as much fine structure in the analysis as is desired. For the purpose of developing a basis for comprehensive risk management and considering the tendency to "fear anything nuclear," radiation risk is a prime candidate for examination beyond that necessary to answer the basic question of risk. Thus, rather than considering only the customary damage states of fatalities or loss of a spacecraft, it is suggested that the full range of damage be analyzed to quantify radiation risk. Radiation dose levels in the form of a risk curve accomplish such a result. If the risk curve is the complementary cumulative distribution function, then it answers the extended question of what is the likelihood of receiving a specific dose of radiation or greater. Such results can be converted to specific health effects as desired. Knowing the full range of the radiation risk of a space mission and the contributors to that risk provides the information necessary to take risk management actions [operational, design, scheduling of missions around solar particle events (SPE), etc.] that clearly control radiation exposure.
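    The risk curve described above, the complementary cumulative distribution function over simulated mission doses, answers exactly the question "what is the likelihood of receiving dose d or greater?". A minimal empirical sketch (the dose samples are synthetic illustration values):

    ```python
    # Empirical CCDF over sampled mission doses.

    def ccdf(samples, d):
        """Fraction of sampled outcomes with dose >= d."""
        return sum(1 for s in samples if s >= d) / len(samples)

    doses = [0.1, 0.2, 0.2, 0.5, 1.0, 2.5, 0.3, 0.4]  # hypothetical Sv samples
    print(ccdf(doses, 0.5))  # 0.375
    ```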

  11. Applied behavior analysis: New directions from the laboratory

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1983-01-01

    Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior which occurs as a by-product of contingencies of reinforcement is discussed. Possible difficulties in treatment and management of induced behaviors are considered. Next, the correlation-based law of effect and the implications of relative reinforcement are explored in terms of applied examples. Relative rate of reinforcement is then extended to the literature dealing with concurrent operants. Concurrent operant models may describe human behavior of applied importance, and several techniques for modification of problem behavior are suggested. As a final concern, the paper discusses several new paradigms. While the practical importance of these models is not clear at the moment, it may be that new practical advantages will soon arise. Thus, it is argued that basic research continues to be of theoretical and practical importance to applied behavior analysis. PMID:22478574

  12. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Wang, Tee-See; Griffin, Lisa; Turner, James E. (Technical Monitor)

    2001-01-01

    This document is a presentation graphic which reviews the activities of the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center (i.e., Code TD64). The work of this group focused on supporting the space transportation programs. The work of the group is in Computational Fluid Dynamic tool development. This development is driven by hardware design needs. The major applications for the design and analysis tools are: turbines, pumps, propulsion-to-airframe integration, and combustion devices.

  13. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2002-01-01

    This viewgraph report presents an overview of activities and accomplishments of NASA's Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group. Expertise in this group focuses on high-fidelity fluids design and analysis with application to space shuttle propulsion and next generation launch technologies. Topics covered include: computational fluid dynamics research and goals, turbomachinery research and activities, nozzle research and activities, combustion devices, engine systems, MDA development and CFD process improvements.

  14. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Chueng, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. The paper demonstrates a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.
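    The tradeoff described above can be sketched in miniature with a Monte Carlo simulation: buying more schedule margin (the efficiency cost) lowers the probability of overrun (the risk). The three-task plan, the triangular duration spreads, and the 30-day deadline below are invented for illustration, not figures from the paper.

    ```python
    import random

    def simulate_plan(margin_days, n_trials=20_000, seed=42):
        """Estimate the probability that a hypothetical 3-task plan overruns
        its deadline, as a function of the schedule margin purchased."""
        rng = random.Random(seed)
        base_deadline = 30.0  # nominal plan length in days (assumed)
        overruns = 0
        for _ in range(n_trials):
            # task durations are uncertain; triangular(low, high, mode) spreads assumed
            total = sum(rng.triangular(8, 14, 10) for _ in range(3))
            if total > base_deadline + margin_days:
                overruns += 1
        return overruns / n_trials

    # Risk/efficiency tradeoff: more margin (cost) => lower overrun risk.
    risks = {m: simulate_plan(m) for m in (0, 2, 4)}
    ```

    Plotting `risks` against margin gives the kind of risk-versus-efficiency frontier a planner could use to make a forward-looking choice.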

  15. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  16. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and...

  17. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and...

  18. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and...

  19. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and...

  20. Carbon Fiber Risk Analysis. [conference

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The scope and status of the effort to assess the risks associated with the accidental release of carbon/graphite fibers from civil aircraft is presented. Vulnerability of electrical and electronic equipment to carbon fibers, dispersal of carbon fibers, effectiveness of filtering systems, impact of fiber induced failures, and risk methodology are among the topics covered.

  1. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  2. Progressive-Ratio Schedules and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  3. Context, Cognition, and Biology in Applied Behavior Analysis.

    ERIC Educational Resources Information Center

    Morris, Edward K.

    Behavior analysts are having their professional identities challenged by the roles that cognition and biology are said to play in the conduct and outcome of applied behavior analysis and behavior therapy. For cogniphiliacs, cognition and biology are central to their interventions because cognition and biology are said to reflect various processes,…

  4. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2004-01-01

    This paper presents viewgraphs on NASA Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group Activities. The topics include: 1) Status of programs at MSFC; 2) Fluid Mechanics at MSFC; 3) Relevant Fluid Dynamics Activities at MSFC; and 4) Shuttle Return to Flight.

  5. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
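    The interval estimate the abstract describes is usually built from a combined standard uncertainty. As a hedged sketch, assuming a PV module efficiency computed as eta = P / (G * A) and treating the inputs as uncorrelated, the relative standard uncertainties combine in quadrature; all of the numeric values below are invented for illustration, not measurements from the presentation.

    ```python
    import math

    # Illustrative propagation of measurement uncertainty for a PV module
    # efficiency eta = P / (G * A); values and uncertainties are assumed.
    P, u_P = 250.0, 2.0      # electrical power [W] and its standard uncertainty
    G, u_G = 1000.0, 15.0    # irradiance [W/m^2] and its standard uncertainty
    A, u_A = 1.6, 0.005      # module area [m^2] and its standard uncertainty

    eta = P / (G * A)
    # for a pure product/quotient of uncorrelated inputs, relative
    # uncertainties combine in quadrature
    rel_u = math.sqrt((u_P / P) ** 2 + (u_G / G) ** 2 + (u_A / A) ** 2)
    u_eta = eta * rel_u      # combined standard uncertainty of the efficiency
    ```

    The reported interval would then be `eta` plus or minus an expanded uncertainty (for example, 2 times `u_eta` for roughly 95% coverage).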

  6. Predicting pathogen transport and risk of infection from land-applied biosolids

    NASA Astrophysics Data System (ADS)

    Olson, M. S.; Teng, J.; Kumar, A.; Gurian, P.

    2011-12-01

    Biosolids have been recycled as fertilizer to sustainably improve and maintain productive soils and to stimulate plant growth for over forty years, but may contain low levels of microbial pathogens. The Spreadsheet Microbial Assessment of Risk: Tool for Biosolids ("SMART Biosolids") is an environmental transport, exposure and risk model that compiles knowledge on the occurrence, environmental dispersion and attenuation of biosolids-associated pathogens to estimate microbial risk from biosolids land application. The SMART Biosolids model calculates environmental pathogen concentrations and assesses risk associated with exposure to pathogens from land-applied biosolids through five pathways: 1) inhalation of aerosols from land application sites, 2) consumption of groundwater contaminated by land-applied biosolids, 3) direct ingestion of biosolids-amended soils, 4) ingestion of plants contaminated by land-applied biosolids, and 5) consumption of surface water contaminated by runoff from a land application site. The SMART Biosolids model can be applied under a variety of scenarios, thereby providing insight into effective management practices. This study presents example results of the SMART Biosolids model, focusing on the groundwater and surface water pathways, following biosolids application to a typical site in Michigan. Volumes of infiltration and surface water runoff are calculated following a 100-year storm event. Pathogen transport and attenuation through the subsurface and via surface runoff are modeled, and pathogen concentrations in a downstream well and an adjacent pond are calculated. Risks are calculated for residents of nearby properties. 
For a 100-year storm event occurring immediately after biosolids application, the surface water pathway produces risks that may be of some concern, but best estimates do not exceed the bounds of what has been considered acceptable risk for recreational water use (Table 1); groundwater risks are very uncertain and at the

  7. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  8. Characterization of anomalies by applying methods of fractal analysis

    SciTech Connect

    Sakuma, M.; Kozma, R.; Kitamura, M.

    1996-01-01

    Fractal analysis is applied in a variety of research fields to characterize nonstationary data. Here, fractal analysis is used as a tool of characterization in time series. The fractal dimension is calculated by Higuchi's method, and the effect of small data size on accuracy is studied in detail. Three types of fractal-based anomaly indicators are adopted: (a) the fractal dimension, (b) the error of the fractal dimension, and (c) the chi-square value of the linear fitting of the fractal curve in the wave number domain. Fractal features of time series can be characterized by introducing these three measures. The proposed method is applied to various simulated fractal time series with ramp, random, and periodic noise anomalies and also to neutron detector signals acquired in a nuclear reactor. Fractal characterization can successfully supplement conventional signal analysis methods especially if nonstationary and non-Gaussian features of the signal become important.
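    Higuchi's method, named in the abstract, is compact enough to sketch. The implementation below follows the common formulation of the estimator; the `kmax` default and normalization details are our choices, not necessarily those used in the paper.

    ```python
    import numpy as np

    def higuchi_fd(x, kmax=8):
        """Estimate the fractal dimension of a time series by Higuchi's method."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        lk = []
        for k in range(1, kmax + 1):
            lengths = []
            for m in range(k):
                idx = np.arange(m, n, k)               # subsample x[m], x[m+k], ...
                if len(idx) < 2:
                    continue
                dist = np.abs(np.diff(x[idx])).sum()   # curve length of the subsample
                norm = (n - 1) / ((len(idx) - 1) * k)  # correction for unequal lengths
                lengths.append(dist * norm / k)
            lk.append(np.mean(lengths))
        # the fractal dimension is the slope of log L(k) against log(1/k)
        slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lk), 1)
        return slope
    ```

    As a sanity check, a straight-line ramp yields a dimension near 1, while white noise yields a dimension near 2, consistent with the ramp and random anomaly classes the paper simulates.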

  9. Carbon Fiber Risk Analysis: Conclusions

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    It was concluded that preliminary estimates indicate that the public risk due to accidental release of carbon fiber from air transport aircraft is small. It was also concluded that further work is required to increase confidence in these estimates.

  10. Recent reinforcement-schedule research and applied behavior analysis

    PubMed Central

    Lattal, Kennon A.; Neef, Nancy A.

    1996-01-01

    Reinforcement schedules are considered in relation to applied behavior analysis by examining several recent laboratory experiments with humans and other animals. The experiments are drawn from three areas of contemporary schedule research: behavioral history effects on schedule performance, the role of instructions in schedule performance of humans, and dynamic schedules of reinforcement. All of the experiments are discussed in relation to the role of behavioral history in current schedule performance. The paper concludes by extracting from the experiments some more general issues concerning reinforcement schedules in applied research and practice. PMID:16795888

  11. Applying Social Psychological Models to Predicting HIV-Related Sexual Risk Behaviors Among African Americans.

    PubMed

    Cochran, Susan D; Mays, Vickie M

    1993-05-01

    Existing models of attitude-behavior relationships, including the Health Belief Model, the Theory of Reasoned Action, and the Self-Efficacy Theory, are increasingly being used by psychologists to predict human immunodeficiency virus (HIV)-related risk behaviors. The authors briefly highlight some of the difficulties that might arise in applying these models to predicting the risk behaviors of African Americans. These social psychological models tend to emphasize the importance of individualistic, direct control of behavioral choices and deemphasize factors, such as racism and poverty, particularly relevant to that segment of the African American population most at risk for HIV infection. Applications of these models without taking into account the unique issues associated with behavioral choices within the African American community may fail to capture the relevant determinants of risk behaviors.

  12. Bridging the two cultures of risk analysis

    SciTech Connect

    Jasanoff, S. )

    1993-04-01

    During the past 15 years, risk analysis has come of age as an interdisciplinary field of remarkable breadth, nurturing connections among fields as diverse as mathematics, biostatistics, toxicology, and engineering on one hand, and law, psychology, sociology, and economics on the other hand. In this editorial, the author addresses the question: What has the presence of social scientists in the network meant to the substantive development of the field of risk analysis? The answers offered here discuss the substantial progress in bridging the two cultures of risk analysis. Emphasis is placed on the continual need for monitoring risk analysis. Topics include: the micro-worlds of risk assessment; constraining assumptions; and exchange programs. 14 refs.

  13. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
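    The abstract's "identification trees" match threats against elements of a software design model. The toy sketch below illustrates that matching step only; the element fields, rule shapes, and threat names are invented for illustration and are not AutSEC's actual data structures.

    ```python
    # Toy sketch of threat identification over a data-flow-diagram model:
    # a rule flags a threat when all of its conditions hold on an element.
    # Fields and threat names here are hypothetical, not AutSEC's.
    RULES = [
        {"threat": "eavesdropping", "when": {"kind": "flow", "encrypted": False}},
        {"threat": "tampering", "when": {"kind": "store", "integrity_check": False}},
    ]

    def identify_threats(elements):
        """Return (element name, threat) pairs for every rule whose
        conditions all hold on an element of the design model."""
        found = []
        for el in elements:
            for rule in RULES:
                if all(el.get(k) == v for k, v in rule["when"].items()):
                    found.append((el["name"], rule["threat"]))
        return found

    dfd = [
        {"name": "login-flow", "kind": "flow", "encrypted": False},
        {"name": "user-db", "kind": "store", "integrity_check": True},
    ]
    ```

    A companion set of mitigation rules, keyed by threat, would play the role of the paper's mitigation trees.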

  14. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  15. Initial Risk Analysis and Decision Making Framework

    SciTech Connect

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
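    The propagation-of-uncertainty step the abstract describes can be sketched as a Monte Carlo loop: draw the technical and financial risk factors, compute a net return per draw, and report expected profitability and its variability. Every distribution and dollar figure below is an illustrative assumption, not CCSI prototype data.

    ```python
    import random
    import statistics

    def net_return(rng):
        """One Monte Carlo draw of annual net return for a hypothetical
        carbon-capture retrofit; all figures are illustrative assumptions."""
        capture_eff = rng.uniform(0.85, 0.95)         # technical risk factor
        capital_cost = rng.gauss(500.0, 50.0)         # $M, enterprise risk factor
        carbon_price = rng.uniform(20.0, 60.0)        # $/tonne, market uncertainty
        revenue = capture_eff * carbon_price * 10.0   # toy revenue model, $M
        return revenue - 0.1 * capital_cost           # annualized net return, $M

    rng = random.Random(0)
    draws = [net_return(rng) for _ in range(10_000)]
    expected = statistics.fmean(draws)  # expected profitability
    risk = statistics.stdev(draws)      # variability in net returns as the risk measure
    ```

    Measuring risk as the standard deviation of net returns mirrors the prototype's framing of risk as variability in expected returns; a fuller model would add the additional risk factors the abstract calls for.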

  16. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    PubMed

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015 The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists. PMID:26853375

  17. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  18. Conference report: summary of the 2010 Applied Pharmaceutical Analysis Conference.

    PubMed

    Unger, Steve E

    2011-01-01

    This year, the Applied Pharmaceutical Analysis meeting changed its venue to the Grand Tremont Hotel in Baltimore, MD, USA. Proximity to Washington presented the opportunity to have four speakers from the US FDA. The purpose of the 4-day conference is to provide a forum in which pharmaceutical and CRO scientists can discuss and develop best practices for scientific challenges in bioanalysis and drug metabolism. This year's theme was 'Bioanalytical and Biotransformation Challenges in Meeting Global Regulatory Expectations & New Technologies for Drug Discovery Challenges'. Applied Pharmaceutical Analysis continued its tradition of highlighting new technologies and its impact on drug discovery, drug metabolism and small molecule-regulated bioanalysis. This year, the meeting included an integrated focus on metabolism in drug discovery and development. Middle and large molecule (biotherapeutics) drug development, immunoassay, immunogenicity and biomarkers were also integrated into the forum. Applied Pharmaceutical Analysis offered an enhanced diversity of topics this year while continuing to share experiences of discovering and developing new medicines. PMID:21175361

  19. Risk analysis approach. [of carbon fiber release

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    The assessment of the carbon fiber hazard is outlined. Program objectives, requirements of the risk analysis, and elements associated with the physical phenomena of the accidental release are described.

  20. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... affected; (5) Ease of logical data access to the lost, stolen or improperly accessed data in light of...

  1. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection

    PubMed Central

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658
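    The core QMRA calculation for an inhalation route can be sketched in a few lines: an exposure model turns an air concentration into a dose, and a dose-response model turns the dose into an infection probability. The exponential dose-response form is standard in QMRA, but every number below (concentration, breathing rate, exposure time, and the parameter `r`) is an assumed placeholder, not a value from this study.

    ```python
    import math

    # Minimal QMRA sketch: inhalation exposure to an airborne virus followed
    # by an exponential dose-response model. All numbers are assumptions.
    conc_air = 0.5     # infectious units per m^3 of air (assumed)
    inhalation = 1.5   # m^3 of air inhaled per hour, light activity (assumed)
    exposure_h = 0.25  # time spent in the setting, hours (assumed)
    r = 0.4            # exponential dose-response parameter (assumed)

    dose = conc_air * inhalation * exposure_h  # inhaled dose per exposure event
    p_infection = 1.0 - math.exp(-r * dose)    # exponential dose-response model
    ```

    Running this per setting, with the setting's measured concentration and typical exposure time, reproduces the kind of ranking the study reports (toilets versus treatment plants versus landfills).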

  2. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection.

    PubMed

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658

  3. Pathogen risk assessment of land applied wastewater and biosolids: A fuzzy set approach

    SciTech Connect

    Dahab, M.F.; Fuerhacker, M.; Zibuschka, F.

    1998-07-01

    There are major concerns associated with land application of wastewater and biosolids, including the potential risk to public health from water-borne pollutants that may enter the food chain and from pathogens that may be present in the wastewater. These risks are of particular concern when wastewater is applied to land where crops are grown as part of the human food chain or when direct human contact with the wastewater may occur. In many communities, toxic chemicals may not be present in the biosolids, or their concentrations may be reduced through source control measures. However, pathogens that enter wastewater from infected individuals cannot be controlled at the source and are often found in wastewater or biosolids applied to land. Public health officials have emphasized that microbial pathogens (or pathogen indicators) should not occur in areas where exposure to humans is likely. Under these criteria, the concept of risk assessment, which requires the characterization of the occurrence of pathogens, almost seems contradictory to basic public health goals. As the understanding of pathogen and pathogen indicator occurrence becomes better refined, the arguments for finding practical application of risk assessment for pathogenic organisms become more compelling.

  4. Probabilistic risk assessment of veterinary medicines applied to four major aquaculture species produced in Asia.

    PubMed

    Rico, Andreu; Van den Brink, Paul J

    2014-01-15

    Aquaculture production constitutes one of the main sources of pollution with veterinary medicines into the environment. About 90% of the global aquaculture production is produced in Asia and the potential environmental risks associated with the use of veterinary medicines in Asian aquaculture have not yet been properly evaluated. In this study we performed a probabilistic risk assessment for eight different aquaculture production scenarios in Asia by combining up-to-date information on the use of veterinary medicines and aquaculture production characteristics. The ERA-AQUA model was used to perform mass balances of veterinary medicinal treatments applied to aquaculture ponds and to characterize risks for primary producers, invertebrates, and fish potentially exposed to chemical residues through aquaculture effluents. The mass balance calculations showed that, on average, about 25% of the applied drug mass to aquaculture ponds is released into the environment, although this percentage varies with the chemical's properties, the mode of application, the cultured species density, and the water exchange rates in the aquaculture pond scenario. In general, the highest potential environmental risks were calculated for parasitic treatments, followed by disinfection and antibiotic treatments. Pangasius catfish production in Vietnam, followed by shrimp production in China, constitute possible hot-spots for environmental pollution due to the intensity of the aquaculture production and considerable discharge of toxic chemical residues into surrounding aquatic ecosystems. A risk-based ranking of compounds is provided for each of the evaluated scenarios, which offers crucial information for conducting further chemical and biological field and laboratory monitoring research. In addition, we discuss general knowledge gaps and research priorities for performing refined risk assessments of aquaculture medicines in the near future.
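    The mass balance underlying the "about 25% released" figure can be sketched as a simple fraction accounting over the pond. The loss fractions below are invented placeholders chosen to reproduce that average, not parameters of the ERA-AQUA model itself.

    ```python
    # Back-of-the-envelope pond mass balance for an applied veterinary drug,
    # in the spirit of the ERA-AQUA approach; every fraction is assumed.
    applied_mg = 1000.0    # drug mass applied to the pond
    f_degraded = 0.50      # fraction lost to degradation in the pond (assumed)
    f_fish_uptake = 0.25   # fraction taken up by the cultured species (assumed)
    f_released = 1.0 - f_degraded - f_fish_uptake  # fraction leaving via effluent

    released_mg = applied_mg * f_released  # mass discharged to the environment
    ```

    In the full model these fractions vary with the chemical's properties, the mode of application, stocking density, and water exchange rate, which is why the released share differs across the eight scenarios.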

  5. Adversarial risk analysis for counterterrorism modeling.

    PubMed

    Rios, Jesus; Rios Insua, David

    2012-05-01

    Recent large-scale terrorist attacks have raised interest in models for resource allocation against terrorist threats. The unifying theme in this area is the need to develop methods for the analysis of allocation decisions when risks stem from the intentional actions of intelligent adversaries. Most approaches to these problems have a game-theoretic flavor although there are also several interesting decision-analytic-based proposals. One of them is the recently introduced framework for adversarial risk analysis, which deals with decision-making problems that involve intelligent opponents and uncertain outcomes. We explore how adversarial risk analysis addresses some standard counterterrorism models: simultaneous defend-attack models, sequential defend-attack-defend models, and sequential defend-attack models with private information. For each model, we first assess critically what would be a typical game-theoretic approach and then provide the corresponding solution proposed by the adversarial risk analysis framework, emphasizing how to coherently assess a predictive probability model of the adversary's actions, in a context in which we aim at supporting decisions of a defender versus an attacker. This illustrates the application of adversarial risk analysis to basic counterterrorism models that may be used as basic building blocks for more complex risk analysis of counterterrorism problems. PMID:22150163
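    The decision-analytic core of adversarial risk analysis, in miniature: rather than solving a game, the defender coherently assesses a predictive probability distribution over the attacker's actions and picks the defense minimizing expected loss. The action names, probabilities, and losses below are invented for illustration.

    ```python
    # Defender's predictive model of the attacker's action (assumed numbers)
    p_attack = {"cyber": 0.6, "physical": 0.4}

    # loss[defense][attack]: consequence to the defender (assumed numbers)
    loss = {
        "harden-network": {"cyber": 10.0, "physical": 80.0},
        "guard-site": {"cyber": 70.0, "physical": 15.0},
    }

    def expected_loss(defense):
        """Expected loss of a defense under the predictive attack model."""
        return sum(p * loss[defense][a] for a, p in p_attack.items())

    best = min(loss, key=expected_loss)  # defense with minimal expected loss
    ```

    The sequential defend-attack-defend and private-information models in the paper elaborate this same pattern with richer predictive models of the adversary.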

  6. Fuzzy risk analysis for nuclear safeguards

    SciTech Connect

    Zardecki, A.

    1993-05-01

    Analysis of a safeguards system, based on the notion of fuzzy sets and linguistic variables, addresses concerns such as complexity and inherent imprecision in estimating the possibility of loss or compromise. The automated risk analysis allows the risk to be determined for an entire system based on estimates for the lowest-level components and the component proportion. In addition, for each component (asset) the most effective combination of protection mechanisms against a given set of threats is determined. A distinction between bare and featured risk is made.
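    A minimal sketch of rolling linguistic component ratings up into a system risk (triangular fuzzy numbers, proportion-weighted aggregation, and centroid defuzzification are illustrative choices here, not necessarily the paper's exact scheme):

```python
# Toy fuzzy risk roll-up: linguistic ratings map to triangular fuzzy numbers
# (a, b, c); component risks combine via weighted sums of the vertices, using
# component proportions as weights, and are defuzzified by the centroid.
# The linguistic scale and the example weights are invented.
SCALE = {
    "low":    (0.0, 0.1, 0.3),
    "medium": (0.2, 0.5, 0.8),
    "high":   (0.7, 0.9, 1.0),
}

def system_risk(components):
    """components: list of (linguistic_rating, proportion) pairs."""
    total = sum(w for _, w in components)
    agg = [sum(w * SCALE[r][i] for r, w in components) / total for i in range(3)]
    return sum(agg) / 3.0  # centroid of the aggregated triangular number

risk = system_risk([("low", 0.5), ("medium", 0.3), ("high", 0.2)])
print(round(risk, 3))
```

    The appeal of the fuzzy formulation is that lowest-level estimates can stay linguistic ("low", "high") while still producing a crisp system-level number.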

  8. Fire Risk Implications in Safety Analysis Reports

    SciTech Connect

    Blanchard, A.

    1999-03-31

    Fire can be a significant risk for facilities that store and handle radiological material. Such events must be evaluated as part of a comprehensive safety analysis. SRS has been developing methods to evaluate radiological fire risk in such facilities. These methods, combined with the analysis techniques proposed by DOE-STD-3009-94, have provided a better understanding of how fire risks in nuclear facilities should be managed. To ensure that these new insights are properly disseminated, the DOE Savannah River Office and the Defense Nuclear Facilities Safety Board (DNFSB) requested that Westinghouse Savannah River Company (WSRC) prepare this paper.

  9. Activity anorexia: An interplay between basic and applied behavior analysis

    PubMed Central

    Pierce, W. David; Epling, W. Frank; Dews, Peter B.; Estes, William K.; Morse, William H.; Van Orman, Willard; Herrnstein, Richard J.

    1994-01-01

    The relationship between basic research with nonhumans and applied behavior analysis is illustrated by our work on activity anorexia. When rats are fed one meal a day and allowed to run on an activity wheel, they run excessively, stop eating, and die of starvation. Convergent evidence, from several different research areas, indicates that the behavior of these animals and humans who self-starve is functionally similar. A biobehavioral theory of activity anorexia is presented that details the cultural contingencies, behavioral processes, and physiology of anorexia. Diagnostic criteria and a three-stage treatment program for activity-based anorexia are outlined. The animal model permits basic research on anorexia that for practical and ethical reasons cannot be conducted with humans. Thus, basic research can have applied importance. PMID:22478169

  10. Synchronisation and coupling analysis: applied cardiovascular physics in sleep medicine.

    PubMed

    Wessel, Niels; Riedl, Maik; Kramer, Jan; Muller, Andreas; Penzel, Thomas; Kurths, Jurgen

    2013-01-01

    Sleep is a physiological process with an internal program of a number of well-defined sleep stages and intermediate wakefulness periods. The sleep stages modulate the autonomous nervous system, and thereby each sleep stage is accompanied by a different regulation regime for the cardiovascular and respiratory systems. These differences in regulation can be distinguished by new techniques of cardiovascular physics. The number of patients suffering from sleep disorders increases disproportionately with the growth and aging of the human population, leading to very high expenses in the public health system. Therefore, the challenge of cardiovascular physics is to develop highly sophisticated methods which are able, on the one hand, to supplement and replace expensive medical devices and, on the other hand, to improve medical diagnostics while decreasing the patient's risk. Methods of cardiovascular physics are used to analyze heart rate, blood pressure and respiration to detect changes of the autonomous nervous system in different diseases. Data-driven modeling analysis, synchronization and coupling analysis, and their applications to biosignals in healthy subjects and patients with different sleep disorders are presented. Newly derived methods of cardiovascular physics can help to find indicators for these health risks.
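    One representative coupling measure from this family of methods is the mean phase coherence of two signals' instantaneous phases, a sketch of which follows (the signals are synthetic sinusoids and noise, not heart rate or respiration recordings; the function name is ours):

```python
import numpy as np
from scipy.signal import hilbert

# Mean phase coherence: extract instantaneous phases via the Hilbert
# transform and measure how tightly the phase difference concentrates.
# Values near 1 indicate phase synchronization; near 0, no coupling.
def phase_coherence(x, y):
    px = np.angle(hilbert(x))
    py = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (px - py))))

t = np.linspace(0.0, 10.0, 2000)
resp = np.sin(2 * np.pi * 0.3 * t)                 # stand-in "respiration", 0.3 Hz
hr_locked = np.sin(2 * np.pi * 0.3 * t + 0.8)      # phase-locked companion signal
rng = np.random.default_rng(2)
noise = rng.normal(size=t.size)                    # uncoupled reference

print(phase_coherence(resp, hr_locked))  # close to 1
print(phase_coherence(resp, noise))      # much smaller
```

    Applied to real cardiorespiratory data, such indices quantify how strongly sleep-stage-dependent regulation couples the two systems.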

  11. A risk assessment tool applied to the study of shale gas resources.

    PubMed

    Veiguela, Miguel; Hurtado, Antonio; Eguilior, Sonsoles; Recreo, Fernando; Roqueñi, Nieves; Loredo, Jorge

    2016-11-15

    The implementation of a risk assessment tool with the capacity to evaluate the risks to health, safety and the environment (HSE) from the extraction of non-conventional fossil fuel resources by the hydraulic fracturing (fracking) technique can be useful to boost the development and progress of the technology and to win public trust and acceptance of it. At the early project stages, the lack of data related to the selection of non-conventional gas deposits makes it difficult to use existing approaches to risk assessment of fluids injected into geologic formations. The qualitative risk assessment tool developed in this work is based on the approach that shale gas exploitation risk depends on both the geologic site and the technological aspects. It follows Oldenburg's 'Screening and Ranking Framework (SRF)' developed to evaluate potential geologic carbon dioxide (CO2) storage sites. Two global characteristics, (1) characteristics centered on the natural aspects of the site and (2) characteristics centered on the technological aspects of the project, have been evaluated through user input of Property values, which define Attributes, which in turn define the Characteristics. In order to carry out an individual evaluation of each of the characteristics and the elements of the model, the tool has been implemented in a spreadsheet. The proposed model has been applied to a site with potential for the exploitation of shale gas in Asturias (northwestern Spain) with three different technological options to test the approach.

  13. Cladistic analysis applied to the classification of volcanoes

    NASA Astrophysics Data System (ADS)

    Hone, D. W. E.; Mahony, S. H.; Sparks, R. S. J.; Martin, K. T.

    2007-11-01

    Cladistics is a systematic method of classification that groups entities on the basis of sharing similar characteristics in the most parsimonious manner. Here cladistics is applied to the classification of volcanoes using a dataset of 59 Quaternary volcanoes and 129 volcanic edifices of the Tohoku region, Northeast Japan. Volcano and edifice characteristics recorded in the database include attributes of volcano size, chemical composition, dominant eruptive products, volcano morphology, dominant landforms, volcano age and eruptive history. Without characteristics related to time the volcanic edifices divide into two groups, with characters related to volcano size, dominant composition and edifice morphology being the most diagnostic. Analysis including time-based characteristics yields four groups, with a good correlation between these groups and the two groups from the analysis without time for 108 out of 129 volcanic edifices. Thus, when characters are slightly changed, the volcanoes still form similar groupings. Analysis of the volcanoes both with and without time yields three groups based on compositional, eruptive products and morphological characters. Spatial clusters of volcanic centres have been recognised in the Tohoku region by Tamura et al. (Earth Planet Sci Lett 197:105-106, 2002). The groups identified by cladistic analysis are distributed unevenly between the clusters, indicating a tendency for individual clusters to form similar kinds of volcanoes with distinctive but coherent styles of volcanism. Uneven distribution of volcano types between clusters can be explained by variations in dominant magma compositions through time, which are reflected in eruption products and volcanic landforms. Cladistic analysis can be a useful tool for elucidating dynamic igneous processes that could be applied to other regions and globally. Our exploratory study indicates that cladistics has promise as a method for classifying volcanoes and potentially elucidating dynamic igneous processes.

  14. Expert opinion as 'validation' of risk assessment applied to calf welfare

    PubMed Central

    Bracke, Marc BM; Edwards, Sandra A; Engel, Bas; Buist, Willem G; Algers, Bo

    2008-01-01

    Background Recently, a Risk Assessment methodology was applied to animal welfare issues in a report of the European Food Safety Authority (EFSA) on intensively housed calves. Methods Because this is a new and potentially influential approach to derive conclusions on animal welfare issues, a so-called semantic-modelling type 'validation' study was conducted by asking expert scientists, who had been involved or quoted in the report, to give welfare scores for housing systems and for welfare hazards. Results Kendall's coefficient of concordance among experts (n = 24) was highly significant (P < 0.001), but low (0.29 and 0.18 for housing systems and hazards respectively). Overall correlations with EFSA scores were significant only for experts with a veterinary or mixed (veterinary and applied ethological) background. Significant differences in welfare scores were found between housing systems, between hazards, and between experts with different backgrounds. For example, veterinarians gave higher overall welfare scores for housing systems than ethologists did, probably reflecting a difference in their perception of animal welfare. Systems with the lowest scores were veal calves kept individually in so-called "baby boxes" (veal crates) or in small groups, and feedlots. A suckler herd on pasture was rated as the best for calf welfare. The main hazards were related to underfeeding, inadequate colostrum intake, poor stockperson education, insufficient space, inadequate roughage, iron deficiency, inadequate ventilation, poor floor conditions and no bedding. Points for improvement of the Risk Assessment applied to animal welfare include linking information, reporting uncertainty and transparency about underlying values. Conclusion The study provides novel information on expert opinion in relation to calf welfare and shows that Risk Assessment applied to animal welfare can benefit from a semantic modelling approach. PMID:18625048

  15. Compatibility of person-centered planning and applied behavior analysis

    PubMed Central

    Holburn, Steve

    2001-01-01

    In response to Osborne (1999), the aims and practices of person-centered planning (PCP) are compared to the basic principles of applied behavior analysis set forth by Baer, Wolf, and Risley (1968, 1987). The principal goal of PCP is social integration of people with disabilities; it qualifies as a socially important behavior, and its problems have been displayed sufficiently. However, social integration is a complex social problem whose solution requires access to system contingencies that influence lifestyles. Nearly all of the component goals of PCP proposed by O'Brien (1987b) have been reliably quantified, although concurrent measurement of outcomes such as friendship, autonomy, and respect presents a formidable challenge. Behavioral principles such as contingency and contextual control are operative within PCP, but problems in achieving reliable implementation appear to impede an experimental analysis. PMID:22478371

  16. Applying risk and resilience models to predicting the effects of media violence on development.

    PubMed

    Prot, Sara; Gentile, Douglas A

    2014-01-01

    Although the effects of media violence on children and adolescents have been studied for over 50 years, they remain controversial. Much of this controversy is driven by a misunderstanding of causality that seeks the cause of atrocities such as school shootings. Luckily, several recent developments in risk and resilience theories offer a way out of this controversy. Four risk and resilience models are described, including the cascade model, dose-response gradients, pathway models, and turning-point models. Each is described and applied to the existing media effects literature. Recommendations for future research are discussed with regard to each model. In addition, we examine current developments in theorizing that stressors have sensitizing versus steeling effects and recent interest in biological and gene by environment interactions. We also discuss several of the cultural aspects that have supported the polarization and misunderstanding of the literature, and argue that applying risk and resilience models to the theories and data offers a more balanced way to understand the subtle effects of media violence on aggression within a multicausal perspective. PMID:24851351

  18. Empirical modal decomposition applied to cardiac signals analysis

    NASA Astrophysics Data System (ADS)

    Beya, O.; Jalil, B.; Fauvet, E.; Laligant, O.

    2010-01-01

    In this article, we present the empirical modal decomposition (EMD) method applied to the analysis and denoising of electrocardiogram (ECG) and phonocardiogram (PCG) signals. The objective of this work is to detect cardiac anomalies of a patient automatically. As these anomalies are localized in time, the localization of all events must be preserved precisely. Methods based on the Fourier transform lose this localization property [13]; the wavelet transform (WT) overcomes the localization problem, but interpretation remains difficult and the signal is hard to characterize precisely. In this work we propose to apply EMD, which has very useful properties for pseudo-periodic signals. The second section describes the EMD algorithm. In the third part we present the results obtained on PCG and ECG test signals, together with their analysis and interpretation. Finally, we introduce an adaptation of the EMD algorithm which appears to be very efficient for denoising.
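    A bare-bones sifting loop conveys the core idea of EMD (the fixed iteration count and endpoint handling here are simplifying assumptions, not the authors' implementation):

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

# One EMD sifting stage: interpolate cubic-spline envelopes through the local
# maxima and minima (endpoints included to avoid extrapolation blow-up),
# subtract the envelope mean, and repeat. A fixed iteration count stands in
# for a proper stopping criterion.
def sift(x, n_iter=6):
    h = x.copy()
    t = np.arange(len(x))
    for _ in range(n_iter):
        maxima = argrelextrema(h, np.greater)[0]
        minima = argrelextrema(h, np.less)[0]
        if len(maxima) < 2 or len(minima) < 2:
            break  # too few extrema: h is a residual trend, not an IMF
        up = CubicSpline(np.r_[0, maxima, len(h) - 1], np.r_[h[0], h[maxima], h[-1]])
        lo = CubicSpline(np.r_[0, minima, len(h) - 1], np.r_[h[0], h[minima], h[-1]])
        h = h - (up(t) + lo(t)) / 2.0
    return h

t = np.linspace(0.0, 1.0, 500)
signal = np.sin(2 * np.pi * 25 * t) + np.sin(2 * np.pi * 3 * t)  # fast + slow tone
imf1 = sift(signal)  # first IMF; should track the fast oscillation
```

    Subtracting each extracted IMF and re-sifting the remainder yields the full decomposition; denoising then amounts to discarding or thresholding the noisiest modes.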

  19. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enable development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. Quantitative assessment approach provides useful risk mitigation information.

  20. Automated SEM Modal Analysis Applied to the Diogenites

    NASA Technical Reports Server (NTRS)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  1. Confronting deep uncertainties in risk analysis.

    PubMed

    Cox, Louis Anthony

    2012-10-01

    How can risk analysts help to improve policy and decision making when the correct probabilistic relation between alternative acts and their probable consequences is unknown? This practical challenge of risk management with model uncertainty arises in problems from preparing for climate change to managing emerging diseases to operating complex and hazardous facilities safely. We review constructive methods for robust and adaptive risk analysis under deep uncertainty. These methods are not yet as familiar to many risk analysts as older statistical and model-based methods, such as the paradigm of identifying a single "best-fitting" model and performing sensitivity analyses for its conclusions. They provide genuine breakthroughs for improving predictions and decisions when the correct model is highly uncertain. We demonstrate their potential by summarizing a variety of practical risk management applications. PMID:22489541

  2. The ABC’s of Suicide Risk Assessment: Applying a Tripartite Approach to Individual Evaluations

    PubMed Central

    Harris, Keith M.; Syu, Jia-Jia; Lello, Owen D.; Chew, Y. L. Eileen; Willcox, Christopher H.; Ho, Roger H. M.

    2015-01-01

    There is considerable need for accurate suicide risk assessment for clinical, screening, and research purposes. This study applied the tripartite affect-behavior-cognition theory, the suicidal barometer model, classical test theory, and item response theory (IRT) to develop a brief self-report measure of suicide risk that is theoretically grounded, reliable, and valid. An initial survey (n = 359) applied an iterative process to an item pool, resulting in the six-item Suicidal Affect-Behavior-Cognition Scale (SABCS). Three additional studies tested the SABCS and a highly endorsed comparison measure. Studies included two online surveys (Ns = 1007 and 713) and one prospective clinical survey (n = 72; Time 2, n = 54). Factor analyses demonstrated SABCS construct validity through unidimensionality. Internal reliability was high (α = .86-.93, split-half = .90-.94). The scale was predictive of future suicidal behaviors and suicidality (r = .68 and .73, respectively), showed convergent validity, and the SABCS-4 demonstrated clinically relevant sensitivity to change. IRT analyses revealed the SABCS captured more information than the comparison measure and better defined participants at low, moderate, and high risk. The SABCS is the first suicide risk measure to demonstrate no differential item functioning by sex, age, or ethnicity. In all comparisons, the SABCS showed incremental improvements over a highly endorsed scale through stronger predictive ability, reliability, and other properties. The SABCS is in the public domain, with this publication, and is suitable for clinical evaluations, public screening, and research. PMID:26030590
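    The internal-reliability figures reported above come from classical test theory; a sketch of the standard computation, Cronbach's alpha, on fabricated item scores (not SABCS data):

```python
import numpy as np

# Cronbach's alpha for a k-item scale, from an items-by-respondents matrix:
# alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
def cronbach_alpha(items):
    """items: 2-D array, rows = items, columns = respondents."""
    items = np.asarray(items, dtype=float)
    k = items.shape[0]
    item_vars = items.var(axis=1, ddof=1).sum()
    total_var = items.sum(axis=0).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Six fabricated items all tapping one latent trait, plus measurement noise.
rng = np.random.default_rng(0)
trait = rng.normal(size=10)                        # latent level per respondent
scores = trait + 0.5 * rng.normal(size=(6, 10))    # six noisy item scores
print(round(cronbach_alpha(scores), 2))
```

    When items share a common latent trait, as the SABCS's unidimensionality implies, alpha approaches 1; uncorrelated items drive it toward 0.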

  3. Quantitative Microbial Risk Assessment Tutorial: Land-applied Microbial Loadings within a 12-Digit HUC

    EPA Science Inventory

    This tutorial reviews screens, icons, and basic functions of the SDMProjectBuilder (SDMPB). It demonstrates how one chooses a 12-digit HUC for analysis, performs an assessment of land-applied microbes by simulating microbial fate and transport using HSPF, and analyzes and visuali...

  4. A method for determining weights for excess relative risk and excess absolute risk when applied in the calculation of lifetime risk of cancer from radiation exposure.

    PubMed

    Walsh, Linda; Schneider, Uwe

    2013-03-01

    Radiation-related risks of cancer can be transported from one population to another population at risk, for the purpose of calculating lifetime risks from radiation exposure. Transfer via excess relative risks (ERR) or excess absolute risks (EAR) or a mixture of both (i.e., from the life span study (LSS) of Japanese atomic bomb survivors) has been done in the past based on qualitative weighting. Consequently, the values of the weights applied and the method of application of the weights (i.e., as additive or geometric weighted means) have varied both between reports produced at different times by the same regulatory body and also between reports produced at similar times by different regulatory bodies. Since the gender and age patterns are often markedly different between EAR and ERR models, it is useful to have an evidence-based method for determining the relative goodness of fit of such models to the data. This paper identifies a method, using Akaike model weights, which could aid expert judgment and be applied to help achieve consistency of approach and quantitative, evidence-based results in future health risk assessments. The results of applying this method to recent LSS cancer incidence models are that the relative EAR weighting by solid cancer site, on a scale of 0-1, is zero for breast and colon, 0.02 for all solid, 0.03 for lung, 0.08 for liver, 0.15 for thyroid, 0.18 for bladder and 0.93 for stomach. The EAR weighting for female breast cancer increases from 0 to 0.3 if a generally observed change in the trend between female age-specific breast cancer incidence rates and attained age, associated with menopause, is accounted for in the EAR model. Application of this method to preferred models from a study of multi-model inference from many models fitted to the LSS leukemia mortality data results in an EAR weighting of 0. From these results it can be seen that lifetime risk transfer is most highly weighted by EAR only for stomach cancer. However ...
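    Akaike model weights are computed directly from AIC values; a short sketch with invented AICs for an ERR and an EAR model:

```python
import math

# Akaike model weights: w_i = exp(-0.5 * dAIC_i) / sum_j exp(-0.5 * dAIC_j),
# where dAIC_i = AIC_i - min_j AIC_j. With one ERR and one EAR model, the
# EAR model's weight plays the role of the transfer weight discussed above.
# The AIC values here are invented for illustration.
def akaike_weights(aics):
    best = min(aics)
    rel = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Suppose the ERR model fits a given cancer site's data somewhat better:
w_err, w_ear = akaike_weights([1000.0, 1005.2])
print(round(w_ear, 3))  # small EAR weight -> transfer mostly via ERR
```

    Because the weights sum to one, they slot naturally into the additive ERR/EAR mixtures used in lifetime risk transfer.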

  5. Risk assessment of land-applied biosolids-borne triclocarban (TCC).

    PubMed

    Snyder, Elizabeth Hodges; O'Connor, George A

    2013-01-01

    Triclocarban (TCC) is monitored under the USEPA High Production Volume (HPV) chemical program and is predominantly used as the active ingredient in select antibacterial bar soaps and other personal care products. The compound commonly occurs at parts-per-million concentrations in processed wastewater treatment residuals (i.e. biosolids), which are frequently land-applied as fertilizers and soil conditioners. Human and ecological risk assessment parameters measured by the authors in previous studies were integrated with existing data to perform a two-tiered human health and ecological risk assessment of land-applied biosolids-borne TCC. The 14 exposure pathways identified in the Part 503 Biosolids Rule were expanded, and conservative screening-level hazard quotients (HQ values) were first calculated to estimate risk to humans and a variety of terrestrial and aquatic organisms (Tier 1). The majority of biosolids-borne TCC exposure pathways resulted in no screening-level HQ values indicative of significant risks to exposed organisms (including humans), even under worst-case land application scenarios. The two pathways for which the conservative screening-level HQ values exceeded one (i.e. Pathway 10: biosolids➔soil➔soil organism➔predator, and Pathway 16: biosolids➔soil➔surface water➔aquatic organism) were then reexamined using modified parameters and scenarios (Tier 2). Adjusted HQ values remained greater than one for Exposure Pathway 10, with the exception of the final adjusted HQ values under a one-time 5 Mg ha(-1) (agronomic) biosolids loading rate scenario for the American woodcock (Scolopax minor) and short-tailed shrew (Blarina brevicauda). Results were used to prioritize recommendations for future biosolids-borne TCC research, which include additional measurements of toxicological effects and TCC concentrations in environmental matrices at the field level.
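    The Tier 1 screening step reduces to comparing hazard quotients against one; a sketch with invented exposures and benchmarks (the pathway names are shortened placeholders, not the Part 503 definitions):

```python
# Tier 1 screening in miniature: a hazard quotient (HQ) divides a predicted
# exposure concentration by a no-effect benchmark; HQ > 1 flags the exposure
# pathway for refined Tier 2 re-examination. All numbers are invented.
def hazard_quotient(exposure, benchmark):
    return exposure / benchmark

pathways = {
    # pathway: (predicted TCC exposure, no-effect benchmark) in matching units
    "soil -> earthworm -> predator": (4.2, 1.5),
    "soil -> surface water -> aquatic organism": (0.9, 0.5),
    "soil -> plant -> human": (0.02, 1.0),
}

flagged = {name: hazard_quotient(e, b)
           for name, (e, b) in pathways.items()
           if hazard_quotient(e, b) > 1}
print(sorted(flagged))  # pathways needing Tier 2 re-examination
```

    Conservative Tier 1 inputs deliberately overstate exposure, so an HQ above one triggers refinement rather than a conclusion of actual harm, as with Pathways 10 and 16 in the study.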

  7. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and therefore it is necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical and probabilistic bases. The basic element is the attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, (40)K contaminant measurements are adapted for a risk assessment application after derivation of the necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the forms of maps and three-dimensional surfaces.
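    The core step of attaching a pdf to a measurement sequence and reading off a risk can be sketched as follows (synthetic data and a lognormal assumption stand in for the actual (40)K measurements and the paper's derivations):

```python
import math, random

# Attach a probability distribution to a contaminant measurement sequence and
# estimate the risk of exceeding a threshold. A lognormal pdf is fitted by
# moments of the log-data; measurements and threshold are synthetic.
random.seed(1)
measurements = [math.exp(random.gauss(5.0, 0.4)) for _ in range(200)]  # fake activities

logs = [math.log(m) for m in measurements]
mu = sum(logs) / len(logs)
sigma = (sum((x - mu) ** 2 for x in logs) / (len(logs) - 1)) ** 0.5

def exceedance_probability(threshold):
    """P(X > threshold) for the fitted lognormal, via the normal tail."""
    z = (math.log(threshold) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

print(exceedance_probability(250.0))
```

    Repeating this at each sampling location is what turns point measurements into the spatial risk maps the paper presents.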

  8. Risk Interfaces to Support Integrated Systems Analysis and Development

    NASA Technical Reports Server (NTRS)

    Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark; Anton, Wilma; Havenhill, Maria

    2016-01-01

    Objectives for systems analysis capability: develop an integrated understanding of how a complex human physiological-socio-technical mission system behaves in spaceflight. Why? To support development of integrated solutions that prevent unwanted outcomes (implementable approaches to minimize mission resources: mass, power, crew time, etc.), and to support development of tools for autonomy, needed for exploration (assess and maintain resilience of individuals, teams, and the integrated system). Output of this exercise: representation of interfaces based on Human System Risk Board (HSRB) Risk Summary information and simple status based on the Human Research Roadmap; consolidated HSRB information applied to support communication; a point of departure for HRP Element planning; and the ability to track and communicate the status of collaborations.

  9. Climate change, land slide risks and sustainable development, risk analysis and decision support process tool

    NASA Astrophysics Data System (ADS)

    Andersson-sköld, Y. B.; Tremblay, M.

    2011-12-01

    Climate change is in most parts of Sweden expected to result in increased precipitation and increased sea water levels, causing flooding, erosion, slope instability and related secondary consequences. Landslide risks are expected to increase with climate change in large parts of Sweden due to increased annual precipitation, more intense precipitation and increased flows combined with drier summers. In response to the potential climate-related risks, and on the commission of the Ministry of Environment, the Swedish Geotechnical Institute (SGI) is at present performing a risk analysis project for the most prominent landslide risk area in Sweden: the Göta river valley. As part of this, a methodology for landslide ex-ante consequence analysis, today and in a future climate, has been developed and applied in the Göta river valley. Human life, settlements, industry, contaminated sites and infrastructure of national importance are inventoried and assessed as important elements at risk. The goal of the consequence analysis is to produce a map of geographically distributed expected losses, which can be combined with a corresponding map displaying landslide probability to describe the risk (the combination of probability and consequence of a (negative) event). The risk analysis is GIS-aided in presenting and visualising the risk, and uses existing databases for quantification of the consequences, represented by ex-ante estimated monetary losses. The results will be used on national and regional levels, and as an indication of the risk on the local level, to assess the need for measures to mitigate the risk. The costs and environmental and social impacts of mitigating the risk are expected to be very high, but the costs and impacts of a severe landslide are expected to be even higher. Therefore, civil servants have pronounced a need for tools to assess both the vulnerability and a more holistic picture of the impacts of climate change adaptation measures. At SGI a tool for the inclusion of sustainability ...
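    The map-combination step described above amounts to a cell-by-cell product of the probability and consequence grids; a small numpy sketch with invented values (not Göta river valley data):

```python
import numpy as np

# Risk map = probability map * consequence map, cell by cell. The grids and
# the SEK loss figures are fabricated 3x3 examples for illustration.
probability = np.array([[0.01, 0.05, 0.02],
                        [0.10, 0.30, 0.05],
                        [0.02, 0.08, 0.01]])            # probability of a slide

consequence = np.array([[0.0,  2.0, 0.5],
                        [8.0, 50.0, 4.0],
                        [1.0,  5.0, 0.2]]) * 1e6        # loss (SEK) if a slide occurs

risk = probability * consequence                        # expected loss per cell
hotspots = risk > 1e6                                   # cells to prioritise for mitigation
print(int(hotspots.sum()))
```

    This separation is what lets consequences and probabilities be updated independently, e.g. re-running only the probability layer under a future-climate scenario.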

  10. Applying cluster analysis to physics education research data

    NASA Astrophysics Data System (ADS)

    Springuel, R. Padraic

    One major thrust of Physics Education Research (PER) is the identification of student ideas about specific physics concepts, both correct ideas and those that differ from the expert consensus. Typically the research process of eliciting the spectrum of student ideas involves the administration of specially designed questions to students. One major analysis task in PER is the sorting of these student responses into thematically coherent groups. This process is one which has previously been done by eye in PER. This thesis explores the possibility of using cluster analysis to perform the task in a more rigorous and less time-intensive fashion while making fewer assumptions about what the students are doing. Since this technique has not previously been used in PER, a summary of the various kinds of cluster analysis is included, as well as a discussion of which might be appropriate for the task of sorting student responses into groups. Two example data sets (one based on the Force and Motion Conceptual Evaluation (FMCE), the other looking at acceleration in two dimensions (A2D)) are examined in depth to demonstrate how cluster analysis can be applied to PER data and the various considerations which must be taken into account when doing so. In both cases, the techniques described in this thesis found 5 groups which contained about 90% of the students in the data set. The results of this application are compared to previous research on the topics covered by the two examples to demonstrate that cluster analysis can effectively uncover the same patterns in student responses that have already been identified.
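The sorting task the thesis describes can be sketched with standard hierarchical cluster analysis. The data, distance metric, and dendrogram cut below are invented for illustration and are not the thesis's actual choices:

```python
# Hypothetical sketch: grouping binary student-response vectors with
# agglomerative (hierarchical) cluster analysis, one family of methods
# discussed in the thesis. The responses and the cut threshold are invented.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Rows = students, columns = 1/0 answers to diagnostic questions.
responses = np.array([
    [1, 1, 0, 0, 1],
    [1, 1, 0, 1, 1],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 0, 1],
])

# Jaccard distance suits binary data; complete linkage keeps groups compact.
dist = pdist(responses, metric="jaccard")
tree = linkage(dist, method="complete")

# Cut the dendrogram at a chosen distance to obtain the groups.
labels = fcluster(tree, t=0.6, criterion="distance")
print(labels)
```

Students with similar answer patterns land in the same group, so the researcher inspects groups rather than individual responses.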

  11. Towards secure virtual directories : a risk analysis framework.

    SciTech Connect

    Claycomb, William R.

    2010-07-01

    Directory services are used by almost every enterprise computing environment to provide data concerning users, computers, contacts, and other objects. Virtual directories are components that provide directory services in a highly customized manner. Unfortunately, though the use of virtual directory services is widespread, an analysis of the risks posed by their unique position and architecture has not been completed. We present a detailed analysis of six attacks on virtual directory services, including steps for detection and prevention. We also describe various categories of attack risks and discuss what is necessary to launch an attack on virtual directories. Finally, we present a framework to use in analyzing risks to individual enterprise computing virtual directory instances. We show how to apply this framework to an example implementation and discuss the benefits of doing so.

  12. Causal modelling applied to the risk assessment of a wastewater discharge.

    PubMed

    Paul, Warren L; Rokahr, Pat A; Webb, Jeff M; Rees, Gavin N; Clune, Tim S

    2016-03-01

    Bayesian networks (BNs), or causal Bayesian networks, have become quite popular in ecological risk assessment and natural resource management because of their utility as a communication and decision-support tool. Since their development in the field of artificial intelligence in the 1980s, however, Bayesian networks have evolved and merged with structural equation modelling (SEM). Unlike BNs, which are constrained to encode causal knowledge in conditional probability tables, SEMs encode this knowledge in structural equations, which is thought to be a more natural language for expressing causal information. This merger has clarified the causal content of SEMs and generalised the method such that it can now be performed using standard statistical techniques. As it was with BNs, the utility of this new generation of SEM in ecological risk assessment will need to be demonstrated with examples to foster an understanding and acceptance of the method. Here, we applied SEM to the risk assessment of a wastewater discharge to a stream, with a particular focus on the process of translating a causal diagram (conceptual model) into a statistical model which might then be used in the decision-making and evaluation stages of the risk assessment. The process of building and testing a spatial causal model is demonstrated using data from a spatial sampling design, and the implications of the resulting model are discussed in terms of the risk assessment. It is argued that a spatiotemporal causal model would have greater external validity than the spatial model, enabling broader generalisations to be made regarding the impact of a discharge, and greater value as a tool for evaluating the effects of potential treatment plant upgrades. Suggestions are made on how the causal model could be augmented to include temporal as well as spatial information, including suggestions for appropriate statistical models and analyses. PMID:26832914
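The core move the paper describes, translating a causal diagram into structural equations, can be illustrated with a toy path model. The diagram, variable names, and effect sizes below are invented, not the paper's model:

```python
# Hypothetical sketch of translating a simple causal diagram into structural
# equations: discharge distance -> nutrient level -> algal biomass. The data
# are simulated and all effect sizes are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 200
distance = rng.uniform(0, 10, n)                      # km downstream of the discharge
nutrient = 5.0 - 0.4 * distance + rng.normal(0, 0.5, n)
biomass = 2.0 + 0.8 * nutrient + rng.normal(0, 0.5, n)

def ols_slope(x, y):
    """Least-squares slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[1]

# One structural equation per arrow in the diagram.
a = ols_slope(distance, nutrient)   # distance -> nutrient
b = ols_slope(nutrient, biomass)    # nutrient -> biomass

# Under this model, the indirect effect of distance on biomass is a * b.
indirect = a * b
print(a, b, indirect)
```

Each regression corresponds to one arrow in the conceptual model, which is what makes the structural-equation form a natural encoding of causal knowledge.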

  13. Applied Behavior Analysis is a Science and, Therefore, Progressive.

    PubMed

    Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane

    2016-02-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD.

  15. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
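The rule-to-network mapping at the heart of KBANN can be sketched for a single consensus-sequence rule. The consensus string, mismatch tolerance, and test sequences below are invented for illustration:

```python
# A minimal sketch of the KBANN idea: map a symbolic rule (here, an invented
# consensus sequence that tolerates one mismatch) into perceptron-style
# weights and a threshold, giving a network unit that training could later
# refine. The consensus and sequences are illustrative, not from the paper.
import numpy as np

CONSENSUS = "TATAAT"
BASES = "ACGT"

def one_hot(seq):
    """Encode a DNA string as a flat one-hot vector."""
    v = np.zeros(len(seq) * 4)
    for i, ch in enumerate(seq):
        v[i * 4 + BASES.index(ch)] = 1.0
    return v

# Rule -> network: weight 1 on each consensus base, threshold set so that
# at most one mismatch still fires the unit.
weights = one_hot(CONSENSUS)
threshold = len(CONSENSUS) - 1.5   # at least 5 of 6 positions must match

def unit_fires(seq):
    return float(weights @ one_hot(seq)) > threshold

print(unit_fires("TATAAT"), unit_fires("TATGAT"), unit_fires("GGGGGG"))
```

Backpropagation applied to such a unit adjusts the weights away from the strict rule, which is the refinement step the abstract describes.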

  16. Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description

    PubMed Central

    da Silva, B. R.; Moreira Neto, J. J. S.; da Silva, F. I.; de Aguiar, A. S. W.

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated. PMID:21991463

  17. Use, fate and ecological risks of antibiotics applied in tilapia cage farming in Thailand.

    PubMed

    Rico, Andreu; Oliveira, Rhaul; McDonough, Sakchai; Matser, Arrienne; Khatikarn, Jidapa; Satapornvanit, Kriengkrai; Nogueira, António J A; Soares, Amadeu M V M; Domingues, Inês; Van den Brink, Paul J

    2014-08-01

    The use, environmental fate and ecological risks of antibiotics applied in tilapia cage farming were investigated in the Tha Chin and Mun rivers in Thailand. Information on antibiotic use was collected through interviewing 29 farmers, and the concentrations of the most commonly used antibiotics, oxytetracycline (OTC) and enrofloxacin (ENR), were monitored in river water and sediment samples. Moreover, we assessed the toxicity of OTC and ENR on tropical freshwater invertebrates and performed a risk assessment for aquatic ecosystems. All interviewed tilapia farmers reported routinely using antibiotics. Peak water concentrations for OTC and ENR were 49 and 1.6 μg/L, respectively. Antibiotics were most frequently detected in sediments, with concentrations up to 6908 μg/kg d.w. for OTC and 2339 μg/kg d.w. for ENR. The results of this study indicate insignificant short-term risks for primary producers and invertebrates, but suggest that the studied aquaculture farms constitute an important source of antibiotic pollution.
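A standard way to express this kind of aquatic risk screen is the risk quotient, RQ = MEC / PNEC. Only the peak water concentrations below come from the abstract; the PNEC values are invented placeholders, not the study's:

```python
# Hedged sketch of a risk-quotient (RQ) screen, one conventional form of the
# aquatic risk assessment the study performed. Measured environmental
# concentrations (MEC) are from the abstract; the predicted no-effect
# concentrations (PNEC) are hypothetical placeholders.
peak_mec_ug_per_l = {"OTC": 49.0, "ENR": 1.6}    # measured peaks (abstract)
pnec_ug_per_l = {"OTC": 100.0, "ENR": 10.0}      # hypothetical PNECs

def risk_quotient(compound):
    return peak_mec_ug_per_l[compound] / pnec_ug_per_l[compound]

for c in peak_mec_ug_per_l:
    rq = risk_quotient(c)
    # RQ < 1 is conventionally read as no significant short-term risk.
    print(c, round(rq, 3), "low risk" if rq < 1 else "potential risk")
```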

  18. Environmental risks of applying sewage sludge compost to vineyards: carbon, heavy metals, nitrogen, and phosphorus accumulation.

    PubMed

    Korboulewsky, Nathalie; Dupouyet, Sylvie; Bonin, Gilles

    2002-01-01

    Biosolids are applied to vineyards to supply organic matter. However, there is concern that this practice can increase the concentration of macronutrients and heavy metals in the soil, some of which can leach. We evaluated the environmental hazard of sewage sludge compost applied in March 1999 at 10, 30, and 90 Mg ha-1 fresh weight in a vineyard in southeastern France. Soil organic matter increased in all plots by 3 g kg-1 18 mo after the amendment. Neither total nor available heavy metal concentrations increased in the soil. Mineral nitrogen (N) in the topsoil of amended plots of 10, 30, and 90 Mg ha-1 increased by 5, 14, and 26 kg (NO3(-)-N + NH4(+)-N) ha-1, respectively, the first summer and by 2, 5, and 10 kg (NO3(-)-N + NH4(+)-N) ha-1, respectively, the second summer compared with controls. At the recommended rate, the risk of N leaching is very low, but phosphorus (P) appeared to be the limiting factor. Phosphorus significantly increased only in plots amended with the highest rate in the topsoil and subsoil. At lower rates, although no significant differences were observed, P added was greater than the quantities absorbed by vines. In the long run, P will accumulate in the soil and may reach concentrations that will pose a risk to surface waters and ground water. Therefore, although the current recommended rate (10 Mg ha-1) increased soil organic matter without the risk of N leaching, total sewage sludge loading rates on vineyards should be based on P concentrations.

  19. Estimating and Applying Uncertainties in Probabilistic Tsunami Hazard Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Thio, H. K.

    2013-12-01

    An integral part of a probabilistic analysis is the formal inclusion of uncertainties, both those due to a limited understanding of the physical processes (epistemic) and those due to natural variability (aleatory). Because of the strong non-linearity of the tsunami inundation process, it is important to understand not only the extent of the uncertainties, but also how and where to apply them. We can divide the uncertainties into several stages: the source, ocean propagation, and nearshore/inundation. On the source side, many of the uncertainties are identical to those used in probabilistic seismic hazard analysis (PSHA). However, the details of slip distributions are very significant in tsunami excitation, especially for near-field tsunamis. We will show several ways of including slip variability, both stochastic and non-stochastic, by developing a probabilistic set of source scenarios. The uncertainties in ocean propagation are less significant, since modern algorithms are very successful in modeling open ocean tsunami propagation. However, in the near-shore regime and the inundation, the situation is much more complex. Here, errors in the local elevation models, variability in bottom friction, and the omission of the built environment can lead to significant errors. Details of the implementation of the tsunami algorithms can also yield different results. We will discuss the most significant sources of uncertainty and the alternative ways to implement them, using examples from the probabilistic tsunami hazard mapping that we are currently carrying out for the state of California and other regions.
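The epistemic/aleatory split described above is commonly handled with a logic tree: epistemic alternatives carry weights, and within each branch, scenario rates capture aleatory variability. All numbers below are invented for illustration:

```python
# Illustrative sketch of combining epistemic and aleatory uncertainty in a
# probabilistic hazard calculation: logic-tree branches carry epistemic
# weights, and scenario rates within each branch give the annual probability
# of exceeding a target wave height. Every value here is hypothetical.
import math

target_height = 2.0   # metres

# Each epistemic branch: (weight, [(annual_rate, modelled_height_m), ...])
branches = [
    (0.6, [(0.01, 1.5), (0.002, 3.0)]),
    (0.4, [(0.01, 2.5), (0.002, 4.0)]),
]

def annual_exceedance(branch_scenarios):
    # Sum the rates of scenarios whose modelled height exceeds the target,
    # then convert the Poisson rate to an annual probability.
    rate = sum(r for r, h in branch_scenarios if h > target_height)
    return 1.0 - math.exp(-rate)

# Epistemic mean hazard: weight-average the branch hazard values.
hazard = sum(w * annual_exceedance(s) for w, s in branches)
print(hazard)
```

Repeating the calculation over a range of target heights yields the hazard curve used for mapping.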

  20. Acute proliferative retrolental fibroplasia: multivariate risk analysis.

    PubMed Central

    Flynn, J T

    1983-01-01

    This study has presented a two-way analysis of a data set consisting of demographic, diagnostic, and therapeutic variables against the risk of occurrence of APRLF and its location in the retina in a population of 639 infants with birthweights ranging from 600 to 1500 gm. Univariate and multivariate risk analysis techniques were employed to analyze the data. As established from previous studies, birthweight was a powerful predictor of the outcome variable. Oxygen therapy, as defined and quantified in this study, was not. Duration of ventilatory assistance did appear to be associated. The population was not uniform. Infants below 1000 gm birthweight had such a high incidence of APRLF that no other exogenous risk factors seemed of significance. Above 1000 gm birthweight, certain factors, particularly duration of ventilation, seemed of predictive strength and significance. PMID:6689564

  1. Applying Geostatistical Analysis to Crime Data: Car-Related Thefts in the Baltic States.

    PubMed

    Kerry, Ruth; Goovaerts, Pierre; Haining, Robert P; Ceccato, Vania

    2010-01-01

    Geostatistical methods have rarely been applied to area-level offense data. This article demonstrates their potential for improving the interpretation and understanding of crime patterns using previously analyzed data about car-related thefts for Estonia, Latvia, and Lithuania in 2000. The variogram is used to inform about the scales of variation in offense, social, and economic data. Area-to-area and area-to-point Poisson kriging are used to filter the noise caused by the small number problem. The latter is also used to produce continuous maps of the estimated crime risk (expected number of crimes per 10,000 habitants), thereby reducing the visual bias of large spatial units. In seeking to detect the most likely crime clusters, the uncertainty attached to crime risk estimates is handled through a local cluster analysis using stochastic simulation. Factorial kriging analysis is used to estimate the local- and regional-scale spatial components of the crime risk and explanatory variables. Then regression modeling is used to determine which factors are associated with the risk of car-related theft at different scales.
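The variogram the article relies on can be sketched with a small numpy-only estimator. The one-dimensional coordinates and values below are synthetic and purely illustrative:

```python
# Numpy-only sketch of the empirical (semi)variogram, the tool the article
# uses to describe scales of spatial variation. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
coords = np.arange(0.0, 50.0)                 # unit-spaced 1-D locations
values = np.sin(coords / 8.0) + rng.normal(0, 0.1, coords.size)

def empirical_variogram(z, lags):
    """Matheron estimator: gamma(h) = mean of 0.5*(z_i - z_j)^2 at lag h,
    assuming unit-spaced locations."""
    gammas = []
    for h in lags:
        d = z[h:] - z[:-h]                    # all pairs separated by lag h
        gammas.append(0.5 * np.mean(d ** 2))
    return np.array(gammas)

lags = [1, 5, 10]
gamma = empirical_variogram(values, lags)
print(dict(zip(lags, np.round(gamma, 4))))
```

A variogram that keeps rising with lag, as here, indicates spatial structure at scales larger than the sampled lags, which is the kind of diagnosis the crime analysis uses before kriging.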

  2. Terrestrial ecological risk evaluation for triclosan in land-applied biosolids.

    PubMed

    Fuchsman, Phyllis; Lyndall, Jennifer; Bock, Michael; Lauren, Darrel; Barber, Timothy; Leigh, Katrina; Perruchon, Elyse; Capdevielle, Marie

    2010-07-01

    Triclosan is an antimicrobial compound found in many consumer products including soaps and personal care products. Most triclosan is disposed of down household drains, whereupon it is conveyed to wastewater treatment plants. Although a high percentage of triclosan biodegrades during wastewater treatment, most of the remainder is adsorbed to sludge, which may ultimately be applied to land as biosolids. We evaluated terrestrial ecological risks related to triclosan in land-applied biosolids for soil microbes, plants, soil invertebrates, mammals, and birds. Exposures are estimated using a probabilistic fugacity-based model. Triclosan concentrations in biosolids and reported biosolids application rates are compiled to support estimation of triclosan concentrations in soil. Concentrations in biota tissue are estimated using an equilibrium partitioning model for plants and worms and a steady-state model for small mammals; the resulting tissue concentrations are used to model mammalian and avian dietary exposures. Toxicity benchmarks are identified from a review of published and proprietary studies. The results indicate that adverse effects related to soil fertility (i.e., disruption of nitrogen cycling) would be expected only under "worst-case" exposures, under certain soil conditions and would likely be transient. The available data indicate that adverse effects on plants, invertebrates, birds, and mammals due to triclosan in land-applied biosolids are unlikely.

  3. First Attempt of Applying Factor Analysis in Moving Base Gravimetry

    NASA Astrophysics Data System (ADS)

    Li, X.; Roman, D. R.

    2014-12-01

    For gravimetric observation systems on mobile platforms (land/sea/airborne), the low Signal to Noise Ratio (SNR) is the main barrier to achieving an accurate, high resolution gravity signal. Normally, low-pass filters (Childers et al 1999, Forsberg et al 2000, Kwon and Jekeli 2000, Hwang et al 2006) are applied to smooth or remove the high frequency "noise" - even though some of the high frequency component is not necessarily noise. This is especially true for aerogravity surveys such as those from the Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project. These gravity survey flights have a spatial resolution of 10 km between tracks but higher resolution along track. The along track resolution is improved due to the lower flight height (6.1 km), equipment sensitivity, and improved modeling of potential errors. Additionally, these surveys suffer from a loss of signal power due to the increased flight elevation. Hence, application of a low-pass filter removes possible signal sensed in the along-track direction that might otherwise prove useful for various geophysical and geodetic applications. Some cutting-edge developments in Wavelets and Artificial Neural Networks have been successfully applied to obtain improved results (Li 2008 and 2011, Liang and Liu 2013). However, a clearer and more fundamental understanding of the error characteristics will further improve the quality of the gravity estimates from these gravimetric systems. Here, instead of using any predefined basis function or any a priori model, the idea of Factor Analysis is first employed to try to extract the underlying factors of the noise in these systems. Real data sets collected by both land vehicle and aircraft will be processed as examples.
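The basic intuition, that several sensor channels share a common latent noise factor which can be extracted without a predefined model, can be illustrated with simulated data. The channel setup and loadings below are invented; a one-factor principal-component shortcut stands in for full factor analysis:

```python
# Illustrative numpy-only sketch: several sensor channels share one common
# "platform vibration" factor plus independent noise, and the dominant
# eigenvector of the covariance matrix recovers the factor loadings. (Real
# factor analysis also iterates on unique variances; this principal-component
# shortcut is just for intuition.) All data are simulated.
import numpy as np

rng = np.random.default_rng(2)
n = 2000
common = rng.normal(0, 1, n)                  # shared platform motion
loadings_true = np.array([1.0, 0.8, 0.5])     # per-channel sensitivity
channels = np.outer(common, loadings_true) + rng.normal(0, 0.3, (n, 3))

cov = np.cov(channels, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)        # ascending eigenvalues
factor = eigvecs[:, -1]                       # direction of largest variance
factor *= np.sign(factor[0])                  # fix the sign for readability

print(np.round(factor, 3))
```

The recovered direction is proportional to the true loadings, so channels most affected by the common disturbance can be identified from data alone.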

  4. Multitaper Spectral Analysis and Wavelet Denoising Applied to Helioseismic Data

    NASA Technical Reports Server (NTRS)

    Komm, R. W.; Gu, Y.; Hill, F.; Stark, P. B.; Fodor, I. K.

    1999-01-01

    Estimates of solar normal mode frequencies from helioseismic observations can be improved by using Multitaper Spectral Analysis (MTSA) to estimate spectra from the time series, then using wavelet denoising of the log spectra. MTSA leads to a power spectrum estimate with reduced variance and better leakage properties than the conventional periodogram. Under the assumption of stationarity and mild regularity conditions, the log multitaper spectrum has a statistical distribution that is approximately Gaussian, so wavelet denoising is asymptotically an optimal method to reduce the noise in the estimated spectra. We find that a single m-ν spectrum benefits greatly from MTSA followed by wavelet denoising, and that wavelet denoising by itself can be used to improve m-averaged spectra. We compare estimates using two different 5-taper estimates (Slepian and sine tapers) and the periodogram estimate, for GONG time series at selected angular degrees l. We compare those three spectra with and without wavelet denoising, both visually and in terms of the mode parameters estimated from the pre-processed spectra using the GONG peak-fitting algorithm. The two multitaper estimates give equivalent results. The number of modes fitted well by the GONG algorithm is 20% to 60% larger (depending on l and the temporal frequency) when applied to the multitaper estimates than when applied to the periodogram. The estimated mode parameters (frequency, amplitude and width) are comparable for the three power spectrum estimates, except for modes with very small mode widths (a few frequency bins), where the multitaper spectra broadened the modes compared with the periodogram. We tested the influence of the number of tapers used and found that narrow modes at low n values are broadened to the extent that they can no longer be fit if the number of tapers is too large. For helioseismic time series of this length and temporal resolution, the optimal number of tapers is less than 10.
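A multitaper estimate with sine tapers (one of the two taper families the paper compares) can be sketched in a few lines. The test signal, length, and taper count below are illustrative, not the GONG data:

```python
# Numpy-only sketch of a sine-taper multitaper spectrum estimate: taper the
# series with k orthonormal sine tapers and average the resulting
# periodograms, trading a little resolution for much lower variance.
# The signal and parameters are invented for illustration.
import numpy as np

N = 1024
t = np.arange(N)
rng = np.random.default_rng(3)
signal = np.sin(2 * np.pi * 0.1 * t) + rng.normal(0, 1, N)

def sine_tapers(n, k):
    """First k orthonormal sine tapers of length n."""
    j = np.arange(1, k + 1)[:, None]
    m = np.arange(1, n + 1)[None, :]
    return np.sqrt(2.0 / (n + 1)) * np.sin(np.pi * j * m / (n + 1))

def multitaper_psd(x, k=5):
    """Average the periodograms of the k tapered copies of x."""
    tapers = sine_tapers(x.size, k)
    spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    return spectra.mean(axis=0)

psd = multitaper_psd(signal)
freqs = np.fft.rfftfreq(N)
peak = freqs[np.argmax(psd)]
print(peak)
```

Averaging k independent tapered periodograms reduces the variance of the estimate by roughly a factor of k, which is why modes stand out more clearly than in a single periodogram.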

  5. Risk-Stratified Imputation in Survival Analysis

    PubMed Central

    Kennedy, Richard E.; Adragni, Kofi P.; Tiwari, Hemant K.; Voeks, Jenifer H.; Brott, Thomas G.; Howard, George

    2013-01-01

    Background Censoring that is dependent on covariates associated with survival can arise in randomized trials due to changes in recruitment and eligibility criteria to minimize withdrawals, potentially leading to biased treatment effect estimates. Imputation approaches have been proposed to address censoring in survival analysis; and while these approaches may provide unbiased estimates of treatment effects, imputation of a large number of outcomes may over- or underestimate the associated variance based on the imputation pool selected. Purpose We propose an improved method, risk-stratified imputation, as an alternative to address withdrawal related to the risk of events in the context of time-to-event analyses. Methods Our algorithm performs imputation from a pool of replacement subjects with similar values of both treatment and covariate(s) of interest, that is, from a risk-stratified sample. This stratification prior to imputation addresses the requirement of time-to-event analysis that censored observations are representative of all other observations in the risk group with similar exposure variables. We compared our risk-stratified imputation to case deletion and bootstrap imputation in a simulated dataset in which the covariate of interest (study withdrawal) was related to treatment. A motivating example from a recent clinical trial is also presented to demonstrate the utility of our method. Results In our simulations, risk-stratified imputation gives estimates of treatment effect comparable to bootstrap and auxiliary variable imputation while avoiding inaccuracies of the latter two in estimating the associated variance. Similar results were obtained in analysis of clinical trial data. Limitations Risk-stratified imputation has little advantage over other imputation methods when covariates of interest are not related to treatment, although its performance is superior when covariates are related to treatment. Risk-stratified imputation is intended for
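The core resampling step of the proposed method can be sketched with simulated trial data. The data, strata definition, and censoring mechanism below are invented, not the paper's simulation design:

```python
# Hedged numpy-only sketch of risk-stratified imputation: each censored
# subject's outcome is imputed from the pool of subjects sharing the same
# treatment arm and risk stratum, rather than from all subjects. The trial
# data and strata here are invented for illustration.
import numpy as np

rng = np.random.default_rng(4)

# Simulated trial: treatment arm (0/1), risk stratum (0 = low, 1 = high),
# event time, and censoring indicator (True = withdrawn before the event).
n = 400
arm = rng.integers(0, 2, n)
stratum = rng.integers(0, 2, n)
time = rng.exponential(10.0 / (1 + arm + 2 * stratum), n)
censored = rng.random(n) < 0.15

def impute_risk_stratified(times, censored, arm, stratum):
    """Replace each censored time with a draw from its (arm, stratum) pool."""
    out = times.copy()
    for i in np.flatnonzero(censored):
        pool = (~censored) & (arm == arm[i]) & (stratum == stratum[i])
        out[i] = rng.choice(times[pool])
    return out

imputed = impute_risk_stratified(time, censored, arm, stratum)
print(censored.sum(), "values imputed")
```

Restricting the replacement pool to the subject's own risk stratum is what preserves the censoring-at-random assumption within strata that the abstract emphasizes.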

  6. Selected Tools for Risk Analysis in Logistics Processes

    NASA Astrophysics Data System (ADS)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a complex approach to risk management allows for the following: - evaluation of significant risk groups associated with logistics processes implementation, - composition of integrated strategies of risk management, - composition of tools for risk analysis in logistics processes.

  7. Contribution of European research to risk analysis.

    PubMed

    Boenke, A

    2001-12-01

    The European Commission's Quality of Life Research Programme, Key Action 1 (Health, Food & Nutrition), is mission-oriented and aims, amongst other things, at providing a healthy, safe and high-quality food supply leading to reinforced consumer confidence in the safety of European food. Its objectives also include enhancing the competitiveness of the European food supply. Key Action 1 is currently supporting a number of different types of European collaborative projects in the area of risk analysis. The objectives of these projects include: the development and validation of prevention strategies, including the reduction of consumer risks; the development and validation of new modelling approaches; the harmonization of risk assessment principles, methodologies and terminology; the standardization of methods and systems used for the safety evaluation of transgenic food; the provision of tools for the evaluation of human viral contamination of shellfish and quality control; new methodologies for assessing the potential unintended effects of genetically modified (GM) foods; the development of a risk assessment model for Cryptosporidium parvum related to the food and water industries; the development of a communication platform for genetically modified organism (GMO) producers, retailers, regulatory authorities and consumer groups to improve safety assessment procedures, risk management strategies and risk communication; the development and validation of new methods for safety testing of transgenic food; the evaluation of the safety and efficacy of iron supplementation in pregnant women; and the evaluation of the potential cancer-preventing activity of pro- and pre-biotic ('synbiotic') combinations in human volunteers. An overview of these projects is presented here.

  8. Addressing Challenging Behaviour in Children with Down Syndrome: The Use of Applied Behaviour Analysis for Assessment and Intervention

    ERIC Educational Resources Information Center

    Feeley, Kathlee M.; Jones, Emily A.

    2006-01-01

    Children with Down syndrome are at an increased risk for engaging in challenging behaviour that may be part of a behavioural phenotype characteristic of Down syndrome. The methodology of applied behaviour analysis has been demonstrated effective with a wide range of challenging behaviours, across various disabilities. Applications to children with…

  9. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  10. To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits

    PubMed Central

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January 2009 and November 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants, and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads. PMID:25738742

  13. Sensitivity analysis for texture models applied to rust steel classification

    NASA Astrophysics Data System (ADS)

    Trujillo, Maite; Sadki, Mustapha

    2004-05-01

    The exposure of metallic structures to rust degradation during their operational life is a known problem affecting storage tanks, steel bridges, ships, etc. In order to prevent this degradation and the potential related catastrophes, the surfaces have to be assessed, and the appropriate surface treatment and coating need to be applied according to the corrosion time of the steel. We previously investigated the potential of image processing techniques to tackle this problem, analyzing and evaluating several mathematical methods on a database of 500 images. In this paper, we extend our previous research and provide a further analysis of textural mathematical methods for automatic detection of steel corrosion time. Statistical descriptors are provided to evaluate the sensitivity of the results, as well as the advantages and limitations of the different methods. Finally, a selector of the classifier algorithms is introduced, and the ratio between sensitivity of the results and time response (execution time) is analyzed to balance good classification results (high sensitivity) against an acceptable response time for the automation of the system.
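
    The sensitivity/time trade-off above can be sketched as a selector that, within a time budget, keeps the most sensitive classifier. The classifier names and profile numbers below are hypothetical, not from the paper:

```python
# Hypothetical texture-classifier profiles (sensitivity in [0, 1],
# execution time in ms); the names and numbers are illustrative.
classifiers = {
    "co-occurrence": {"sensitivity": 0.92, "time_ms": 140.0},
    "gabor":         {"sensitivity": 0.88, "time_ms": 60.0},
    "lbp":           {"sensitivity": 0.85, "time_ms": 12.0},
}

def select(profiles, time_budget_ms):
    """Among classifiers fast enough for the budget, return the name
    of the most sensitive one (None if none fits)."""
    feasible = {name: p for name, p in profiles.items()
                if p["time_ms"] <= time_budget_ms}
    if not feasible:
        return None
    return max(feasible, key=lambda name: feasible[name]["sensitivity"])

print(select(classifiers, 100.0))  # -> gabor
```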

  14. Applying simulation modeling to problems in toxicology and risk assessment--a short perspective.

    PubMed

    Andersen, M E; Clewell, H J; Frederick, C B

    1995-08-01

    The goals of this perspective have been to examine areas where quantitative simulation models may be useful in toxicology and related risk assessment fields and to offer suggestions for preparing manuscripts that describe these models. If developments in other disciplines serve as a bellwether, the use of mathematical models in toxicology will continue to increase, partly, at least, because new generations of scientists are being trained in an electronic environment where computation of all kinds is learned at an early age. Undoubtedly, however, the utility of these models will be directly tied to the skills of investigators in accurately describing models in their research papers. These publications should convey descriptions of both the insights obtained and the opportunities provided by these models to integrate existing databases and suggest new and useful experiments. We hope these comments serve to facilitate the expansion of good modeling practices as applied to toxicological problems.
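
    As an illustration of the kind of quantitative simulation model discussed, a minimal one-compartment toxicokinetic sketch (the parameters are illustrative, not tied to any specific chemical):

```python
import math

def concentration(dose_mg, vd_l, ke_per_h, t_h):
    """One-compartment model after an IV bolus:
    C(t) = (dose / Vd) * exp(-ke * t), in mg/L."""
    return dose_mg / vd_l * math.exp(-ke_per_h * t_h)

# Illustrative parameters: 100 mg dose, 40 L volume, ke = 0.1 / h.
c0 = concentration(100.0, 40.0, 0.1, 0.0)    # peak concentration
c24 = concentration(100.0, 40.0, 0.1, 24.0)  # after one day
print(round(c0, 2), round(c24, 3))  # -> 2.5 0.227
```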

  15. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  16. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  17. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  18. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  19. Risk assessment and its application to flight safety analysis

    SciTech Connect

    Keese, D.L.; Barton, W.R.

    1989-12-01

    Potentially hazardous test activities have historically been a part of Sandia National Labs' mission to design, develop, and test new weapons systems. These test activities include high-speed air drops for parachute development, sled tests for component- and system-level studies, multiple-stage rocket experiments, and artillery firings of various projectiles. Due to the nature of Sandia's test programs, the risk associated with these activities can never be totally eliminated. However, a consistent set of policies should be available to provide guidance on the level of risk that is acceptable in these areas. This report presents a general set of guidelines for addressing safety issues related to rocket flight operations at Sandia National Laboratories. Even though the majority of this report deals primarily with rocket flight safety, the same principles could be applied to other hazardous test activities. The basic concepts of risk analysis have a wide range of applications in many of Sandia's current operations. 14 refs., 1 tab.

  20. Health cancer risk assessment for arsenic exposure in potentially contaminated areas by fertilizer plants: a possible regulatory approach applied to a case study in Moscow region-Russia.

    PubMed

    Zakharova, Tatiana; Tatàno, Fabio; Menshikov, Valery

    2002-08-01

    At present, fertilizer industry plants are considered a potential source of soil contamination in Russia. Health risk assessment should therefore be pursued in Russian fertilizer plant areas, but unfortunately risk assessment methodology for contaminated sites does not yet have regulatory status in Russia. In this paper a possible and intentionally simple regulatory approach for health cancer risk assessment at phosphogypsum waste-storing, potentially contaminated sites is presented. The proposed approach is applied to a potentially contaminated area located in the protective zone of the Moscow river (Moscow Region). At this case-study area, arsenic was chosen as a contaminant indicator according to the proposed selection procedure. The original McKone & Daniels '91 model was adapted to estimate human exposure to arsenic through various pathways. As a specific result of the risk assessment for the case-study area, it has been shown that the arsenic exposure pathways (in risk-ranking order) "ingestion of agricultural products," "groundwater uptake," "dermal contact," and "soil ingestion" pose a significant health risk. From a general point of view, the proposed and applied health risk assessment approach could contribute (for comparison and discussion) to policies on contaminated soils in other countries. In this perspective, the paper expressly considers the current Italian regulatory situation concerning the restricted use of risk analysis and soil quality for agricultural land use.

  1. From aviation to medicine: applying concepts of aviation safety to risk management in ambulatory care

    PubMed Central

    Wilf-Miron, R; Lewenhoff, I; Benyamini, Z; Aviram, A

    2003-01-01

    

 The development of a medical risk management programme based on the aviation safety approach and its implementation in a large ambulatory healthcare organisation is described. The following key safety principles were applied: (1) errors inevitably occur and usually derive from faulty system design, not from negligence; (2) accident prevention should be an ongoing process based on open and full reporting; (3) major accidents are only the "tip of the iceberg" of processes that indicate possibilities for organisational learning. Reporting physicians were granted immunity, which encouraged open reporting of errors. A telephone "hotline" served the medical staff for direct reporting and receipt of emotional support and medical guidance. Any adverse event which had learning potential was debriefed, while focusing on the human cause of error within a systemic context. Specific recommendations were formulated to rectify processes conducive to error when failures were identified. During the first 5 years of implementation, the aviation safety concept and tools were successfully adapted to ambulatory care, fostering a culture of greater concern for patient safety through risk management while providing support to the medical staff. PMID:12571343

  2. 76 FR 30705 - Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-26

    ... the public and an independent, external panel of scientific experts (73 FR 54400). Dated: May 18, 2011... AGENCY Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids... the availability of a final report titled, ``Problem Formulation for Human Health Risk Assessments...

  3. Risk analysis for renewable energy projects due to constraints arising

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.

    2016-02-01

    Starting from the European Union (EU) target of a binding 20% share of renewable energy in final energy consumption by 2020, this article illustrates the identification of risks in implementing wind energy projects in Romania, which can have complex technical, social, and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain and the reasonable time periods that may arise were identified. Renewable energy technologies face a number of constraints that delay scaling up their production process, their transport process, the equipment reliability, etc., so implementing these types of projects requires a complex specialized team, whose coordination also involves specific risks. The research team applied an analytical risk approach to identify the major risks encountered within a wind farm project developed in isolated regions of Romania with different particularities, configured for different geographical areas (hill and mountain locations in Romania). Identification of the major risks was based on a conceptual model set up for the entire project implementation process, within which specific constraints of the process were identified. Integration risks were examined in an empirical study based on the HAZOP (Hazard and Operability) method. The discussion analyzes our results in the context of implementing renewable energy projects in Romania and creates a framework for assessing the energy supply of any entity from renewable sources.

  4. RISK ASSESSMENT AND EPIDEMIOLOGICAL INFORMATION FOR PATHOGENIC MICROORGANISMS APPLIED TO SOIL

    EPA Science Inventory

    There is increasing interest in the development of a microbial risk assessment methodology for regulatory and operational decision making. Initial interests in microbial risk assessments focused on drinking, recreational, and reclaimed water issues. More recently risk assessmen...

  5. Integrated Reliability and Risk Analysis System (IRRAS)

    SciTech Connect

    Russell, K D; McKay, M K; Sattison, M.B. Skinner, N.L.; Wood, S T; Rasmuson, D M

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance.

  6. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    NASA Astrophysics Data System (ADS)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of food supply safety worldwide has led the Food and Agriculture Organization (FAO) and World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, defined by quantitative risk assessments, and used to establish; ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total `failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence, the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and Microbial Responses Viewer), or introduced into user-friendly software
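
    The relation between initial contamination, reductions, increases and the FSO is commonly written (in the ICMSF formulation) as H0 - sum(R) + sum(I) <= FSO, with all terms in log10 cfu/g. A minimal sketch with assumed illustrative numbers:

```python
def meets_fso(h0, reductions, increases, fso):
    """ICMSF-style check, all terms in log10 cfu/g:
    final level = H0 - sum(R) + sum(I), acceptable if <= FSO."""
    final = h0 - sum(reductions) + sum(increases)
    return final, final <= fso

# Assumed numbers: initial 3 log, a 6-log kill step, 1.5 log total
# regrowth and recontamination, against an FSO of 2 log cfu/g.
final, ok = meets_fso(h0=3.0, reductions=[6.0], increases=[1.0, 0.5], fso=2.0)
print(final, ok)  # -> -1.5 True
```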

  7. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    SciTech Connect

    Skandamis, Panagiotis N. Andritsos, Nikolaos Psomas, Antonios Paramythiotis, Spyridon

    2015-01-22

    The current status of food supply safety worldwide has led the Food and Agriculture Organization (FAO) and World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, defined by quantitative risk assessments, and used to establish; ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total ‘failure’ that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence, the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and the development of stochastic and kinetic models, which are also available in the form of Web-based applications, e.g., COMBASE and Microbial Responses Viewer), or introduced into user

  8. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  9. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  10. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  11. Factor Analysis Applied the VFY-218 RCS Data

    NASA Technical Reports Server (NTRS)

    Woo, Alex; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    Presents a statistical factor analysis of computer simulations and measurement data for the VFY-218 configuration. Factor analysis attempts to quantify the statistical grouping of measurements and simulations.

  12. Fire Risk Analysis for Armenian NPP Confinement

    SciTech Connect

    Poghosyan, Shahen; Malkhasyan, Albert; Bznuni, Surik; Amirjanyan, Armen

    2006-07-01

    A major fire at the Armenian NPP (ANPP) in October 1982 showed that fire-induced initiating events (IE) can make a dominant contribution to the overall risk of core damage. A Probabilistic Safety Assessment study of fire-induced initiating events for ANPP was initiated in 2002. The analysis covered compartment fires that could result in failure of components necessary for reactor cold shutdown. The analysis shows that the main fire risk at ANPP arises from fires in cable tunnels 61-64, whereas fires in confinement compartments do not contribute significantly to the overall risk of core damage. The exception is a fire in the so-called 'confinement valves compartment' (room no. A-013/2) (more than 7.5% of CDF), where fire could result in a loss-of-coolant accident with unavailability of the primary makeup system, which leads directly to core damage. A detailed analysis of this problem, which is common to typical WWER-440/230 reactors with non-hermetic MCPs, and recommendations for its solution are presented in this paper. (authors)

  13. Debris Flow Risk Management Framework and Risk Analysis in Taiwan, A Preliminary Study

    NASA Astrophysics Data System (ADS)

    Tsao, Ting-Chi; Hsu, Wen-Ko; Chiou, Lin-Bin; Cheng, Chin-Tung; Lo, Wen-Chun; Chen, Chen-Yu; Lai, Cheng-Nong; Ju, Jiun-Ping

    2010-05-01

    Taiwan is located on a seismically active mountain belt between the Philippine Sea plate and the Eurasian plate. After the 1999 Chi-Chi earthquake (Mw=7.6), landslides and debris flows occurred frequently. In August 2009, Typhoon Morakot struck Taiwan, and numerous landslide and debris flow events, some with tremendous fatalities, were observed. With limited resources, authorities should establish a disaster management system to cope with slope disaster risks more effectively. Since 2006, Taiwan's authority in charge of debris flow management, the Soil and Water Conservation Bureau (SWCB), has completed the basic investigation and data collection of 1,503 potential debris flow creeks around Taiwan. During 2008 and 2009, a debris flow quantitative risk analysis (QRA) framework, based on the landslide risk management framework of Australia, was proposed and applied to 106 creeks in the 30 villages with a history of debris flow hazards. Information on and the value of several types of elements at risk (bridges, roads, buildings and crops) were gathered and integrated into a GIS layer, with a vulnerability model applied to each element at risk. Through studying the historical hazard events of the 30 villages, numerical simulations of debris flow hazards with different magnitudes (5-, 10-, 25-, 50-, 100- and 200-year return periods) were conducted, and the economic losses and fatalities of each scenario were calculated for each creek. When taking annual exceedance probability into account, the annual total risk of each creek was calculated, and the results were displayed on a debris flow risk map. The number of fatalities and the frequency were calculated, and the F-N curves of the 106 creeks were provided. For the F-N curves, an individual risk to life per year of 1.0E-04 and a slope of 1, which match international standards, were considered to be an acceptable risk. Applying the results of the 106 creeks onto the F-N curve, they were divided into 3 categories: Unacceptable, ALARP (As Low As Reasonably Practicable) and
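
    The F-N acceptability line described above (slope 1 through an individual risk of 1.0E-04) can be checked pointwise; a minimal sketch:

```python
def fn_acceptable(n_fatalities, annual_freq, anchor=1e-4, slope=1.0):
    """F-N criterion: the annual frequency of events with at least
    n_fatalities must lie on or below anchor / n**slope (here a line
    of slope 1 through 1.0E-04, matching the standard in the study)."""
    return annual_freq <= anchor / (n_fatalities ** slope)

print(fn_acceptable(10, 1e-6))   # -> True  (limit is 1e-5)
print(fn_acceptable(100, 5e-6))  # -> False (limit is 1e-6)
```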

  14. Global Human Settlement Analysis for Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Pesaresi, M.; Ehrlich, D.; Ferri, S.; Florczyk, A.; Freire, S.; Haag, F.; Halkia, M.; Julea, A. M.; Kemper, T.; Soille, P.

    2015-04-01

    The Global Human Settlement Layer (GHSL) is supported by the European Commission, Joint Research Center (JRC) in the frame of its institutional research activities. The scope of GHSL is developing, testing and applying the technologies and analysis methods integrated in the JRC Global Human Settlement analysis platform for applications in support of global disaster risk reduction (DRR) initiatives and regional analysis in the frame of the European Cohesion policy. The GHSL analysis platform uses geo-spatial data, primarily remotely sensed imagery and population data. GHSL also cooperates with the Group on Earth Observation on SB-04-Global Urban Observation and Information, and with various international partners, World Bank and United Nations agencies. Some preliminary results integrating global human settlement information extracted from Landsat data records of the last 40 years and population data are presented.

  15. Phase plane analysis: applying chaos theory in health care.

    PubMed

    Priesmeyer, H R; Sharp, L F

    1995-01-01

    This article applies the new science of nonlinearity to administrative issues and accounts receivable management in health care, and it provides a new perspective on common operating and quality control measures. PMID:10151628

  16. Meta-analysis of osteoporosis: fracture risks, medication and treatment.

    PubMed

    Liu, W; Yang, L-H; Kong, X-C; An, L-K; Wang, R

    2015-08-01

    Osteoporosis is a brittle-bone disease that causes fractures mostly in older men and women. Meta-analysis is the statistical method applied as a framework for assessing results obtained from research studies conducted over several years. Many researchers have described meta-analyses of osteoporotic fracture risk with medication non-adherence, assessing bone fracture risk among patients non-adherent versus adherent to osteoporosis therapy. Osteoporosis therapy reduces the risk of fracture in clinical trials, but real-world adherence to therapy is suboptimal and can reduce the effectiveness of the intervention. Medline, Embase, and CINAHL were searched for observational studies from 1998 to 2009, and up to 2015. The results of meta-analyses of osteoporosis research on fractures in postmenopausal women and men are presented. The use of bisphosphonate therapy for osteoporosis is described alongside other drugs. The authors, design, studies (% women), years (data), follow-up (weeks), fracture types, and compliance or persistence results from 2004 to 2009 are shown in a brief table. Meta-analysis studies from other researchers on osteoporosis and fractures, medications, and treatments are reviewed.
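
    A common building block of such meta-analyses is fixed-effect inverse-variance pooling of log risk ratios; a minimal sketch with assumed study results (not taken from the reviewed studies):

```python
import math

def pooled_log_rr(log_rrs, ses):
    """Fixed-effect inverse-variance pooling of per-study log risk
    ratios; returns the pooled estimate and its standard error."""
    weights = [1.0 / se ** 2 for se in ses]
    est = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
    return est, math.sqrt(1.0 / sum(weights))

# Assumed study results: risk ratios with SEs of the log risk ratio.
log_rrs = [math.log(1.4), math.log(1.2), math.log(1.6)]
ses = [0.10, 0.15, 0.20]
est, se = pooled_log_rr(log_rrs, ses)
print(round(math.exp(est), 2))  # pooled RR -> 1.37
```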

  17. Applying the theory of reasoned action to AIDS risk behavior: condom use among black women.

    PubMed

    Jemmott, L S; Jemmott, J B

    1991-01-01

    This study tested hypotheses regarding attitudinal and normative influences on intentions to use condoms, a practice that would reduce women's risk of sexually transmitted HIV infection. Participants were 103 sexually active unmarried black women undergraduates at an inner-city commuter university, in an area with a high rate of reported AIDS cases among women. Consistent with the theory of reasoned action, multiple regression analysis on women's anonymous responses to a mailed survey revealed that those who registered more favorable attitudes toward condoms and those who perceived subjective norms more supportive of condom use reported firmer intentions to use condoms in the next three months. Key behavioral beliefs related to attitudes centered on the adverse effects of condom use on sexual enjoyment. Key normative influences were respondents' sexual partners and mothers. However, women's own attitudes were a stronger determinant of intentions to use condoms than were their perceptions of normative influences, particularly among women with above-average AIDS knowledge. The results suggest that the theory of reasoned action provides a potentially useful conceptual framework for interventions to change a key AIDS risk behavior among women.
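
    The attitude-plus-norm regression described above can be sketched with synthetic data; the sample size mirrors the study (n = 103), but the coefficients and data are illustrative, not the study's estimates:

```python
import numpy as np

# Synthetic data: intentions driven more strongly by attitudes (0.6)
# than by subjective norms (0.3), plus noise.
rng = np.random.default_rng(0)
n = 103
attitude = rng.normal(size=n)
norm = rng.normal(size=n)
intention = 0.6 * attitude + 0.3 * norm + rng.normal(scale=0.5, size=n)

# Theory-of-reasoned-action regression: intention ~ attitude + norm.
X = np.column_stack([np.ones(n), attitude, norm])
beta, *_ = np.linalg.lstsq(X, intention, rcond=None)
print(beta[1] > beta[2])  # the attitude weight exceeds the norm weight
```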

  18. Multi-Criteria Analysis for Biomass Utilization Applying Data Envelopment Analysis

    NASA Astrophysics Data System (ADS)

    Morimoto, Hidetsugu; Hoshino, Satoshi; Kuki, Yasuaki

    This paper considers material recycling, global warming prevention, and economic efficiency in 195 existing and planned Biomass Towns by applying DEA (Data Envelopment Analysis), which can evaluate the operational efficiency of entities such as private companies or projects. The results clarified that although Biomass Towns can recycle material efficiently, global warming prevention and business profitability were largely neglected in Biomass Town designs. Moreover, from the point of view of operational efficiency, we suggested improvements to the scale of Biomass Towns for greater efficiency by applying DEA. We found that applying DEA captures more improvements and indicators than cost-benefit analysis and cost-effectiveness analysis.
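
    In the degenerate single-input, single-output case, the DEA CCR efficiency score reduces to each unit's output/input ratio relative to the best ratio (the general multi-input case requires solving a linear program per unit). A minimal sketch with hypothetical towns:

```python
def ccr_efficiency(inputs, outputs):
    """CCR (constant returns to scale) efficiency in the degenerate
    single-input, single-output case: each DMU's output/input ratio
    divided by the best ratio across all DMUs."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical towns: input = annual cost, output = recycled biomass.
effs = ccr_efficiency(inputs=[10.0, 20.0, 15.0], outputs=[5.0, 8.0, 9.0])
print([round(e, 2) for e in effs])  # -> [0.83, 0.67, 1.0]
```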

  19. Metabolic and Dynamic Profiling for Risk Assessment of Fluopyram, a Typical Phenylamide Fungicide Widely Applied in Vegetable Ecosystem

    PubMed Central

    Wei, Peng; Liu, Yanan; Li, Wenzhuo; Qian, Yuan; Nie, Yanxia; Kim, Dongyeop; Wang, Mengcen

    2016-01-01

    Fluopyram, a typical phenylamide fungicide, is widely applied to protect fruit vegetables from yield loss caused by fungal pathogens. Despite being highly linked to ecological and dietary risks, its residual and metabolic profiles in the fruit vegetable ecosystem remain obscure. Here, an approach using modified QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) extraction combined with GC-MS/MS analysis was developed to investigate the fate of fluopyram in typical fruit vegetables, including tomato, cucumber and pepper, under the greenhouse environment. Fluopyram dissipated in accordance with the first-order rate dynamics equation, with a maximum half-life of 5.7 d. Cleavage of fluopyram into 2-trifluoromethyl benzamide and subsequent formation of 3-chloro-5-(trifluoromethyl) pyridine-2-acetic acid and 3-chloro-5-(trifluoromethyl) picolinic acid was elucidated to be its ubiquitous metabolic pathway. Moreover, the incurrence of fluopyram at pre-harvest intervals (PHI) of 7–21 d was between 0.0108 and 0.1603 mg/kg, and the Hazard Quotients (HQs) were calculated to be less than 1, indicating temporary safety of consumption of fruit vegetables incurred with fluopyram, irrespective of the uncertain toxicity of the metabolites. Taken together, our findings reveal the residual profile of fluopyram in a typical agricultural ecosystem and advance insight into the ecological risk posed by this fungicide and its metabolites. PMID:27654708
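
    The first-order dissipation and HQ screening described above can be sketched as follows; the paper's maximum half-life (5.7 d) is used, while the initial residue, intake, body weight and ADI values are illustrative assumptions:

```python
import math

def residue(c0_mg_kg, half_life_d, t_d):
    """First-order dissipation: C(t) = C0 * exp(-(ln 2 / t_half) * t)."""
    return c0_mg_kg * math.exp(-math.log(2.0) / half_life_d * t_d)

def hazard_quotient(residue_mg_kg, intake_kg_day, bw_kg, adi_mg_kg_day):
    """HQ = estimated daily intake / acceptable daily intake;
    HQ < 1 is read as acceptable risk."""
    return residue_mg_kg * intake_kg_day / bw_kg / adi_mg_kg_day

# Paper's slowest dissipation (t1/2 = 5.7 d); other values assumed.
c7 = residue(c0_mg_kg=0.5, half_life_d=5.7, t_d=7.0)
print(round(c7, 3))                                # -> 0.213
print(hazard_quotient(c7, 0.3, 60.0, 0.01) < 1.0)  # -> True
```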

  20. Metabolic and Dynamic Profiling for Risk Assessment of Fluopyram, a Typical Phenylamide Fungicide Widely Applied in Vegetable Ecosystem

    NASA Astrophysics Data System (ADS)

    Wei, Peng; Liu, Yanan; Li, Wenzhuo; Qian, Yuan; Nie, Yanxia; Kim, Dongyeop; Wang, Mengcen

    2016-09-01

    Fluopyram, a typical phenylamide fungicide, is widely applied to protect fruit vegetables from yield loss caused by fungal pathogens. Despite being highly linked to ecological and dietary risks, its residual and metabolic profiles in the fruit vegetable ecosystem remain obscure. Here, an approach using modified QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) extraction combined with GC-MS/MS analysis was developed to investigate the fate of fluopyram in typical fruit vegetables, including tomato, cucumber and pepper, under the greenhouse environment. Fluopyram dissipated in accordance with the first-order rate dynamics equation, with a maximum half-life of 5.7 d. Cleavage of fluopyram into 2-trifluoromethyl benzamide and subsequent formation of 3-chloro-5-(trifluoromethyl) pyridine-2-acetic acid and 3-chloro-5-(trifluoromethyl) picolinic acid was elucidated to be its ubiquitous metabolic pathway. Moreover, the incurrence of fluopyram at pre-harvest intervals (PHI) of 7–21 d was between 0.0108 and 0.1603 mg/kg, and the Hazard Quotients (HQs) were calculated to be less than 1, indicating temporary safety of consumption of fruit vegetables incurred with fluopyram, irrespective of the uncertain toxicity of the metabolites. Taken together, our findings reveal the residual profile of fluopyram in a typical agricultural ecosystem and advance insight into the ecological risk posed by this fungicide and its metabolites.

  1. Approach to uncertainty in risk analysis

    SciTech Connect

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in ''real world'' modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analyses techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.
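
    A standard technique for the parameter-uncertainty part of such an assessment is Monte Carlo propagation; a minimal sketch with an assumed toy dose model and illustrative distributions:

```python
import random

def mc_dose(n=100_000, seed=1):
    """Monte Carlo propagation through a toy dose model
    dose = concentration * intake / body weight, with each
    parameter drawn from an assumed distribution."""
    rng = random.Random(seed)
    doses = sorted(
        rng.lognormvariate(0.0, 0.5)      # concentration, mg/L
        * rng.uniform(1.0, 3.0)           # intake, L/day
        / rng.normalvariate(70.0, 10.0)   # body weight, kg
        for _ in range(n)
    )
    return doses[n // 2], doses[int(0.95 * n)]

median, p95 = mc_dose()
# Reporting a percentile range, not a single number, communicates
# the uncertainty to decision-makers.
print(median < p95)  # -> True
```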

  2. Applying Regression Analysis to Problems in Institutional Research.

    ERIC Educational Resources Information Center

    Bohannon, Tom R.

    1988-01-01

    Regression analysis is one of the most frequently used statistical techniques in institutional research. Principles of least squares, model building, residual analysis, influence statistics, and multicollinearity are described and illustrated. (Author/MSE)
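
    The techniques listed can be illustrated with ordinary least squares plus a variance-inflation-factor (VIF) check for multicollinearity; the data and variable interpretations below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic institutional-research-style data: two correlated predictors.
n = 200
x1 = rng.normal(size=n)                    # e.g. standardised high-school GPA
x2 = 0.9 * x1 + 0.1 * rng.normal(size=n)   # e.g. test score, nearly collinear
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # ordinary least squares

def vif(X, j):
    """Variance inflation factor: regress column j on the other columns."""
    others = np.delete(X, j, axis=1)
    coef, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
    resid = X[:, j] - others @ coef
    r2 = 1 - resid.var() / X[:, j].var()
    return 1 / (1 - r2)

print("coefficients:", beta.round(2))
print("VIF for x1:", round(vif(X, 1), 1))   # a large VIF flags collinearity
```

    A VIF well above 10 warns that the individual coefficients are unstable even when the overall fit looks good.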

  3. Nutrient Status and Contamination Risks from Digested Pig Slurry Applied on a Vegetable Crops Field.

    PubMed

    Zhang, Shaohui; Hua, Yumei; Deng, Liangwei

    2016-04-01

    The effects of applied digested pig slurry on a vegetable crops field were studied. The study included a 3-year investigation on nutrient characteristics, heavy metals contamination and hygienic risks of a vegetable crops field in Wuhan, China. The results showed that, after anaerobic digestion, abundant N, P and K remained in the digested pig slurry while fecal coliforms, ascaris eggs, schistosoma eggs and hookworm eggs were highly reduced. High Cr, Zn and Cu contents in the digested pig slurry were found in spring. Digested pig slurry application to the vegetable crops field led to improved soil fertility. Plant-available P in the fertilized soils increased due to considerable increase in total P content and decrease in low-availability P fraction. The As content in the fertilized soils increased slightly but significantly (p = 0.003) compared with control. The Hg, Zn, Cr, Cd, Pb, and Cu contents in the fertilized soils did not exceed the maximum permissible contents for vegetable crops soils in China. However, high Zn accumulation should be of concern due to repeated applications of digested pig slurry. No fecal coliforms, ascaris eggs, schistosoma eggs or hookworm eggs were detected in the fertilized soils. PMID:27058548

  6. Comprehensive safeguards evaluation methods and societal risk analysis

    SciTech Connect

    Richardson, J.M.

    1982-03-01

    Essential capabilities of an integrated evaluation methodology for analyzing safeguards systems are discussed. Such a methodology must be conceptually meaningful, technically defensible, discriminating, and consistent. A decomposition of safeguards systems by function is mentioned as a possible starting point for methodology development. The application of a societal risk equation to safeguards systems analysis is addressed. Conceptual problems with this approach are discussed. Technical difficulties in applying this equation to safeguards systems are illustrated through the use of confidence intervals, information content, hypothesis testing, and ranking and selection procedures.

  7. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it by estimating the nuclear risk to the population of the Nordic countries in the event of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for evaluating the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: what is the probability of radionuclide atmospheric transport to, and impact on, different neighbouring regions and countries in the event of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators identifying the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in ecosystems, focusing on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups, taking into account socio-geophysical factors and probabilities, and using GIS-based demographic databases.
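
    Items (i) and (iv) above, trajectory modelling and probability-fields analysis, can be caricatured with a random-walk sketch in which a grid cell's impact probability is the fraction of simulated trajectories that ever enter it. The drift term, grid size and step counts below are arbitrary assumptions, not the paper's models:

```python
import random
from collections import Counter

random.seed(42)

def trajectory(steps=50):
    """Toy 2-D random-walk trajectory from a risk site at the origin;
    a constant drift term stands in for the prevailing wind."""
    x = y = 0.0
    path = []
    for _ in range(steps):
        x += 1.0 + random.gauss(0, 1)   # eastward drift plus turbulence
        y += random.gauss(0, 1)
        path.append((x, y))
    return path

# Probability field: fraction of trajectories that ever enter each 10x10 cell.
hits = Counter()
n_traj = 2000
for _ in range(n_traj):
    cells = {(int(x // 10), int(y // 10)) for x, y in trajectory()}
    for c in cells:
        hits[c] += 1

impact_prob = {cell: count / n_traj for cell, count in hits.items()}
# Cells just downwind of the site are impacted in a large share of runs.
print(impact_prob.get((0, 0), 0.0))
```

    Real studies replace the random walk with meteorological trajectory models, but the counting step that turns many trajectories into an impact-probability field is the same idea.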

  8. Application of classical process risk analysis tools to business risk management

    SciTech Connect

    Einolf, D.M.; Menghini, L.K.

    1999-07-01

    Process engineers and safety professionals have used qualitative and quantitative risk assessment techniques for many years to analyze process hazards. Environmental managers have recently been exposed to many of these techniques through submission of Accidental Release Risk Management Plans (40 CFR 68). This presentation discusses the broader application of such tools (Hazard and Operability Analysis, Fault-Tree, Failure Modes and Effects) to issues involving emergency and disaster preparedness and business contingency planning. In particular, the authors have successfully expanded the HazOp methodology to apply to a broad range of natural and man-made disasters that may affect continued business operations. Specific examples, including the development of comprehensive contingency planning for a water supply system, electronics manufacturer, and a distribution center will be discussed. These classical safety and reliability tools have broad application in the environmental planning arena, including emergency response and the operability of air and water pollution control equipment.

  9. Duration Analysis Applied to the Adoption of Knowledge.

    ERIC Educational Resources Information Center

    Vega-Cervera, Juan A.; Gordillo, Isabel Cuadrado

    2001-01-01

    Analyzes knowledge acquisition in a sample of 264 pupils in 9 Spanish elementary schools, using time as a dependent variable. Introduces psycho-pedagogical, pedagogical, and social variables into a hazard model applied to the reading process. Auditory discrimination (not intelligence or visual perception) most significantly influences learning to…

  10. Some Applied Research Concerns Using Multiple Linear Regression Analysis.

    ERIC Educational Resources Information Center

    Newman, Isadore; Fraas, John W.

    The intention of this paper is to provide an overall reference on how a researcher can apply multiple linear regression in order to utilize the advantages that it has to offer. The advantages and some concerns expressed about the technique are examined. A number of practical ways by which researchers can deal with such concerns as…

  11. How Has Applied Behavior Analysis and Behavior Therapy Changed?: An Historical Analysis of Journals

    ERIC Educational Resources Information Center

    O'Donohue, William; Fryling, Mitch

    2007-01-01

    Applied behavior analysis and behavior therapy are now nearly a half century old. It is interesting to ask if and how these disciplines have changed over time, particularly regarding some of their key internal controversies (e.g., role of cognitions). We examined the first five years and the 2000-2004 five year period of the "Journal of Applied…

  12. Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation

    ERIC Educational Resources Information Center

    Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom

    2014-01-01

    Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…

  13. Risk Analysis Related to Quality Management Principles

    NASA Astrophysics Data System (ADS)

    Vykydal, David; Halfarová, Petra; Nenadál, Jaroslav; Plura, Jiří; Hekelová, Edita

    2012-12-01

    Efficient and effective implementation of quality management principles demands a responsible approach from top managers. A study of the current state of affairs in Czech organizations uncovered numerous shortcomings in this field, which translate into varied managerial risks. The article identifies and analyses some of them and gives short guidance for their appropriate treatment. The text reflects the authors' experience as well as knowledge obtained from systematic analysis of industrial companies' environments.

  14. Applying the Land Use Portfolio Model with Hazus to analyse risk from natural hazard events

    USGS Publications Warehouse

    Dinitz, Laura B.; Taketa, Richard A.

    2013-01-01

    This paper describes and demonstrates the integration of two geospatial decision-support systems for natural-hazard risk assessment and management. Hazus is a risk-assessment tool developed by the Federal Emergency Management Agency to identify risks and estimate the severity of risk from natural hazards. The Land Use Portfolio Model (LUPM) is a risk-management tool developed by the U.S. Geological Survey to evaluate plans or actions intended to reduce risk from natural hazards. We analysed three mitigation policies for one earthquake scenario in the San Francisco Bay area to demonstrate the added value of using Hazus and the LUPM together. The demonstration showed that Hazus loss estimates can be input to the LUPM to obtain estimates of losses avoided through mitigation, rates of return on mitigation investment, and measures of uncertainty. Together, they offer a more comprehensive approach to help with decisions for reducing risk from natural hazards.
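
    The LUPM-style outputs mentioned, losses avoided and return on mitigation investment, reduce to simple arithmetic once the with- and without-mitigation loss estimates are in hand. A sketch with hypothetical figures, not the actual numbers from the San Francisco Bay scenario:

```python
def mitigation_metrics(loss_no_mitigation, loss_with_mitigation, mitigation_cost):
    """Losses avoided and a simple return on mitigation investment.

    The loss inputs would come from Hazus-style estimates; the figures used
    below are illustrative only.
    """
    losses_avoided = loss_no_mitigation - loss_with_mitigation
    roi = (losses_avoided - mitigation_cost) / mitigation_cost
    return losses_avoided, roi

# Hypothetical loss estimates for one earthquake scenario (million $).
avoided, roi = mitigation_metrics(loss_no_mitigation=120.0,
                                  loss_with_mitigation=45.0,
                                  mitigation_cost=30.0)
print(avoided, round(roi, 2))
```

    In the full LUPM these quantities are computed per land-use portfolio and carry uncertainty measures; the sketch shows only the core accounting.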

  15. System Analysis and Risk Assessment System.

    2000-11-20

    Version 00 SARA4.16 is a program that allows the user to review the results of a Probabilistic Risk Assessment (PRA) and to perform limited sensitivity analysis on these results. This tool is intended to be used by a less technical oriented user and does not require the level of understanding of PRA concepts required by a full PRA analysis tool. With this program a user can review the information generated by a PRA analyst andmore » compare the results to those generated by making limited modifications to the data in the PRA. Also included in this program is the ability to graphically display the information stored in the database. This information includes event trees, fault trees, P&IDs and uncertainty distributions. SARA 4.16 is incorporated in the SAPHIRE 5.0 code package.« less

  16. Factorial kriging analysis applied to geological data from petroleum exploration

    SciTech Connect

    Jaquet, O.

    1989-10-01

    A regionalized variable, thickness of the reservoir layer, from a gas field is decomposed by factorial kriging analysis. Maps of the obtained components may be associated with depositional environments that are favorable for petroleum exploration.

  17. Spectrophotometric multicomponent analysis applied to trace metal determinations

    SciTech Connect

    Otto, M.; Wegscheider, W.

    1985-01-01

    Quantitative spectrometric analysis of mixture components is featured for systems with low spectral selectivity, namely in the ultraviolet, visible, and infrared spectral ranges. Limitations imposed by data reduction schemes based on ordinary multiple regression are shown to be overcome by means of partial least-squares analysis in latent variables. The influences of variables such as noise, band separation, band intensity ratios, number of wavelengths, number of components, number of calibration mixtures, time drift, and deviations from Beer's law on the analytical result have been evaluated under a wide range of conditions, providing a basis to search for new systems applicable to spectrophotometric multicomponent analysis. The practical utility of the method is demonstrated by the simultaneous analysis of copper, nickel, cobalt, iron, and palladium down to 2 × 10⁻⁶ M concentrations by use of their diethyldithiocarbamate chelate complexes, with relative errors less than 6%. 26 references, 4 figures, 6 tables.
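
    A partial least-squares calibration of the sort described can be sketched with the NIPALS algorithm for a univariate response. The simulated overlapping-band spectra below are illustrative stand-ins for low-selectivity systems, not the paper's chelate data:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """PLS1 via the NIPALS algorithm (univariate response).

    Returns the regression vector b such that y_hat = Xc @ b + y_mean,
    where Xc is the column-centred predictor matrix.
    """
    X = X - X.mean(axis=0)
    y = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)        # weight vector
        t = X @ w                     # scores
        tt = t @ t
        p = X.T @ t / tt              # loadings
        qk = (y @ t) / tt
        X = X - np.outer(t, p)        # deflate
        y = y - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)

# Simulated "absorbance" spectra of a two-analyte mixture with strongly
# overlapping bands (a stand-in for the low-selectivity systems discussed).
rng = np.random.default_rng(3)
wavelengths = np.linspace(0, 1, 40)
band1 = np.exp(-((wavelengths - 0.4) / 0.1) ** 2)
band2 = np.exp(-((wavelengths - 0.5) / 0.1) ** 2)
conc = rng.uniform(0, 1, size=(60, 2))              # calibration concentrations
spectra = conc @ np.vstack([band1, band2]) + rng.normal(0, 0.01, (60, 40))

b = pls1_fit(spectra, conc[:, 0], n_components=2)
pred = (spectra - spectra.mean(axis=0)) @ b + conc[:, 0].mean()
print("max calibration error:", np.abs(pred - conc[:, 0]).max().round(3))
```

    Because PLS builds latent variables from the covariance of spectra and concentration, it tolerates the band overlap that defeats ordinary wavelength-by-wavelength regression.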

  18. INDICATORS OF RISK: AN ANALYSIS APPROACH FOR IMPROVED RIVER MANAGEMENT

    EPA Science Inventory

    A risk index is an approach to measuring the level of risk to the plants and/or animals (biota) in a certain area using water and habitat quality information. A new technique for developing risk indices was applied to data collected from Mid-Atlantic streams of the U.S. during 1...

  19. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    PubMed

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009.
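
    The nonparametric Kaplan-Meier estimator referenced above, applied over mission distance rather than time, can be sketched as follows. The mission records are hypothetical, not Autosub3's actual fault history:

```python
def kaplan_meier(events):
    """Kaplan-Meier survival estimate.

    `events` is a list of (distance_km, observed) pairs: observed=True means
    a loss occurred at that distance; False means the mission ended without
    loss (a censored observation). Returns [(distance, survival)] at losses.
    """
    events = sorted(events)
    n_at_risk = len(events)
    survival = 1.0
    curve = []
    for d, observed in events:
        if observed:
            survival *= 1 - 1 / n_at_risk   # product-limit update
            curve.append((d, survival))
        n_at_risk -= 1                      # censored runs leave the risk set
    return curve

# Hypothetical AUV mission records (distances in km).
missions = [(120, False), (340, True), (560, False), (610, True), (900, False)]
for d, s in kaplan_meier(missions):
    print(f"after {d} km: S = {s:.3f}")
```

    The estimator's strength here is that aborted-but-successful missions (censored records) still contribute information about survival up to the distance reached.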

  20. The colour analysis method applied to homogeneous rocks

    NASA Astrophysics Data System (ADS)

    Halász, Amadé; Halmai, Ákos

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation which is the most suitable in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between these; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.

  1. Orbit Response Matrix Analysis Applied at PEP-II

    SciTech Connect

    Steier, C.; Wolski, A.; Ecklund, S.; Safranek, J.A.; Tenenbaum, P.; Terebilo, A.; Turner, J.L.; Yocky, G.; /SLAC

    2005-05-17

    The analysis of orbit response matrices has been used very successfully to measure and correct the gradient and skew-gradient distribution in many accelerators. It allows determination of an accurately calibrated model of the coupled machine lattice, which can then be used to calculate the corrections necessary to improve coupling, dynamic aperture and, ultimately, luminosity. At PEP-II, the Matlab version of LOCO has been used to analyze coupled response matrices for both the LER and the HER. The large number of elements in PEP-II and the very complicated interaction region present unique challenges to the data analysis. All tools necessary to make the analysis method usable at PEP-II have been implemented, and LOCO can now be used as a routine tool for lattice diagnostics.

  2. Joint regression analysis and AMMI model applied to oat improvement

    NASA Astrophysics Data System (ADS)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board on 22 different genotypes grown in six locations during the years 2002, 2003 and 2004. In Ferreira et al. (2006) the authors state the relevance of regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in R software for the AMMI model analysis.

  3. Improving Patient Prostate Cancer Risk Assessment: Moving From Static, Globally-Applied to Dynamic, Practice-Specific Cancer Risk Calculators

    PubMed Central

    Strobl, Andreas N.; Vickers, Andrew J.; Van Calster, Ben; Steyerberg, Ewout; Leach, Robin J.; Thompson, Ian M.; Ankerst, Donna P.

    2015-01-01

    Clinical risk calculators are now widely available but have generally been implemented in a static and one-size-fits-all fashion. The objective of this study was to challenge these notions and show via a case study concerning risk-based screening for prostate cancer how calculators can be dynamically and locally tailored to improve on-site patient accuracy. Yearly data from five international prostate biopsy cohorts (3 in the US, 1 in Austria, 1 in England) were used to compare 6 methods for annual risk prediction: static use of the online US-developed Prostate Cancer Prevention Trial Risk Calculator (PCPTRC); recalibration of the PCPTRC; revision of the PCPTRC; building a new model each year using logistic regression, Bayesian prior-to-posterior updating, or random forests. All methods performed similarly with respect to discrimination, except for random forests, which were worse. All methods except for random forests greatly improved calibration over the static PCPTRC in all cohorts except for Austria, where the PCPTRC had the best calibration followed closely by recalibration. The case study shows that a simple annual recalibration of a general online risk tool for prostate cancer can improve its accuracy with respect to the local patient practice at hand. PMID:25989018
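
    Of the methods compared, intercept-only recalibration ("calibration in the large") is the simplest to sketch: keep the published calculator's log-odds and refit only an offset on local data. Everything below is simulated for illustration; it is not the PCPTRC itself:

```python
import math
import random

random.seed(7)

def logit(p):
    return math.log(p / (1 - p))

def recalibrate_intercept(probs, outcomes, steps=500, lr=0.1):
    """Fit a in p' = sigmoid(a + logit(p)) by gradient ascent on the
    Bernoulli log-likelihood (intercept-only recalibration)."""
    a = 0.0
    for _ in range(steps):
        grad = sum(y - 1 / (1 + math.exp(-(a + logit(p))))
                   for p, y in zip(probs, outcomes))
        a += lr * grad / len(probs)
    return a

# Simulate a clinic where true risk is systematically lower than the
# published calculator says (all numbers illustrative).
probs = [random.uniform(0.1, 0.6) for _ in range(2000)]
true_shift = -1.0    # local log-odds offset versus the static calculator
outcomes = [1 if random.random() < 1 / (1 + math.exp(-(true_shift + logit(p))))
            else 0 for p in probs]

a_hat = recalibrate_intercept(probs, outcomes)
print(f"recovered offset: {a_hat:.2f}")   # expect a value near the simulated -1
```

    An annual refit of this single parameter is cheap, which is why the case study found simple recalibration competitive with rebuilding the model from scratch.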

  4. On the relation between applied behavior analysis and positive behavioral support

    PubMed Central

    Carr, James E.; Sidener, Tina M.

    2002-01-01

    Anderson and Freeman (2000) recently defined positive behavioral support (PBS) as a systematic approach to the delivery of clinical and educational services that is rooted in behavior analysis. However, the recent literature contains varied definitions of PBS as well as discrepant notions regarding the relation between applied behavior analysis and PBS. After summarizing common definitional characteristics of PBS from the literature, we conclude that PBS is comprised almost exclusively of techniques and values originating in applied behavior analysis. We then discuss the relations between applied behavior analysis and PBS that have been proposed in the literature. Finally, we discuss possible implications of considering PBS a field separate from applied behavior analysis. PMID:22478389

  5. Systems design analysis applied to launch vehicle configuration

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost-improvement techniques from design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  6. Probabilistic methods in fire-risk analysis

    SciTech Connect

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario with upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario. This leads to a distribution for the frequency of ignition of the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined, along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN in the fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot-gas-layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable tray damage time in a compartment fire experiment.
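
    The first part's coupling of a deterministic thermal model with parameter uncertainty can be caricatured in a few lines: sample the uncertain inputs, run the deterministic model, and count threshold exceedances. The flux model, critical flux and input distributions are invented for illustration:

```python
import random

random.seed(11)

def heat_flux(power_w, distance_m):
    """Toy radiant-flux model for a point source: q = P / (4 pi r^2)."""
    return power_w / (4 * 3.141592653589793 * distance_m ** 2)

CRITICAL_FLUX = 10_000.0   # W/m^2 at which upholstery ignites (illustrative)

# Propagate uncertainty in heater power and placement through the
# deterministic model to estimate the probability of ignition per use.
n = 50_000
ignitions = 0
for _ in range(n):
    power = random.uniform(500, 2000)            # W, heater setting
    distance = random.lognormvariate(-1.0, 0.5)  # m, heater-to-furniture gap
    if heat_flux(power, distance) > CRITICAL_FLUX:
        ignitions += 1

p_ignite = ignitions / n
print(f"P(ignition per use) = {p_ignite:.4f}")
```

    Multiplying such a per-use probability by a usage frequency gives the ignition-frequency distribution the abstract describes; the real study uses far richer thermal models in place of the point-source formula.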

  7. Applying Skinner's Analysis of Verbal Behavior to Persons with Dementia

    ERIC Educational Resources Information Center

    Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann

    2011-01-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…

  8. Action, Content and Identity in Applied Genre Analysis for ESP

    ERIC Educational Resources Information Center

    Flowerdew, John

    2011-01-01

    Genres are staged, structured, communicative events, motivated by various communicative purposes, and performed by members of specific discourse communities (Swales 1990; Bhatia 1993, 2004; Berkenkotter & Huckin 1995). Since its inception, with the two seminal works on the topic by Swales (1990) and Bhatia (1993), genre analysis has taken pride of…

  9. Applying Adult Learning Theory through a Character Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to analyze the behavior of a character, Celie, in the movie "The Color Purple" through the lens of two adult learning theorists to determine the relationships the character has with each theory. The development and portrayal of characters in movies can be explained and understood by the analysis of adult learning…

  10. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  11. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic Atmospheric Agency (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  12. A value analysis model applied to the management of amblyopia.

    PubMed Central

    Beauchamp, G R; Bane, M C; Stager, D R; Berry, P M; Wright, W W

    1999-01-01

    PURPOSE: To assess the value of amblyopia-related services by utilizing a health value model (HVM). Cost and quality criteria are evaluated in accordance with the interests of patients, physicians, and purchasers. METHODS: We applied an HVM to a hypothetical statistical ("median") child with amblyopia whose visual acuity is 20/80 and to a group of children with amblyopia who are managed by our practice. We applied the model to calculate the value of these services by evaluating the responses of patients and physicians and relating these responses to clinical outcomes. RESULTS: The consensus value of care for the hypothetical median child was calculated to be 0.406 (of 1.000). For those children managed in our practice, the calculated value is 0.682. Clinically, 79% achieved 20/40 or better visual acuity, and the mean final visual acuity was 0.2 logMAR (20/32). Value appraisals revealed significant concerns about the financial aspects of amblyopia-related services, particularly among physicians. Patients rated services more positively than did physicians. CONCLUSIONS: Amblyopia care is difficult, sustained, and important work that requires substantial sensitivity to and support of children and families. Compliance and early detection are essential to success. The value of amblyopia services is rated significantly higher by patients than by physicians. Relative to the measured value, amblyopia care is undercompensated. The HVM is useful to appraise clinical service delivery and its variation. The costs of failure and the benefits of success are high; high-value amblyopia care yields substantial dividends and should be commensurately compensated in the marketplace. PMID:10703133

  13. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  14. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  15. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  16. Balancing access to participation in research and protection from risks: applying the principle of justice.

    PubMed

    Kiskaddon, Sarah H

    2005-04-01

The problem for Institutional Review Boards (IRBs) of balancing access to participation in research with protection of research subjects has always been a difficult one. IRBs, charged with applying the "Common Rule," as well as the Belmont Principles, in their review of clinical research, are given little guidance on approaching this problem. This article argues that the third Belmont Principle, the Justice Principle, may provide a useful framework for considering this balance. The changing research environment is discussed in a historical context, and the Justice Principle is considered both in the context of individual rights and in terms of the potential benefit to classes of people. The author further suggests that application of the Justice Principle be driven by findings derived from an analysis of the first 2 principles. This feedback model will enable a more formal application of the Justice Principle and less ambiguous, more transparent decisions regarding the equitable selection of subjects. The author calls for more systematic attention to the Justice Principle by IRBs, and proposes a model that includes incorporating the deliberation of the other Belmont Principles into the Justice Principle.

  17. Applying a Generic Juvenile Risk Assessment Instrument to a Local Context: Some Practical and Theoretical Lessons

    ERIC Educational Resources Information Center

    Miller, Joel; Lin, Jeffrey

    2007-01-01

    This article examines issues raised by the application of a generic actuarial juvenile risk instrument (the Model Risk Assessment Instrument) to New York City, a context different from the one in which it was developed. It describes practical challenges arising from the constraints of locally available data and local sensibilities and highlights…

  18. A Critique of the Concept of At Risk as Applied to Emergent Literacy.

    ERIC Educational Resources Information Center

    Pellegrini, Anthony D.

    1991-01-01

    Recommends abandoning the term "at risk." Concludes from analyses of parent-child story readings in African-American homes that the primary reason the children were at risk for failure in school is that the contexts for literacy in these homes were different from the contexts in school, not that there was something wrong with the families of the…

  19. Applying the National College Health Risk Behavior Survey to Rural Campuses.

    ERIC Educational Resources Information Center

    Peterson, Yasenka

    2001-01-01

    Determined current health risk behaviors of rural college freshmen using elements of the National College Health Risk Behavior Survey (NCHRBS). Student surveys indicated that for some behaviors, the incidence among these rural students was higher than the incidence among freshmen from the NCHRBS (e.g., binge drinking, ever smoking marijuana, and…

  20. RADON EXPOSURE ASSESSMENT AND DOSIMETRY APPLIED TO EPIDEMIOLOGY AND RISK ESTIMATION

    EPA Science Inventory

    Epidemiological studies of underground miners provide the primary basis for radon risk estimates for indoor exposures as well as mine exposures. A major source of uncertainty in these risk estimates is the uncertainty in radon progeny exposure estimates for the miners. In addit...

  1. Applying a Forensic Actuarial Assessment (the Violence Risk Appraisal Guide) to Nonforensic Patients

    ERIC Educational Resources Information Center

Harris, Grant T.; Rice, Marnie E.; Camilleri, Joseph A.

    2004-01-01

    The actuarial Violence Risk Appraisal Guide (VRAG) was developed for male offenders where it has shown excellent replicability in many new forensic samples using officially recorded outcomes. Clinicians also make decisions, however, about the risk of interpersonal violence posed by nonforensic psychiatric patients of both sexes. Could an actuarial…

  2. Big Data Usage Patterns in the Health Care Domain: A Use Case Driven Approach Applied to the Assessment of Vaccination Benefits and Risks

    PubMed Central

    Liyanage, H.; Liaw, S-T.; Kuziemsky, C.; Mold, F.; Krause, P.; Fleming, D.; Jones, S.

    2014-01-01

Summary Background Generally, the benefits and risks of vaccines can be determined from studies carried out as part of regulatory compliance, followed by surveillance of routine data; however, there are some rarer and longer-term events that require new methods. Big data generated by increasingly affordable personalised computing and by pervasive computing devices is growing rapidly, and low-cost, high-volume cloud computing makes the processing of these data inexpensive. Objective To describe how big data and related analytical methods might be applied to assess the benefits and risks of vaccines. Method We reviewed the literature on the use of big data to improve health, applied to generic vaccine use cases that illustrate the benefits and risks of vaccination. We defined a use case as the interaction between a user and an information system to achieve a goal. We used flu vaccination and pre-school childhood immunisation as exemplars. Results We reviewed three big data use cases relevant to assessing vaccine benefits and risks: (i) big data processing using crowd-sourcing, distributed big data processing, and predictive analytics; (ii) data integration from heterogeneous big data sources, e.g. the increasing range of devices in the “internet of things”; and (iii) real-time monitoring, for the direct monitoring of epidemics as well as vaccine effects, via social media and other data sources. Conclusions Big data raises new ethical dilemmas, though its analysis methods can bring complementary real-time capabilities for monitoring epidemics and assessing vaccine benefit-risk balance. PMID:25123718

  3. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    SciTech Connect

    Ronald L. Boring

    2012-06-01

Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management, since the need to update hardcopy procedures is eliminated. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  4. Analysis of Facial Aesthetics as Applied to Injectables.

    PubMed

    Lam, Samuel M; Glasgold, Robert; Glasgold, Mark

    2015-11-01

Understanding the role of volume loss in the aging face has resulted in a paradigm shift in facial rejuvenation techniques. Volume restoration with injectable materials is among the most widespread cosmetic procedures performed. A new approach to the aesthetics of facial aging is necessary to allow the greatest improvement from volumetric techniques while maintaining natural-appearing results. Examining the face in terms of facial frames and facial shadows provides the fundamental basis for our injectable analysis.

  5. Applying hydraulic transient analysis: The Grizzly Hydro Project

    SciTech Connect

    Logan, T.H.; Stutsman, R.D. )

    1992-04-01

No matter the size of the hydro plant, if it has a long waterway and will operate in peaking mode, the project designer needs to address the issue of hydraulic transients, known as water hammer, early in the design. This article describes the application of transient analysis to the design of a 20-MW hydro plant in California. In this case, a Howell-Bunger valve was used as a pressure-regulating valve to control transient pressures and speed rise.
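
    A common first-pass estimate of water-hammer surge pressure, before a full transient analysis like the one described above, is the Joukowsky equation, Δp = ρ·a·Δv. The sketch below uses illustrative values, not figures from the Grizzly project.

```python
# First-pass water-hammer estimate with the Joukowsky equation:
# delta_p = rho * a * delta_v. Values are illustrative, not the Grizzly data.

def joukowsky_surge(rho, wave_speed, delta_v):
    """Pressure rise (Pa) for an instantaneous velocity change delta_v (m/s)."""
    return rho * wave_speed * delta_v

# Water in a steel penstock: density ~1000 kg/m^3, pressure-wave speed ~1200 m/s.
surge_pa = joukowsky_surge(1000.0, 1200.0, 2.0)   # 2 m/s of flow stopped abruptly
print(surge_pa / 1e5)                             # surge in bar
```

    A 2 m/s velocity change thus produces roughly 24 bar of surge, which is why long peaking waterways need a pressure-regulating device.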

  6. Applying Decision-Making Approaches to Health Risk-Taking Behaviors: Progress and Remaining Challenges.

    PubMed

    Cho; Keller; Cooper

    1999-06-01

    This paper critically examines how risk-taking behaviors can be modeled from a decision-making perspective. We first review several applications of a decision perspective to the study of risk-taking behaviors, including studies that investigate consequence generation and the components of the overall utility (i.e., consequence, desirability, and likelihood) of risk-taking and studies that investigate the validity of two decision-oriented models (subjective expected utility and the theory of reasoned action) in predicting risk-taking behaviors. We then discuss challenges in modeling risk-taking behaviors from a decision-making perspective. These challenges include (i) finding the factors that are necessary to improve the predictability of models, (ii) difficulties in eliciting the individual components of overall utility, and (iii) incorporating overall utility changes over time. Copyright 1999 Academic Press. PMID:10366518
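
    The subjective expected utility (SEU) model discussed above scores each option as the probability-weighted sum of its consequence utilities. A minimal sketch, with invented probabilities and desirabilities, looks like this:

```python
# Minimal subjective-expected-utility (SEU) sketch of a risk-taking decision.
# Probabilities and utilities below are invented for illustration only.

def seu(consequences):
    """Sum p * u over (subjective probability, desirability) pairs."""
    return sum(p * u for p, u in consequences)

# A hypothetical smoker's view of smoking vs. abstaining.
smoke = [(0.30, -100.0), (0.70, 20.0)]   # chance of harm vs. immediate enjoyment
abstain = [(1.00, 5.0)]                  # certain modest benefit
print(seu(smoke), seu(abstain))          # the model predicts the higher-SEU option
```

    The challenges listed in the abstract (eliciting the individual components, utilities changing over time) are exactly the inputs this toy model takes as given.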

  7. Development of partial failure analysis method in probability risk assessments

    SciTech Connect

    Ni, T.; Modarres, M.

    1996-12-01

    This paper presents a new approach to evaluate the partial failure effect on current Probability Risk Assessments (PRAs). An integrated methodology of the thermal-hydraulic analysis and fuzzy logic simulation using the Dynamic Master Logic Diagram (DMLD) was developed. The thermal-hydraulic analysis used in this approach is to identify partial operation effect of any PRA system function in a plant model. The DMLD is used to simulate the system performance of the partial failure effect and inspect all minimal cut sets of system functions. This methodology can be applied in the context of a full scope PRA to reduce core damage frequency. An example of this application of the approach is presented. The partial failure data used in the example is from a survey study of partial failure effects from the Nuclear Plant Reliability Data System (NPRDS).

  8. IT-OSRA: applying ensemble simulations to estimate the oil spill risk associated to operational and accidental oil spills

    NASA Astrophysics Data System (ADS)

    Sepp Neves, Antonio Augusto; Pinardi, Nadia; Martins, Flavio

    2016-08-01

Oil Spill Risk Assessments (OSRAs) are widely employed to support decision making regarding oil spill risks. This article adapts the ISO-compliant OSRA framework developed by Sepp Neves et al. (J Environ Manag 159:158-168, 2015) to estimate risks in a complex scenario in which there are uncertainties about the meteo-oceanographic conditions and about where and how a spill could happen, and in which the risk computation methodology (ensemble oil spill modeling) is not yet well established. The improved method was applied to the Algarve coast, Portugal. Over 50,000 simulations were performed in 2 ensemble experiments to estimate the risks due to operational and accidental spill scenarios associated with maritime traffic. The level of risk was found to be important for both types of scenarios, with significant seasonal variations due to the variability of currents and waves. Higher-frequency variability in the meteo-oceanographic variables was also found to contribute to the level of risk. The ensemble results show that the distribution of oil concentrations found on the coast is not Gaussian, opening up new fields of research on how to deal with oil spill risks and related uncertainties.
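
    Because the ensemble distribution of coastal concentrations is not Gaussian, risk is better summarised with empirical exceedance probabilities than with a mean and standard deviation. The sketch below post-processes a synthetic ensemble (lognormal draws standing in for simulated peak coastal concentrations; threshold and units are invented):

```python
# Toy ensemble post-processing: summarise a non-Gaussian ensemble with an
# empirical exceedance probability rather than mean +/- sigma. Data synthetic.
import random

random.seed(1)
# Pretend each draw is one ensemble member's peak coastal concentration.
ensemble = [random.lognormvariate(0.0, 1.0) for _ in range(5000)]

def exceedance_probability(samples, threshold):
    """Fraction of ensemble members whose value exceeds the threshold."""
    return sum(1 for s in samples if s > threshold) / len(samples)

print(exceedance_probability(ensemble, 3.0))
```

    For a lognormal(0, 1) ensemble the exceedance of 3.0 sits near 0.14; with a Gaussian fit of the same mean and variance the estimate would be badly biased, which is the point the abstract makes.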

  9. Applying Costs, Risks and Values Evaluation (CRAVE) methodology to Engineering Support Request (ESR) prioritization

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla N.

    1994-01-01

Given a limited budget, prioritization among Engineering Support Requests (ESRs) of varied sizes, shapes, and colors is a difficult problem. At the Kennedy Space Center (KSC), the recently developed 4-Matrix (4-M) method represents a step in the right direction, as it attempts to combine the traditional criterion of technical merit with the newer concern for cost-effectiveness. However, the 4-M method was not adequately successful in the actual prioritization of ESRs for fiscal year 1995 (FY95). This research identifies a number of design issues that should help us to develop better methods. It emphasizes that, given the variety and diversity of ESRs, one should not expect a single method to help in the assessment of all ESRs. One conclusion is that a methodology such as Costs, Risks, and Values Evaluation (CRAVE) should be adopted. It also is clear that the development of methods such as 4-M requires input not only from engineers with technical expertise in ESRs but also from personnel with an adequate background in the theory and practice of cost-effectiveness analysis. At KSC, ESR prioritization is one part of the Ground Support Working Teams (GSWT) Integration Process. It was discovered that the more important barriers to the incorporation of cost-effectiveness considerations in ESR prioritization lie in this process. The culture of integration, and the corresponding structure of review by a committee of peers, is not conducive to the analysis and confrontation necessary in the assessment and prioritization of ESRs. Without assistance from appropriately trained analysts charged with the responsibility to analyze and be confrontational about each ESR, the GSWT steering committee will continue to make its decisions based on incomplete understanding, inconsistent numbers, and, at times, colored facts. The current organizational separation of the prioritization and the funding processes is also identified as an important barrier to the

  10. Software Speeds Up Analysis of Breast Cancer Risk

    MedlinePlus

Software Speeds Up Analysis of Breast Cancer Risk: Study ... Doctors were 30 times slower reading ... quickly analyzes mammograms and patient history to determine breast cancer risk could save time and reduce unnecessary biopsies, ...

  11. Applying temporal network analysis to the venture capital market

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.

  12. Applied analysis/computational mathematics. Final report 1993

    SciTech Connect

    Lax, P.; Berger, M.

    1993-12-01

This is the final report for the Courant Mathematics and Computing Laboratory (CMCL) research program for the years 1991--1993. Our research efforts encompass the formulation of physical problems in terms of mathematical models (both old and new), the mathematical analysis of such models, and their numerical resolution. This last step involves the development and implementation of efficient methods for large scale computation. Our analytic and numerical work often goes hand in hand; new theoretical approaches often have numerical counterparts, while numerical experimentation often suggests avenues for analytical investigation.

  13. Fundamental and Applied Investigations in Atomic Spectrometric Analysis

    NASA Astrophysics Data System (ADS)

    Wu, Min

Simultaneous laser-excited fluorescence and absorption measurements were performed, and the results revealed that interference caused by easily ionized elements does not originate from variations in analyte emission (quantum) efficiency. In a closely related area, the roles of wet and dry aerosols in the matrix interference are clarified through spatially resolved imaging of the plasma by a charge-coupled device camera. To eliminate matrix interference effects in practice, various methods have been developed based on the above studies. The use of column pre-concentration with flow injection analysis has been found to provide a simple solution for reducing interference effects and increasing the sensitivity of elemental analysis. A novel mini-spray chamber was invented. The new vertical rotary spray chamber combines gravitational, centrifugal, turbulent, and impact droplet segregation mechanisms to achieve a higher efficiency of small-droplet formation in a nebulized sample spray. As a result, it also offers higher sample-transport efficiency, lower memory effects, and improved analytical figures of merit over existing devices. This new device was employed with flow injection analysis to simulate an interface for coupling high performance liquid chromatography (HPLC) to a microwave plasma for chromatographic detection. The detection limits for common metallic elements are in the range of 5-50 μg/mL, and are degraded only twofold when the elements are present in an organic solvent such as ethanol or methanol. Other sample-introduction schemes have also been investigated to improve sample-introduction technology. The direct coupling of hydride-generation techniques to the helium microwave plasma torch was evaluated for the determination of arsenic, antimony and tin by atomic emission spectrometry. A manually controlled peristaltic pump was modified for computer control and continuous flow injection was evaluated for standard calibration and trace elemental

  14. Applying the Analytic Hierarchy Process to Oil Sands Environmental Compliance Risk Management

    NASA Astrophysics Data System (ADS)

    Roux, Izak Johannes, III

Oil companies in Alberta, Canada, invested $32 billion in new oil sands projects in 2013. Despite the size of this investment, there is a demonstrable deficiency in the uniformity and understanding of environmental legislation requirements that manifests in increased project compliance risks. This descriptive study developed 2 prioritized lists of environmental regulatory compliance risks and mitigation strategies and used multi-criteria decision theory for its theoretical framework. Information from compiled lists of environmental compliance risks and mitigation strategies was used to generate a specialized pairwise survey, which was piloted by 5 subject matter experts (SMEs). The survey was validated by a sample of 16 SMEs, after which the Analytic Hierarchy Process (AHP) was used to rank a total of 33 compliance risks and 12 mitigation strategy criteria. A key finding was that the AHP is a suitable tool for ranking compliance risks and mitigation strategies. Several working hypotheses were also tested regarding how SMEs prioritized 1 compliance risk or mitigation strategy compared to another. The AHP showed that regulatory compliance, company reputation, environmental compliance, and economics ranked the highest and that a multi-criteria mitigation strategy for environmental compliance ranked the highest. The study results will inform Alberta oil sands industry leaders about the ranking and utility of specific compliance risks and mitigation strategies, enabling them to focus on actions that will generate legislative and public trust. Oil sands leaders implementing a risk management program using the risks and mitigation strategies identified in this study will contribute to environmental conservation, economic growth, and positive social change.
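
    The core AHP computation referenced above, deriving a priority vector from a pairwise comparison matrix and checking its consistency, can be sketched as follows. The 3x3 matrix is hypothetical; the study's actual 33-risk matrices are not reproduced here.

```python
# AHP priority sketch: principal eigenvector of a pairwise comparison matrix,
# plus Saaty's consistency ratio. The matrix below is invented for illustration.
import numpy as np

# A[i, j] = how strongly criterion i outranks criterion j (Saaty 1-9 scale).
A = np.array([[1.0,       3.0,       5.0],
              [1.0 / 3.0, 1.0,       2.0],
              [1.0 / 5.0, 1.0 / 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                               # priority vector, sums to 1

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)       # consistency index
cr = ci / 0.58                             # random index RI = 0.58 for n = 3
print(w.round(3), round(cr, 3))            # CR < 0.1 means acceptably consistent
```

    The same eigenvector-plus-consistency-check step is repeated for each level of the hierarchy before the local priorities are aggregated.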

  15. Operational modal analysis applied to the concert harp

    NASA Astrophysics Data System (ADS)

    Chomette, B.; Le Carrou, J.-L.

    2015-05-01

    Operational modal analysis (OMA) methods are useful to extract modal parameters of operating systems. These methods seem to be particularly interesting to investigate the modal basis of string instruments during operation to avoid certain disadvantages due to conventional methods. However, the excitation in the case of string instruments is not optimal for OMA due to the presence of damped harmonic components and low noise in the disturbance signal. Therefore, the present study investigates the least-square complex exponential (LSCE) and the modified least-square complex exponential methods in the case of a string instrument to identify modal parameters of the instrument when it is played. The efficiency of the approach is experimentally demonstrated on a concert harp excited by some of its strings and the two methods are compared to a conventional modal analysis. The results show that OMA allows us to identify modes particularly present in the instrument's response with a good estimation especially if they are close to the excitation frequency with the modified LSCE method.
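
    The least-square complex exponential idea named above can be illustrated on a synthetic single-mode signal: fit a linear prediction model to a damped sinusoid and recover frequency and damping from the roots of its characteristic polynomial. This is a minimal Prony-type sketch, not the full multi-mode LSCE implementation, and the signal is synthetic rather than a harp measurement.

```python
# Single-mode complex-exponential identification sketch (Prony/LSCE core idea):
# a damped sinusoid satisfies a 2-term linear recursion whose roots are the poles.
import numpy as np

fs = 100.0                                    # sampling rate (Hz)
t = np.arange(0.0, 2.0, 1.0 / fs)
f0, decay = 5.0, 0.5                          # one 5 Hz mode, decay 0.5 1/s
y = np.exp(-decay * t) * np.cos(2 * np.pi * f0 * t)

p = 2                                         # model order = 2 x number of modes
A = np.column_stack([y[i:len(y) - p + i] for i in range(p)])
coeffs, *_ = np.linalg.lstsq(A, -y[p:], rcond=None)
poles_z = np.roots(np.concatenate(([1.0], coeffs[::-1])))
s = np.log(poles_z) * fs                      # continuous-time poles
freq = abs(s[0].imag) / (2 * np.pi)
print(freq, -s[0].real)                       # ~5.0 Hz, ~0.5 1/s
```

    The practical difficulty the abstract points to is that played-string excitation adds damped harmonic components to the response, which bias exactly this pole-fitting step.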

  16. Unsupervised feature relevance analysis applied to improve ECG heartbeat clustering.

    PubMed

    Rodríguez-Sotelo, J L; Peluffo-Ordoñez, D; Cuesta-Frau, D; Castellanos-Domínguez, G

    2012-10-01

The computer-assisted analysis of biomedical records has become an essential tool in clinical settings. However, current devices provide a growing amount of data that often exceeds the processing capacity of normal computers. As this amount of information rises, new demands for more efficient data extracting methods appear. This paper addresses the task of data mining in physiological records using a feature selection scheme. An unsupervised method based on relevance analysis is described. This scheme uses a least-squares optimization of the input feature matrix in a single iteration. The output of the algorithm is a feature weighting vector. The performance of the method was assessed using a heartbeat clustering test on real ECG records. The quantitative cluster validity measures yielded a correctly classified heartbeat rate of 98.69% (specificity), 85.88% (sensitivity) and 95.04% (general clustering performance), which is even higher than the performance achieved by other similar ECG clustering studies. The number of features was reduced on average from 100 to 18, and the temporal cost was 43% lower than in previous ECG clustering schemes. PMID:22672933
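
    A simple stand-in for the unsupervised feature-weighting step (not the paper's exact least-squares algorithm) scores each feature by its variance-weighted contribution to the principal components of the data matrix, then ranks features by the resulting weighting vector. The data below are synthetic.

```python
# Illustrative unsupervised feature weighting: share of total variance each
# feature contributes across the principal components. Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X[:, 0] = 3.0 * X[:, 1] + rng.normal(scale=0.1, size=200)  # high-variance feature

Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
lam = s**2 / (s**2).sum()                  # variance share of each component
w = (Vt**2 * lam[:, None]).sum(axis=0)     # feature weighting vector, sums to 1
order = np.argsort(w)[::-1]                # features ranked by relevance
print(w.round(3), order)
```

    In the ECG application, a ranking of this kind is what lets ~100 candidate features be cut to ~18 before clustering.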

  17. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
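
    Of the two factor-analysis methods named above, PCR is the easier to sketch: center the spectra, project onto the leading principal components, and regress the scores against the known concentrations. The spectra below are synthetic one-band stand-ins, not the phosphosilicate-glass data.

```python
# Minimal principal component regression (PCR) calibration on synthetic spectra.
import numpy as np

rng = np.random.default_rng(1)
conc = rng.uniform(0.0, 1.0, size=30)                 # reference concentrations
wavelengths = np.linspace(0.0, 1.0, 50)
band = np.exp(-((wavelengths - 0.5) / 0.1) ** 2)      # one pure-component band
X = np.outer(conc, band) + rng.normal(scale=0.01, size=(30, 50))

# Centre, project onto the leading principal components, regress the scores.
Xm, ym = X.mean(axis=0), conc.mean()
U, s, Vt = np.linalg.svd(X - Xm, full_matrices=False)
k = 2
T = (X - Xm) @ Vt[:k].T                               # PC scores
b, *_ = np.linalg.lstsq(T, conc - ym, rcond=None)

pred = T @ b + ym
rmse = float(np.sqrt(np.mean((pred - conc) ** 2)))
print(rmse)                                           # small calibration error
```

    PLS differs in that the factors are chosen to maximise covariance with the concentrations rather than spectral variance alone, which is why it usually needs fewer factors.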

  18. Dynamical systems analysis applied to working memory data.

    PubMed

    Gasimova, Fidan; Robitzsch, Alexander; Wilhelm, Oliver; Boker, Steven M; Hu, Yueqin; Hülür, Gizem

    2014-01-01

In the present paper we investigate weekly fluctuations in the working memory capacity (WMC) assessed over a period of 2 years. We use dynamical system analysis, specifically a second order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement on the memory updating (MU) task is associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions. PMID:25071657
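
    The second-order linear model behind this kind of analysis is typically written x'' = ηx + ζx', where a negative frequency parameter η produces a cycle of period 2π/√(-η). A minimal forward-Euler simulation (parameter values invented, not the study's estimates) makes the cyclical pattern concrete:

```python
# Forward-Euler sketch of x'' = eta * x + zeta * x': negative eta => oscillation
# with period 2*pi/sqrt(-eta). Parameters are illustrative, not fitted values.
import math

eta, zeta = -0.5, 0.0            # undamped for clarity
x, v, dt = 1.0, 0.0, 0.001
trajectory = []
for _ in range(20000):           # integrate 20 time units
    a = eta * x + zeta * v
    v += a * dt
    x += v * dt
    trajectory.append(x)

print(2 * math.pi / math.sqrt(-eta))   # theoretical period, ~8.89 time units
```

    In the study the sign and size of η are estimated per person (via time-delay embedding), which is how "slower frequency of oscillation" becomes an individual-differences variable.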

  1. Sensitivity and uncertainty analysis applied to the JHR reactivity prediction

    SciTech Connect

    Leray, O.; Vaglio-Gaudard, C.; Hudelot, J. P.; Santamarina, A.; Noguere, G.; Di-Salvo, J.

    2012-07-01

The on-going AMMON program in the EOLE reactor at CEA Cadarache (France) provides experimental results to qualify the HORUS-3D/N neutronics calculation scheme used for the design and safety studies of the new Material Testing Jules Horowitz Reactor (JHR). This paper presents the determination of technological and nuclear data uncertainties on the core reactivity and the propagation of the latter from the AMMON experiment to JHR. The technological uncertainty propagation was performed with a direct perturbation methodology using the 3D French stochastic code TRIPOLI4 and a statistical methodology using the 2D French deterministic code APOLLO2-MOC, which leads to a value of 289 pcm (1σ). The nuclear data uncertainty propagation relies on a sensitivity study on the main isotopes and the use of a retroactive marginalization method applied to the JEFF 3.1.1 ²⁷Al evaluation in order to obtain a realistic multi-group covariance matrix associated with the considered evaluation. This nuclear data uncertainty propagation leads to a K-eff uncertainty of 624 pcm for the JHR core and 684 pcm for the AMMON reference configuration core. Finally, transposition and reduction of the prior uncertainty were made using the representativity method, which demonstrates the similarity of the AMMON experiment with JHR (the representativity factor is 0.95). The final impact of JEFF 3.1.1 nuclear data on the Beginning Of Life (BOL) JHR reactivity calculated by HORUS-3D/N V4.0 is a bias of +216 pcm with an associated posterior uncertainty of 304 pcm (1σ). (authors)
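
    Nuclear-data uncertainty propagation of the kind described above is commonly computed with the "sandwich rule", var(k) = SᵀMS, combining a sensitivity vector S with a covariance matrix M. The numbers below are invented for illustration, not the AMMON/JHR values.

```python
# Sandwich-rule sketch: var(k_eff) = S^T M S, with S the sensitivities of
# k_eff to nuclear-data parameters and M their covariance. Numbers invented.
import numpy as np

S = np.array([0.2, -0.05, 0.1])          # sensitivities, (dk/k) per (dp/p)
M = np.array([[4.0, 1.0, 0.0],           # relative covariance matrix, in %^2
              [1.0, 9.0, 0.5],
              [0.0, 0.5, 1.0]])

var = float(S @ M @ S)                   # variance of k_eff, in %^2
print(np.sqrt(var) * 1000)               # 1-sigma uncertainty, ~409 pcm
```

    The off-diagonal terms of M are what a marginalization method like the one cited adjusts, so realistic correlations can dominate the propagated uncertainty.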

  2. Applying Skinner's analysis of verbal behavior to persons with dementia.

    PubMed

    Dixon, Mark; Baker, Jonathan C; Sadowski, Katherine Ann

    2011-03-01

Skinner's 1957 analysis of verbal behavior has demonstrated considerable utility in teaching language to children with autism and various other disorders. However, learned language can be forgotten, as it is for many elderly people suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may facilitate not only the acquisition of language but also the ability to recall items or objects that may have appeared to be "forgotten." The present study examined the utility of having a series of adults in long-term care emit tacts, echoics, or intraverbals upon presentation of various visual stimuli. Compared to a no-verbal-response condition, it appears that the incorporation of Skinner's verbal operants can in fact improve recall for this population. Implications for the retraining of lost language are presented. PMID:21292058

  3. Applying Machine Learning to GlueX Data Analysis

    NASA Astrophysics Data System (ADS)

    Boettcher, Thomas

    2014-03-01

    GlueX is a high energy physics experiment with the goal of collecting data necessary for understanding confinement in quantum chromodynamics. Beginning in 2015, GlueX will collect huge amounts of data describing billions of particle collisions. In preparation for data collection, efforts are underway to develop a methodology for analyzing these large data sets. One of the primary challenges in GlueX data analysis is isolating events of interest from a proportionally large background. GlueX has recently begun approaching this selection problem using machine learning algorithms, specifically boosted decision trees. Preliminary studies indicate that these algorithms have the potential to offer vast improvements in both signal selection efficiency and purity over more traditional techniques.
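
Boosted decision trees, the algorithm family named in this abstract, can be illustrated with a toy AdaBoost-of-stumps classifier separating "signal" from "background" on one kinematic variable. This is a generic sketch of the technique, not GlueX analysis code; the data and variable are invented:

```python
import math

def stump_predict(x, thr, sign):
    """A decision stump: predict `sign` when x >= thr, else -sign."""
    return sign if x >= thr else -sign

def train_adaboost(xs, ys, n_rounds=10):
    """AdaBoost with one-feature stumps: reweight events each round so the
    next stump focuses on examples the ensemble still misclassifies."""
    n = len(xs)
    w = [1.0 / n] * n
    ensemble = []                      # list of (alpha, thr, sign)
    thresholds = sorted(set(xs))
    for _ in range(n_rounds):
        best = None                    # (weighted error, thr, sign)
        for thr in thresholds:
            for sign in (+1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if stump_predict(x, thr, sign) != y)
                if best is None or err < best[0]:
                    best = (err, thr, sign)
        err, thr, sign = best
        err = min(max(err, 1e-12), 1 - 1e-12)      # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)    # stump vote weight
        ensemble.append((alpha, thr, sign))
        # Misclassified events gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * y * stump_predict(x, thr, sign))
             for wi, x, y in zip(w, xs, ys)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * stump_predict(x, thr, s) for a, thr, s in ensemble)
    return 1 if score >= 0 else -1

# Toy sample: background (-1) at low values, signal (+1) at high values.
xs = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
ys = [-1, -1, -1, -1, 1, 1, 1, 1]
model = train_adaboost(xs, ys)
print([predict(model, x) for x in xs])
```

Production analyses would of course use many input variables and a tuned library implementation; the point here is only the reweighting loop that gives boosting its selection power.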

  4. Naming, the formation of stimulus classes, and applied behavior analysis.

    PubMed Central

    Stromer, R; Mackay, H A; Remington, B

    1996-01-01

    The methods used in Sidman's original studies on equivalence classes provide a framework for analyzing functional verbal behavior. Sidman and others have shown how teaching receptive, name-referent matching may produce rudimentary oral reading and word comprehension skills. Eikeseth and Smith (1992) have extended these findings by showing that children with autism may acquire equivalence classes after learning to supply a common oral name to each stimulus in a potential class. A stimulus class analysis suggests ways to examine (a) the problem of programming generalization from teaching situations to other environments, (b) the expansion of the repertoires that occur in those settings, and (c) the use of naming to facilitate these forms of generalization. Such research will help to clarify and extend Horne and Lowe's recent (1996) account of the role of verbal behavior in the formation of stimulus classes. PMID:8810064

  5. Fractographic principles applied to Y-TZP mechanical behavior analysis.

    PubMed

    Ramos, Carla Müller; Cesar, Paulo Francisco; Bonfante, Estevam Augusto; Rubo, José Henrique; Wang, Linda; Borges, Ana Flávia Sanches

    2016-04-01

    The purpose of this study was to evaluate the use of fractography principles to determine the fracture toughness of Y-TZP dental ceramic, in which KIc was measured fractographically using controlled-flaw beam bending techniques, and to correlate the flaw distribution with the mechanical properties. The Y-TZP blocks studied were: Zirconia Zirklein (ZZ); Zirconcad (ZCA); IPS e.max ZirCad (ZMAX); and In Ceram YZ (ZYZ). Samples were prepared (16 mm × 4 mm × 2 mm) according to ISO 6872 specifications and subjected to three-point bending at a crosshead speed of 0.5 mm/min. Weibull probability curves (95% confidence bounds) were calculated, and a contour plot of the Weibull modulus (m) versus characteristic strength (σ0) was used to examine the differences among groups. The fractured surface of each specimen was inspected in a scanning electron microscope (SEM) for qualitative and quantitative fractographic analysis. The critical defect size (c) and fracture toughness (KIc) were estimated. The fractured surfaces of the samples from all groups showed similar fractographic characteristics, except ZCA, which showed pores and defects. Fracture toughness and flexural strength values were not different among the groups except for ZCA. The characteristic strength (p<0.05) of ZZ (η=920.4) was higher than that of ZCA (η=651.1) and similar to those of ZMAX (η=983.6) and ZYZ (η=1054.8). By means of quantitative and qualitative fractographic analysis, this study showed that fracture toughness and strength could be correlated with the observable microstructural features of the evaluated zirconia polycrystalline ceramics. PMID:26722988
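
The Weibull parameters reported above (modulus m and characteristic strength σ0) are conventionally estimated by linear regression on the Weibull probability plot. A generic sketch, using median-rank plotting positions and synthetic strength data (the numbers are invented, not this study's measurements):

```python
import math

def weibull_fit(strengths):
    """Estimate Weibull modulus m and characteristic strength sigma0 by
    linear regression of ln(ln(1/(1-F))) on ln(sigma), with median-rank
    failure probabilities F_i = (i - 0.3) / (n + 0.4)."""
    s = sorted(strengths)
    n = len(s)
    xs, ys = [], []
    for i, sigma in enumerate(s, start=1):
        f = (i - 0.3) / (n + 0.4)
        xs.append(math.log(sigma))
        ys.append(math.log(-math.log(1.0 - f)))
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    m = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)
    # Weibull line: y = m*ln(sigma) - m*ln(sigma0), so sigma0 = exp(xbar - ybar/m)
    sigma0 = math.exp(xbar - ybar / m)
    return m, sigma0

# Synthetic bars drawn at fixed quantiles of a Weibull(m=10, sigma0=900 MPa).
m_true, s0_true, n = 10.0, 900.0, 20
data = [s0_true * (-math.log(1 - (i - 0.5) / n)) ** (1 / m_true)
        for i in range(1, n + 1)]
m_hat, s0_hat = weibull_fit(data)
print(round(m_hat, 1), round(s0_hat, 1))
```

The recovered values are close to the generating parameters; the small bias comes from the mismatch between the generating quantiles and the median-rank positions.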

  6. Elusive Critical Elements of Transformative Risk Assessment Practice and Interpretation: Is Alternatives Analysis the Next Step?

    PubMed

    Francis, Royce A

    2015-11-01

    This article argues that "game-changing" approaches to risk analysis must focus on "democratizing" risk analysis in the same way that information technologies have democratized access to, and production of, knowledge. This argument is motivated by the author's reading of Goble and Bier's analysis, "Risk Assessment Can Be a Game-Changing Information Technology-But Too Often It Isn't" (Risk Analysis, 2013; 33: 1942-1951), in which living risk assessments are shown to be "game changing" in probabilistic risk analysis. In this author's opinion, Goble and Bier's article focuses on living risk assessment's potential for transforming risk analysis from the perspective of risk professionals-yet, the game-changing nature of information technologies has typically achieved a much broader reach. Specifically, information technologies change who has access to, and who can produce, information. From this perspective, the author argues that risk assessment is not a game-changing technology in the same way as the printing press or the Internet because transformative information technologies reduce the cost of production of, and access to, privileged knowledge bases. The author argues that risk analysis does not reduce these costs. The author applies Goble and Bier's metaphor to the chemical risk analysis context, and in doing so proposes key features that transformative risk analysis technology should possess. The author also discusses the challenges and opportunities facing risk analysis in this context. These key features include: clarity in information structure and problem representation, economical information dissemination, increased transparency to nonspecialists, democratized manufacture and transmission of knowledge, and democratic ownership, control, and interpretation of knowledge. The chemical safety decision-making context illustrates the impact of changing the way information is produced and accessed in the risk context. 
Ultimately, the author concludes that although

  7. Applying the emergency risk management process to tackle the crisis of antibiotic resistance.

    PubMed

    Dominey-Howes, Dale; Bajorek, Beata; Michael, Carolyn A; Betteridge, Brittany; Iredell, Jonathan; Labbate, Maurizio

    2015-01-01

    We advocate that antibiotic resistance be reframed as a disaster risk management problem. Antibiotic-resistant infections represent a risk to life as significant as other commonly occurring natural disasters (e.g., earthquakes). Despite efforts by global health authorities, antibiotic resistance continues to escalate. Therefore, new approaches and expertise are needed to manage the issue. In this perspective we: (1) make a call for the emergency management community to recognize the antibiotic resistance risk and join in addressing this problem; (2) suggest using the risk management process to help tackle antibiotic resistance; (3) show why this approach has value and why it is different to existing approaches; and (4) identify public perception of antibiotic resistance as an important issue that warrants exploration. PMID:26388864

  8. Applying the emergency risk management process to tackle the crisis of antibiotic resistance

    PubMed Central

    Dominey-Howes, Dale; Bajorek, Beata; Michael, Carolyn A.; Betteridge, Brittany; Iredell, Jonathan; Labbate, Maurizio

    2015-01-01

    We advocate that antibiotic resistance be reframed as a disaster risk management problem. Antibiotic-resistant infections represent a risk to life as significant as other commonly occurring natural disasters (e.g., earthquakes). Despite efforts by global health authorities, antibiotic resistance continues to escalate. Therefore, new approaches and expertise are needed to manage the issue. In this perspective we: (1) make a call for the emergency management community to recognize the antibiotic resistance risk and join in addressing this problem; (2) suggest using the risk management process to help tackle antibiotic resistance; (3) show why this approach has value and why it is different to existing approaches; and (4) identify public perception of antibiotic resistance as an important issue that warrants exploration. PMID:26388864

  9. The impact of applied behavior analysis on diverse areas of research.

    PubMed

    Kazdin, A E

    1975-01-01

    The impact of applied behavior analysis on various disciplines and areas of research was assessed through two major analyses. First, the relationship of applied behavior analysis to the general area of "behavior modification" was evaluated by examining the citation characteristics of journal articles in JABA and three other behavior-modification journals. Second, the penetration of applied behavior analysis into diverse areas and disciplines, including behavior modification, psychiatry, clinical psychology, education, special education, retardation, speech and hearing, counselling, and law enforcement and correction was assessed. Twenty-five journals representing diverse research areas were evaluated from 1968 to 1974 to assess the extent to which operant techniques were applied for therapeutic, rehabilitative, and educative purposes and the degree to which methodological desiderata of applied behavior analysis were met. The analyses revealed diverse publication outlets for applied behavior analysis in various disciplines.

  10. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    SciTech Connect

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, ''Nuclear Safety Management,'' Subpart B, ''Safety Basis Requirements.'' Consistent with DOE-STD-3009-94, Change Notice 2, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'' (STD-3009), and DOE-STD-3011-2002, ''Guidance for Preparation of Basis for Interim Operation (BIO) Documents'' (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, ''Integration of Environment, Safety, and Health into Facility Disposition Activities'' (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  11. Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids (Final Report)

    EPA Science Inventory

    Millions of tons of treated sewage sludges or “biosolids” are applied annually to f...

  12. Quantitative microbial risk assessment applied to irrigation of salad crops with waste stabilization pond effluents.

    PubMed

    Pavione, D M S; Bastos, R K X; Bevilacqua, P D

    2013-01-01

    A quantitative microbial risk assessment model for estimating infection risks arising from consuming crops eaten raw that have been irrigated with effluents from stabilization ponds was constructed. A log-normal probability distribution function was fitted to a large database from comprehensive monitoring of an experimental pond system to account for variability in Escherichia coli concentration in irrigation water. Crop contamination levels were estimated using predictive models derived from field experiments involving the irrigation of several crops with different effluent qualities. Data on daily intake of salad crops were obtained from a national survey in Brazil. Ten-thousand-trial Monte Carlo simulations were used to estimate human health risks associated with the use of wastewater for irrigating low- and high-growing crops. The use of effluents containing 10³-10⁴ E. coli per 100 ml resulted in median rotavirus infection risks of approximately 10⁻³ and 10⁻⁴ pppy when irrigating, respectively, low- and high-growing crops; the corresponding 95th percentile risk estimates were around 10⁻² in both scenarios. Sensitivity analyses revealed that variations in effluent quality, in the assumed ratios of pathogens to E. coli, and in the reduction of pathogens between harvest and consumption had great impact upon risk estimates.
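
The Monte Carlo chain described in this abstract (log-normal water quality → crop contamination → pathogen dose → dose-response → annual risk) can be sketched as follows. All parameter values here are illustrative assumptions, not the study's fitted values; the beta-Poisson coefficients are the commonly used rotavirus fit (α = 0.253, β = 0.426):

```python
import random
import statistics

random.seed(1)  # reproducible trials

def annual_infection_risk(log10_mean=3.5, log10_sd=0.5,
                          pathogen_ratio=1e-5, reduction_log10=1.0,
                          intake_g=10.0, events_per_year=100,
                          n_trials=10000):
    """Illustrative QMRA Monte Carlo: sample E. coli per 100 ml from a
    log-normal, convert to a rotavirus dose via an assumed pathogen:E. coli
    ratio and post-harvest log reduction, then apply the beta-Poisson
    dose-response P_inf = 1 - (1 + d/beta)**(-alpha)."""
    alpha, beta = 0.253, 0.426
    risks = []
    for _ in range(n_trials):
        ecoli = 10 ** random.gauss(log10_mean, log10_sd)   # per 100 ml
        # Assumed contamination: 0.1 ml of irrigation water retained per g of crop.
        dose = ecoli / 100.0 * 0.1 * intake_g
        dose *= pathogen_ratio * 10 ** (-reduction_log10)
        p_event = 1.0 - (1.0 + dose / beta) ** (-alpha)
        p_year = 1.0 - (1.0 - p_event) ** events_per_year
        risks.append(p_year)
    risks.sort()
    return statistics.median(risks), risks[int(0.95 * n_trials)]

median_risk, p95_risk = annual_infection_risk()
print(f"median={median_risk:.1e}  95th={p95_risk:.1e}")
```

With these assumed inputs the median annual risk lands near 10⁻³ and the 95th percentile near 10⁻², the same order of magnitude the abstract reports for low-growing crops.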

  13. Applying data mining for the analysis of breast cancer data.

    PubMed

    Liou, Der-Ming; Chang, Wei-Pin

    2015-01-01

    Data mining, also known as Knowledge-Discovery in Databases (KDD), is the process of automatically searching large volumes of data for patterns. For instance, a clinical pattern might indicate that a female patient with diabetes or hypertension is more likely to suffer a stroke within five years; a physician can then learn valuable knowledge from the data mining process. Here, we present a study focused on the investigation of the application of artificial intelligence and data mining techniques to prediction models of breast cancer. An artificial neural network, a decision tree, logistic regression, and a genetic algorithm were used for the comparative studies, and the accuracy and positive predictive value of each algorithm were used as the evaluation indicators. A total of 699 records acquired from breast cancer patients at the University of Wisconsin, nine predictor variables, and one outcome variable were incorporated in the data analysis, followed by tenfold cross-validation. The results revealed that the accuracy of the logistic regression model was 0.9434 (sensitivity 0.9716, specificity 0.9482), the decision tree model 0.9434 (sensitivity 0.9615, specificity 0.9105), the neural network model 0.9502 (sensitivity 0.9628, specificity 0.9273), and the genetic algorithm model 0.9878 (sensitivity 1, specificity 0.9802). The accuracy of the genetic algorithm was significantly higher than the average predicted accuracy of 0.9612. The predicted outcome of the logistic regression model was higher than that of the neural network model, but no significant difference was observed. The average predicted accuracy of the decision tree model was 0.9435, the lowest of all four predictive models. The standard deviation of the tenfold cross-validation was rather unreliable. This study indicated that the genetic algorithm model yielded better results than the other data mining models for the analysis of the data of breast cancer patients in terms of the overall accuracy of
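
The evaluation machinery used above (k-fold splits plus accuracy, sensitivity, and specificity) is generic and can be sketched independently of any particular model. The data and the fixed-threshold "classifier" below are invented stand-ins; a real study would refit a model on the k−1 training folds at each round:

```python
import random

def kfold_indices(n, k=10, seed=0):
    """Shuffle indices 0..n-1 and split them into k near-equal folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def evaluate(xs, ys, threshold, folds):
    """Pool the held-out confusion matrix over all folds, then report
    accuracy, sensitivity (TP rate), and specificity (TN rate)."""
    tp = tn = fp = fn = 0
    for fold in folds:
        for i in fold:
            pred = 1 if xs[i] >= threshold else 0
            if pred and ys[i]: tp += 1
            elif not pred and not ys[i]: tn += 1
            elif pred: fp += 1
            else: fn += 1
    total = tp + tn + fp + fn
    return (tp + tn) / total, tp / (tp + fn), tn / (tn + fp)

# Toy data: one marker value per patient, 0 = benign, 1 = malignant,
# with one misclassified case on each side of the threshold.
xs = [2.0] * 49 + [6.0] + [8.0] * 49 + [4.0]
ys = [0] * 50 + [1] * 50
acc, sens, spec = evaluate(xs, ys, threshold=5.0, folds=kfold_indices(100))
print(acc, sens, spec)
```

Because every index appears in exactly one fold, the pooled confusion matrix here is deterministic: 49 true negatives, 49 true positives, and one error of each kind.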

  14. Applied Climate-Change Analysis: The Climate Wizard Tool

    PubMed Central

    Girvetz, Evan H.; Zganjar, Chris; Raber, George T.; Maurer, Edwin P.; Kareiva, Peter; Lawler, Joshua J.

    2009-01-01

    Background Although the message of “global climate change” is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty of accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. Methodology/Principal Findings To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951–2002 occurred in northern hemisphere countries (especially during January–April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50°N during February–March to 10°N during August–September. Precipitation decreases occurred most commonly in countries between 0–20°N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070–2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. Conclusions/Significance The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally-specific analyses of climate

  15. Applied and computational harmonic analysis on graphs and networks

    NASA Astrophysics Data System (ADS)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
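
The abstract's caveat that the "frequency" interpretation of Laplacian eigenpairs is exact only for unweighted paths and cycles can be checked directly: for the path graph, the combinatorial Laplacian L = D − A is diagonalized by the DCT-II vectors, with eigenvalues 2 − 2cos(πk/n). A small self-contained verification:

```python
import math

def path_laplacian(n):
    """Combinatorial Laplacian L = D - A of the unweighted path on n nodes."""
    L = [[0.0] * n for _ in range(n)]
    for i in range(n - 1):
        L[i][i] += 1.0
        L[i + 1][i + 1] += 1.0
        L[i][i + 1] -= 1.0
        L[i + 1][i] -= 1.0
    return L

def dct_eigpair(n, k):
    """k-th eigenpair of the path Laplacian: the DCT-II basis vector
    cos(pi*k*(j+1/2)/n) with eigenvalue 2 - 2*cos(pi*k/n)."""
    lam = 2.0 - 2.0 * math.cos(math.pi * k / n)
    vec = [math.cos(math.pi * k * (j + 0.5) / n) for j in range(n)]
    return lam, vec

n, k = 8, 3
L = path_laplacian(n)
lam, v = dct_eigpair(n, k)
Lv = [sum(L[i][j] * v[j] for j in range(n)) for i in range(n)]
resid = max(abs(Lv[i] - lam * v[i]) for i in range(n))
print(resid < 1e-12)
```

On a general weighted graph no such closed-form "Fourier" basis exists, which is exactly the gap the multiscale dictionaries discussed in the article are built to fill.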

  16. Ion Beam Analysis applied to laser-generated plasmas

    NASA Astrophysics Data System (ADS)

    Cutroneo, M.; Macková, A.; Havranek, V.; Malinsky, P.; Torrisi, L.; Kormunda, M.; Barchuk, M.; Ullschmied, J.; Dudzak, R.

    2016-04-01

    This paper presents the research activity on Ion Beam Analysis methods performed at the Tandetron Laboratory (LT) of the Institute of Nuclear Physics AS CR, Rez, Czech Republic. Recently, many groups have been paying attention to implantation by laser-generated plasma. This process allows a controllable amount of energetic ions to be inserted into the surface layers of different materials, modifying the physical and chemical properties of the surface material. Different substrates are implanted with ions accelerated from plasma by the terawatt iodine laser at the PALS Research Infrastructure AS CR (Czech Republic), at a nominal intensity of 10¹⁵ W/cm². This regime of laser-matter interaction generates multi-MeV proton beams and multi-charged ions that are tightly confined in time (hundreds of ps) and space (source radius of a few microns). These ion beams have a much lower transverse temperature, a much shorter duration, and a much higher current than those obtainable from conventional accelerators. Proton and ion acceleration driven by ultra-short high-intensity lasers is implemented by adopting suitable irradiation conditions as well as tailored targets. An overview of implanted targets and their morphological and structural characterizations is presented and discussed.

  17. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    SciTech Connect

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of ''soft'' information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset ''attractiveness'' to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  18. Radiation Leukemogenesis: Applying Basic Science of Epidemiological Estimates of Low Dose Risks and Dose-Rate Effects

    SciTech Connect

    Hoel, D. G.

    1998-11-01

    The next stage of work has been to examine more closely the A-bomb leukemia data, which provide the underpinnings of the risk estimation of CML in the above-mentioned manuscript. The paper by Hoel and Li (Health Physics 75:241-50) shows how the linear-quadratic model has basic non-linearities in the low-dose region for the leukemias, including CML. Pierce et al. (Radiation Research 123:275-84) have developed distributions for the uncertainty in the estimated exposures of the A-bomb cohort. Kellerer et al. (Radiation and Environmental Biophysics 36:73-83) have further considered possible errors in the estimated neutron values, along with RBE values that change with dose, and have hypothesized that the tumor response due to gamma rays may not be linear. We have incorporated this neutron model and have constructed new A-bomb doses based on its adjustments. The Hoel and Li dose-response analysis has also been applied using the Kellerer neutron dose adjustments for the leukemias. Finally, Pierce's dose uncertainties and the Kellerer neutron adjustments are combined, together with the dose-varying RBE suggested by Rossi and Zaider, and used for leukemia dose-response analysis. First, the result of Hoel and Li showing a significantly improved fit of the linear-quadratic dose response by the inclusion of a threshold (i.e., low-dose nonlinearity) persisted. This work has been completed for both solid tumors and leukemia, for both mortality and incidence data. The results are given in the manuscript described below, which has been submitted to Health Physics.
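
The threshold-modified linear-quadratic shape discussed above is simple to state explicitly. A minimal sketch with arbitrary coefficients (not the fitted A-bomb values): below the threshold t the excess risk is zero, and above it the usual linear-quadratic form applies to the excess dose D − t:

```python
def excess_risk(dose, alpha=1.0, beta=0.5, threshold=0.0):
    """Illustrative threshold linear-quadratic excess risk:
    risk = alpha*(D - t) + beta*(D - t)**2 for D > t, else 0.
    alpha, beta, and threshold are arbitrary demo values."""
    d = dose - threshold
    return 0.0 if d <= 0 else alpha * d + beta * d * d

# Below the threshold the model predicts no excess risk; above it the
# same linear-quadratic curvature applies to the excess dose.
low = excess_risk(0.05, threshold=0.1)
high = excess_risk(0.5, threshold=0.1)
print(low, high)
```

Comparing the fit of this form (t > 0) against the plain linear-quadratic form (t = 0) is what the abstract means by testing for low-dose nonlinearity.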

  19. Environmental risk analysis for indirect coal liquefaction

    SciTech Connect

    Barnthouse, L.W.; Suter, G.W. II; Baes, C.F. III; Bartell, S.M.; Cavendish, M.G.; Gardner, R.H.; O'Neill, R.V.; Rosen, A.E.

    1985-01-01

    This report presents an analysis of the risks to fish, water quality (due to noxious algal blooms), crops, forests, and wildlife of two technologies for the indirect liquefaction of coal: Lurgi and Koppers-Totzek gasification of coal for Fischer-Tropsch synthesis. A variety of analytical techniques were used to make maximum use of the available data to consider effects of effluents on different levels of ecological organization. The most significant toxicants to fish were found to be ammonia, cadmium, and acid gases. An analysis of whole-effluent toxicity indicated that the Lurgi effluent is more acutely toxic than the Koppers-Totzek effluent. Six effluent components appear to pose a potential threat of blue-green algal blooms, primarily because of their effects on higher trophic levels. The most important atmospheric emissions with respect to crops, forests, and wildlife were found to be the conventional combustion products SO₂ and NO₂. Of the materials deposited on the soil, arsenic, cadmium, and nickel appear of greatest concern for phytotoxicity. 147 references, 5 figures, 41 tables.

  20. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    NASA Astrophysics Data System (ADS)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they cause high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris-transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On the one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well-distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data allow improvement of the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt changes of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  1. Applied Drama and the Higher Education Learning Spaces: A Reflective Analysis

    ERIC Educational Resources Information Center

    Moyo, Cletus

    2015-01-01

    This paper explores Applied Drama as a teaching approach in Higher Education learning spaces. The exploration takes a reflective analysis approach by first examining the impact that Applied Drama has had on my career as a Lecturer/Educator/Teacher working in Higher Education environments. My engagement with Applied Drama practice and theory is…

  2. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    ERIC Educational Resources Information Center

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  3. Multi-criteria decision analysis with probabilistic risk assessment for the management of contaminated ground water

    SciTech Connect

    Khadam, Ibrahim M.; Kaluarachchi, Jagath J

    2003-10-01

    Traditionally, environmental decision analysis in subsurface contamination scenarios is performed using cost-benefit analysis. In this paper, we discuss some of the limitations associated with cost-benefit analysis, especially its definition of risk, its definition of the cost of risk, and its poor ability to communicate risk-related information. This paper presents an integrated approach for management of contaminated ground water resources using health risk assessment and economic analysis through a multi-criteria decision analysis framework. The methodology introduces several important concepts and definitions in decision analysis related to subsurface contamination. These are the trade-off between population risk and individual risk, the trade-off between the residual risk and the cost of risk reduction, and cost-effectiveness as a justification for remediation. The proposed decision analysis framework integrates probabilistic health risk assessment into a comprehensive, yet simple, cost-based multi-criteria decision analysis framework. The methodology focuses on developing decision criteria that provide insight into the questions a decision-maker commonly faces when weighing a number of remedial alternatives. The paper then explores three potential approaches for alternative ranking: a structured explicit decision analysis, a heuristic approach based on the order of importance of the criteria, and a fuzzy logic approach based on fuzzy dominance and similarity analysis. Using formal alternative ranking procedures, the methodology seeks to present a structured decision analysis framework that can be applied consistently across many different and complex remediation settings. A simple numerical example is presented to demonstrate the proposed methodology. The results showed the importance of using an integrated approach for decision-making considering both costs and risks. Future work should focus on the application of the methodology to a variety of complex field conditions to
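
The simplest of the ranking approaches mentioned above, a structured weighted-sum multi-criteria ranking, can be sketched as follows. The three remedial alternatives, their cost and residual-risk scores, and the weights are all hypothetical illustration values:

```python
def rank_alternatives(scores, weights):
    """Weighted-sum multi-criteria ranking: min-max normalize each criterion
    to [0, 1] (cost and risk are 'smaller is better'), then order the
    alternatives by their weighted aggregate score, best first."""
    criteria = list(weights)
    norm = {}
    for c in criteria:
        vals = [scores[a][c] for a in scores]
        lo, hi = min(vals), max(vals)
        span = (hi - lo) or 1.0
        # Smaller cost/risk maps to a score closer to 1.
        norm[c] = {a: (hi - scores[a][c]) / span for a in scores}
    agg = {a: sum(weights[c] * norm[c][a] for c in criteria) for a in scores}
    return sorted(agg, key=agg.get, reverse=True)

# Three hypothetical remedial alternatives scored on cost (M$) and
# residual individual risk (annual probability); weights sum to 1.
scores = {
    "no_action":   {"cost": 0.0, "risk": 1e-3},
    "pump_treat":  {"cost": 4.0, "risk": 1e-5},
    "containment": {"cost": 1.5, "risk": 1e-4},
}
ranked = rank_alternatives(scores, {"cost": 0.4, "risk": 0.6})
print(ranked)
```

With these weights the intermediate alternative wins: it buys most of the risk reduction at a fraction of the cost, which is the cost-effectiveness trade-off the abstract emphasizes. The paper's fuzzy-dominance approach replaces this crisp aggregation with fuzzy comparisons, but the normalize-weigh-rank skeleton is the same.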

  4. Analysis of Radiation Pneumonitis Risk Using a Generalized Lyman Model

    SciTech Connect

    Tucker, Susan L. Liu, H. Helen; Liao Zhongxing; Wei Xiong; Wang Shulian; Jin Hekun; Komaki, Ritsuko; Martel, Mary K.; Mohan, Radhe

    2008-10-01

    Purpose: To introduce a version of the Lyman normal-tissue complication probability (NTCP) model adapted to incorporate censored time-to-toxicity data and clinical risk factors and to apply the generalized model to analysis of radiation pneumonitis (RP) risk. Methods and Materials: Medical records and radiation treatment plans were reviewed retrospectively for 576 patients with non-small cell lung cancer treated with radiotherapy. The time to severe (Grade ≥3) RP was computed, with event times censored at last follow-up for patients not experiencing this endpoint. The censored time-to-toxicity data were analyzed using the standard and generalized Lyman models with patient smoking status taken into account. Results: The generalized Lyman model with patient smoking status taken into account produced NTCP estimates up to 27 percentage points different from the model based on dose-volume factors alone. The generalized model also predicted that 8% of the expected cases of severe RP were unobserved because of censoring. The estimated volume parameter for lung was not significantly different from n = 1, corresponding to mean lung dose. Conclusions: NTCP models historically have been based solely on dose-volume effects and binary (yes/no) toxicity data. Our results demonstrate that inclusion of nondosimetric risk factors and censored time-to-event data can markedly affect outcome predictions made using NTCP models.
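
The standard Lyman model referred to above reduces a dose-volume histogram to a generalized equivalent uniform dose (gEUD) and maps it through a probit curve. A minimal sketch; the TD50, m, and dose-volume values are illustrative assumptions, not this study's fitted parameters:

```python
import math

def geud(dose_bins, n):
    """Generalized equivalent uniform dose for (dose, volume-fraction)
    bins: gEUD = (sum_i v_i * D_i**(1/n))**n.  n = 1 gives mean dose."""
    return sum(v * d ** (1.0 / n) for d, v in dose_bins) ** n

def lyman_ntcp(dose_bins, td50, m, n):
    """Lyman NTCP: standard-normal CDF of t = (gEUD - TD50) / (m * TD50)."""
    t = (geud(dose_bins, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative dose-volume histogram: (dose in Gy, fraction of lung volume).
bins = [(5.0, 0.4), (15.0, 0.4), (30.0, 0.2)]
mld = geud(bins, n=1.0)          # with n = 1 this is the mean lung dose: 14 Gy
ntcp = lyman_ntcp(bins, td50=30.0, m=0.35, n=1.0)
print(mld, round(ntcp, 3))
```

The study's finding that n is not significantly different from 1 means the gEUD step collapses to the mean lung dose, as in the n = 1 call above; the generalization described in the abstract further replaces the binary endpoint with a censored time-to-toxicity likelihood.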

  5. Maximum likelihood estimation applied to multiepoch MEG/EEG analysis

    NASA Astrophysics Data System (ADS)

    Baryshnikov, Boris V.

    A maximum likelihood based algorithm for reducing the effects of spatially colored noise in evoked response MEG and EEG experiments is presented. The signal of interest is modeled as the low-rank mean, while the noise is modeled as a Kronecker product of spatial and temporal covariance matrices. The temporal covariance is assumed known, while the spatial covariance is estimated as part of the algorithm. In contrast to prestimulus-based whitening followed by principal component analysis, our algorithm does not require signal-free data for noise whitening and thus is more effective with non-stationary noise and produces better-quality whitening for a given data record length. The efficacy of this approach is demonstrated using simulated and real MEG data. Next, a study in which we characterize the MEG cortical response to coherent vs. incoherent motion is presented. It was found that coherent motion of the object induces not only an early sensory response around 180 ms relative to the stimulus onset but also a late field in the 250--500 ms range that has not been observed previously in similar random dot kinematogram experiments. The late field could not be resolved without signal processing using the maximum likelihood algorithm. The late activity localized to parietal areas, as would be expected. We believe that the late field corresponds to higher-order processing related to the recognition of the moving object against the background. Finally, a maximum likelihood based dipole fitting algorithm is presented. It is suitable for dipole fitting of evoked response MEG data in the presence of spatially colored noise. The method exploits the temporal multiepoch structure of the evoked response data to estimate the spatial noise covariance matrix from the section of data being fit, eliminating the stationarity assumption implicit in prestimulus-based whitening approaches. The preliminary results of the application of this algorithm to the simulated data show its

  6. Relative risk analysis of the use of radiation-emitting medical devices: A preliminary application

    SciTech Connect

    Jones, E.D.

    1996-06-01

    This report describes the development of a risk analysis approach for evaluating the use of radiation-emitting medical devices. This effort was performed by Lawrence Livermore National Laboratory for the US Nuclear Regulatory Commission (NRC). The assessment approach has been applied to understand the risks in using the Gamma Knife, a gamma irradiation therapy device. This effort represents an initial step to evaluate the potential role of risk analysis for developing regulations and quality assurance requirements in the use of nuclear medical devices. The risk approach identifies and assesses the most likely risk contributors and their relative importance for the medical system. The approach uses expert screening techniques and relative risk profiling to incorporate the type, quality, and quantity of data available and to present results in an easily understood form.

  7. Measuring and modelling pollution for risk analysis.

    PubMed

    Zidek, J V; Le, N D

    1999-01-01

    The great scale and complexity of environmental risk analysis offers major methodological challenges to those engaged in policymaking. In this paper we describe some of those challenges from the perspective gained through our work at the University of British Columbia (UBC). We describe some of our experiences with respect to the difficult problems of formulating environmental standards and developing abatement strategies. A failed but instructive attempt to find support for experiments on a promising method of reducing acid rain will be described. Then we describe an approach to scenario analysis under hypothetical new standards. Even with measurements of ambient environmental conditions in hand, the problem of inferring actual human exposures remains. For example, in very hot weather people will tend to stay inside, and population levels of exposure to, e.g., ozone could be well below those predicted by the ambient measurements. Setting air quality criteria should ideally recognize the discrepancies likely to arise. Computer models that incorporate spatial random pollution fields and predict actual exposures from ambient levels will be described. From there we turn to the statistical issues of measurement and modelling and some of the contributions in these areas by the UBC group and its partners elsewhere. In particular we discuss the problem of measurement error when non-linear regression models are used. We sketch our approach to imputing unmeasured predictors needed in such models, deferring details to references cited below. We describe in general terms how those imputed measurements and their errors can be accommodated within the framework of health impact analysis.

  8. Risk assessment framework of fate and transport models applied to hazardous waste sites

    SciTech Connect

    Hwang, S.T.

    1993-06-01

    Risk assessment is an increasingly important part of the decision-making process in the cleanup of hazardous waste sites. Despite guidelines from regulatory agencies and considerable research efforts to reduce uncertainties in risk assessments, there are still many issues unanswered. This paper presents new research results pertaining to fate and transport models, which will be useful in estimating exposure concentrations and will help reduce uncertainties in risk assessment. These developments include an approach for (1) estimating the degree of emissions and concentration levels of volatile pollutants during the use of contaminated water, (2) absorption of organic chemicals in the soil matrix through the skin, and (3) steady state, near-field, contaminant concentrations in the aquifer within a waste boundary.

  9. Risk factor detection for heart disease by applying text analytics in electronic medical records.

    PubMed

    Torii, Manabu; Fan, Jung-Wei; Yang, Wei-Li; Lee, Theodore; Wiley, Matthew T; Zisook, Daniel S; Huang, Yang

    2015-12-01

    In the United States, about 600,000 people die of heart disease every year. The annual cost of care services, medications, and lost productivity reportedly exceeds 108.9 billion dollars. Effective disease risk assessment is critical to prevention, care, and treatment planning. Recent advancements in text analytics have opened up new possibilities of using the rich information in electronic medical records (EMRs) to identify relevant risk factors. The 2014 i2b2/UTHealth Challenge brought together researchers and practitioners of clinical natural language processing (NLP) to tackle the identification of heart disease risk factors reported in EMRs. We participated in this track and developed an NLP system by leveraging existing tools and resources, both public and proprietary. Our system was a hybrid of several machine-learning and rule-based components. The system achieved an overall F1 score of 0.9185, with a recall of 0.9409 and a precision of 0.8972.
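    The reported overall F1 is the harmonic mean of the system's precision and recall, which can be checked directly:

    ```python
    def f1(precision, recall):
        """Harmonic mean of precision and recall."""
        return 2 * precision * recall / (precision + recall)

    print(round(f1(0.8972, 0.9409), 4))  # → 0.9185, matching the reported score
    ```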

  10. Applying predictive analytics to develop an intelligent risk detection application for healthcare contexts.

    PubMed

    Moghimi, Fatemeh Hoda; Cheung, Michael; Wickramasinghe, Nilmini

    2013-01-01

    Healthcare is an information-rich industry where successful outcomes require the processing of multi-spectral data and sound decision making. The exponential growth of data, and the big data issues that accompany it, coupled with a rapid increase in service demands in healthcare contexts today, require a robust framework enabled by IT (information technology) solutions as well as real-time service handling in order to ensure superior decision making and successful healthcare outcomes. Such a context is appropriate for the application of real-time intelligent risk detection decision support systems using predictive analytic techniques such as data mining. To illustrate the power and potential of data science technologies in healthcare decision-making scenarios, the use of an intelligent risk detection (IRD) model is proffered for the context of Congenital Heart Disease (CHD) in children, an area which requires complex high-risk decisions that must be made expeditiously and accurately in order to ensure successful healthcare outcomes.

  11. Rethinking prevention in primary care: applying the chronic care model to address health risk behaviors.

    PubMed

    Hung, Dorothy Y; Rundall, Thomas G; Tallia, Alfred F; Cohen, Deborah J; Halpin, Helen Ann; Crabtree, Benjamin F

    2007-01-01

    This study examines the Chronic Care Model (CCM) as a framework for preventing health risk behaviors such as tobacco use, risky drinking, unhealthy dietary patterns, and physical inactivity. Data were obtained from primary care practices participating in a national health promotion initiative sponsored by the Robert Wood Johnson Foundation. Practices owned by a hospital health system and exhibiting a culture of quality improvement were more likely to offer recommended services such as health risk assessment, behavioral counseling, and referral to community-based programs. Practices that had a multispecialty physician staff and staff dieticians, decision support in the form of point-of-care reminders and clinical staff meetings, and clinical information systems such as electronic medical records were also more likely to offer recommended services. Adaptation of the CCM for preventive purposes may offer a useful framework for addressing important health risk behaviors.

  12. Assessing population exposure for landslide risk analysis using dasymetric cartography

    NASA Astrophysics Data System (ADS)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on an accounting of the number or density of inhabitants, applied over statistical or administrative terrain units, such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units with a predominance of rural occupation. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are not the ideal setting when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. Therefore, this work aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and on dasymetric cartography (population by building). The study is developed in the Region North of Lisbon using 2011 population data and following three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of population distribution (absolute and density) for different administrative territorial units (parishes and BGRI - the basic statistical unit in the Portuguese Census); and iii) dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density differs by more than a factor of two depending on the application of the traditional approach or the dasymetric
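    Building areal weighting, the dasymetric method named in step iii), allocates each census unit's population to the buildings it contains in proportion to their footprint areas. A minimal sketch (the unit population and building areas below are hypothetical):

    ```python
    def areal_weighting(unit_population, building_areas):
        """Distribute a census unit's population across its buildings
        in proportion to each building's footprint area."""
        total = sum(building_areas.values())
        return {b: unit_population * a / total for b, a in building_areas.items()}

    # e.g. a parish of 300 inhabitants containing three buildings
    alloc = areal_weighting(300, {"b1": 100.0, "b2": 50.0, "b3": 50.0})
    ```

    Here building b1, with half the total footprint area, is assigned half the parish population, refining the uniform density implied by the administrative unit alone.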

  13. Applying the Triad method in a risk assessment of a former surface treatment and metal industry site.

    PubMed

    Ribé, Veronica; Aulenius, Elisabet; Nehrenheim, Emma; Martell, Ulrika; Odlare, Monica

    2012-03-15

    With a greater focus on soil protection in the E.U., the need for ecological risk assessment tools for cost-effective characterization of site contamination is increasing. One of the challenges in assessing the risk of soil contaminants is to accurately account for changes in mobility of contaminants over time, as a result of ageing. Improved tools for measuring the bioavailable and mobile fraction of contaminants are therefore highly desirable. In this study the Triad method was used to perform a risk characterization of a former surface treatment and metal industry in Eskilstuna, Sweden. The risk assessment confirmed the environmental risk of the most heavily contaminated sample and showed that the toxic effect was most likely caused by high metal concentrations. The assessment of the two soil samples with low to moderate metal contamination levels was more complex, as there was a higher deviation between the results from the three lines of evidence: chemistry, (eco)toxicology and ecology. For the slightly less contaminated sample of the two, a weighting of the results from the ecotoxicological LoE would be recommended in order to accurately determine the risk of the metal contamination at the sampling site, as the toxic effect detected in the Microtox® test and Ostracodtoxkit™ test was more likely to be due to oil contamination. The soil sample with higher total metal concentrations requires further ecotoxicological testing, as the integrated risk value indicated an environmental risk from metal contamination. The applied methodology, the Triad method, is considered appropriate for conducting improved environmental risk assessments in order to achieve sustainable remediation processes. PMID:21890272

  14. Beyond Time Out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2010-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  15. Applying Risk Science and Stakeholder Engagement to Overcome Environmental Barriers to Marine and Hydrokinetic Energy Projects

    SciTech Connect

    Copping, Andrea E.; Anderson, Richard M.; Van Cleve, Frances B.

    2010-09-20

    The production of electricity from the moving waters of the ocean has the potential to be a viable addition to the portfolio of renewable energy sources worldwide. The marine and hydrokinetic (MHK) industry faces many hurdles, including technology development, challenges of offshore deployments, and financing; however, the barrier most commonly identified by industry, regulators, and stakeholders is the uncertainty surrounding potential environmental effects of devices placed in the water and the permitting processes associated with real or potential impacts. Regulatory processes are not well positioned to judge the severity of harm due to turbines or wave generators. Risks from MHK devices to endangered or protected animals in coastal waters and rivers, as well as the habitats that support them, are poorly understood. This uncertainty raises concerns about catastrophic interactions between spinning turbine blades or slack mooring lines and marine mammals, birds and fish. In order to accelerate the deployment of tidal and wave devices, there is a need to sort through the extensive list of potential interactions that may cause harm to marine organisms and ecosystems, to set priorities for regulatory triggers, and to direct future research. Identifying the risk of MHK technology components on specific marine organisms and ecosystem components can separate perceived from real risk-relevant interactions. Scientists from Pacific Northwest National Laboratory (PNNL) are developing an Environmental Risk Evaluation System (ERES) to assess environmental effects associated with MHK technologies and projects through a systematic analytical process, with specific input from key stakeholder groups. The array of stakeholders interested in the development of MHK is broad, segmenting into those whose involvement is essential for the success of the MHK project, those that are influential, and those that are interested. PNNL and their partners have engaged these groups, gaining

  16. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, there are three common types of methods used for development of vulnerability functions of different elements at risk: empirical, analytical and expert estimations. The paper addresses the empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as the statistical data on building behavior during strong earthquakes presented in the different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate expected damage states of buildings and constructions in the case of earthquakes according to OSR-97B (return period T=1,000 years), big cities and towns were divided into unit sites whose coordinates were represented as dots located at the centers of the unit sites. The indexes obtained for each unit site were then summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: percentages of different damage states for settlements with fewer than 1,000 inhabitants, and vulnerability for cities and towns with more than 1,000 inhabitants. The hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipeline systems located in the highly active seismic zones in

  17. Invitational Theory and Practice Applied to Resiliency Development in At-Risk Youth

    ERIC Educational Resources Information Center

    Lee, R. Scott

    2012-01-01

    Resilience development is a growing field of study within the scholarly literature regarding social emotional achievement of at-risk students. Developing resiliency is based on the assumption that positive, pro-social, and/or strength-based values inherent in children and youth should be actively and intentionally developed. The core values of…

  18. Human health risk assessment of triclosan in land-applied biosolids.

    PubMed

    Verslycke, Tim; Mayfield, David B; Tabony, Jade A; Capdevielle, Marie; Slezak, Brian

    2016-09-01

    Triclosan (5-chloro-2-[2,4-dichlorophenoxy]-phenol) is an antimicrobial agent found in a variety of pharmaceutical and personal care products. Numerous studies have examined the occurrence and environmental fate of triclosan in wastewater, biosolids, biosolids-amended soils, and plants and organisms exposed to biosolid-amended soils. Triclosan has a propensity to adhere to organic carbon in biosolids and biosolid-amended soils. Land application of biosolids containing triclosan has the potential to contribute to multiple direct and indirect human health exposure pathways. To estimate exposures and human health risks from biosolid-borne triclosan, a risk assessment was conducted in general accordance with the methodology incorporated into the US Environmental Protection Agency's Part 503 biosolids rule. Human health exposures to biosolid-borne triclosan were estimated on the basis of published empirical data or modeled using upper-end environmental partitioning estimates. Similarly, a range of published triclosan human health toxicity values was evaluated. Margins of safety were estimated for 10 direct and indirect exposure pathways, both individually and combined. The present risk assessment found large margins of safety (>1000 to >100 000) for potential exposures to all pathways, even under the most conservative exposure and toxicity assumptions considered. The human health exposures and risks from biosolid-borne triclosan are concluded to be de minimis. Environ Toxicol Chem 2016;35:2358-2367. © 2016 SETAC. PMID:27552397

  19. Applying a Cognitive-Behavioral Model of HIV Risk to Youths in Psychiatric Care

    ERIC Educational Resources Information Center

    Donenberg, Geri R.; Schwartz, Rebecca Moss; Emerson, Erin; Wilson, Helen W.; Bryant, Fred B.; Coleman, Gloria

    2005-01-01

    This study examined the utility of cognitive and behavioral constructs (AIDS information, motivation, and behavioral skills) in explaining sexual risk taking among 172 12-20-year-old ethnically diverse urban youths in outpatient psychiatric care. Structural equation modeling revealed only moderate support for the model, explaining low to moderate…

  20. INTERPRETATION OF SPLP RESULTS FOR ASSESSING RISK TO GROUNDWATER FROM LAND-APPLIED GRANULAR WASTE

    EPA Science Inventory

    Scientists and engineers often rely on results from the synthetic precipitation leaching procedure (SPLP) to assess the risk of groundwater contamination posed by the land application of granular solid wastes. The concentrations of pollutants in SPLP leachate can be measured and ...

  1. At-Risk Students and Virtual Enterprise: Tourism and Hospitality Simulations in Applied and Academic Learning.

    ERIC Educational Resources Information Center

    Borgese, Anthony

    This paper discusses Virtual Enterprise (VE), a technology-driven business simulation program in which students conceive, create, and operate enterprises that utilize Web-based and other technologies to trade products and services around the world. The study examined the effects of VE on a learning community of at-risk students, defined as those…

  2. Supervised discretization can discover risk groups in cancer survival analysis.

    PubMed

    Gómez, Iván; Ribelles, Nuria; Franco, Leonardo; Alba, Emilio; Jerez, José M

    2016-11-01

    Discretization of continuous variables is a common practice in medical research to identify groups of patients at risk. This work compares the performance of the gold-standard categorization procedure (the TNM+A protocol) with that of three supervised discretization methods from machine learning (CAIM, ChiM and DTree) in the stratification of patients with breast cancer. The performance of the discretization algorithms was evaluated based on the results obtained after applying standard survival analysis procedures such as Kaplan-Meier curves, Cox regression and predictive modelling. The results show that the application of alternative discretization algorithms could provide clinicians with valuable information on the diagnosis and outcome of the disease. Patient data were collected from the Medical Oncology Service of the Hospital Clínico Universitario (Málaga, Spain) covering a follow-up period from 1982 to 2008. PMID:27686699
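    The Kaplan-Meier curves used to evaluate the discretization algorithms come from the product-limit estimator, which can be sketched as follows (a minimal implementation for illustration; libraries such as lifelines provide production versions):

    ```python
    def kaplan_meier(times, events):
        """Product-limit estimator of the survival function.
        times: event or censoring times; events: 1 if the event was
        observed, 0 if censored. Returns (time, S(t)) pairs at event times."""
        order = sorted(range(len(times)), key=lambda i: times[i])
        at_risk = len(times)
        s, curve = 1.0, []
        i = 0
        while i < len(order):
            t = times[order[i]]
            d = n_t = 0
            while i < len(order) and times[order[i]] == t:  # group tied times
                n_t += 1
                d += events[order[i]]
                i += 1
            if d:  # survival drops only at observed events
                s *= 1.0 - d / at_risk
                curve.append((t, s))
            at_risk -= n_t  # events and censorings both leave the risk set
        return curve
    ```

    Plotting one curve per risk group produced by each discretization method, and comparing their separation, is the kind of stratification assessment the abstract describes.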

  3. Risk analysis of sustainable urban drainage and irrigation

    NASA Astrophysics Data System (ADS)

    Ursino, Nadia

    2015-09-01

    Urbanization, by creating extended impervious areas to the detriment of vegetated ones, may have an undesirable influence on the water and energy balances of urban environments. The storage and infiltration capacity of the drainage system lessens the negative influence of urbanization, and vegetated areas help to re-establish pre-development environmental conditions. Resource limitation, a climate leading to increasing water scarcity, and demographic and socio-institutional shifts promote more integrated water management. Storm-water harvesting for landscape irrigation mitigates possible water restrictions for the urban population in drought scenarios. A new probabilistic model for sustainable rainfall drainage, storage and re-use systems was implemented in this study. Risk analysis of multipurpose storage capacities was generalized by the use of only a few dimensionless parameters and applied to a case study in a Mediterranean-type climate, although the applicability of the model is not restricted to any particular climatic type.

  4. Developing scenarios to assess future landslide risks: a model-based approach applied to mountainous regions

    NASA Astrophysics Data System (ADS)

    Vacquie, Laure; Houet, Thomas

    2016-04-01

    In the last century, European mountain landscapes have experienced significant transformations. Natural and anthropogenic changes, climate change, touristic and industrial development, socio-economic interactions, and their implications in terms of LUCC (land use and land cover changes) have directly influenced the spatial organization and vulnerability of mountain landscapes. This study is conducted as part of the SAMCO project funded by the French National Science Agency (ANR). It aims at developing a methodological approach, combining various tools, modelling platforms and methods, to identify regions vulnerable to landslide hazards while accounting for future LUCC. It presents an integrated approach combining participative scenarios and a LUCC simulation model to assess the combined effects of LUCC and climate change on landslide risks in the Cauterets valley (French Pyrenees) up to 2100. Through vulnerability and risk mapping, the objective is to gather information to support landscape planning and to implement land use strategies with local stakeholders for risk management. Four contrasting scenarios are developed, exhibiting contrasting trajectories of socio-economic development. Prospective scenarios are based on national and international socio-economic contexts relying on existing assessment reports. The methodological approach integrates knowledge from local stakeholders to refine each scenario during its construction and to reinforce its plausibility and relevance by accounting for local specificities, e.g. logging and pastoral activities, touristic development, urban planning, etc. A process-based model, the Forecasting Scenarios for Mountains (ForeSceM) model, developed on the Dinamica Ego modelling platform, is used to spatially allocate future LUCC for each prospective scenario. Concurrently, a spatial decision support tool, the SYLVACCESS model, is used to identify accessible areas for forestry in scenarios projecting logging

  5. Locating and applying sociological theories of risk-taking to develop public health interventions for adolescents

    PubMed Central

    Pound, Pandora; Campbell, Rona

    2015-01-01

    Sociological theories seldom inform public health interventions at the community level. The reasons for this are unclear but may include difficulties in finding, understanding or operationalising theories. We conducted a study to explore the feasibility of locating sociological theories within a specific field of public health, adolescent risk-taking, and to consider their potential for practical application. We identified a range of sociological theories. These explained risk-taking: (i) as being due to lack of social integration; (ii) as a consequence of isolation from mainstream society; (iii) as a rite of passage; (iv) as a response to social constraints; (v) as resistance; (vi) as an aspect of adolescent development; (vii) by the theory of the ‘habitus’; (viii) by situated rationality and social action theories; and (ix) as social practice. We consider these theories in terms of their potential to inform public health interventions for young people. PMID:25999784

  6. Approaches to risk-adjusting outcome measures applied to criminal justice involvement after community service.

    PubMed

    Banks, S M; Pandiani, J A; Bramley, J

    2001-08-01

    The ethic of fairness in program evaluation requires that measures of behavioral health agency performance be sensitive to differences in those agencies' caseload composition. The authors describe two traditional approaches to the statistical risk adjustment of outcome measures (stratification weighting and pre-post measurement) that are designed to account for differences in caseload composition and introduce a method that incorporates the strengths of both approaches. Procedures for deriving each of these measures are described in detail and demonstrated in the evaluation of a statewide system of community-based behavioral health care programs. This evaluation examines the degree to which service recipients get into trouble with the law after treatment. Three measures are recommended for inclusion in outcome-oriented "report cards," and the interpretation of each measure is discussed. Finally, the authors suggest formats for graphic and tabular presentation of the risk-adjusted evaluation for sharing findings with diverse stakeholder groups. PMID:11497020
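    Of the two traditional approaches, stratification weighting amounts to direct standardization: each agency's stratum-specific outcome rates are weighted by a common reference caseload mix, so that differences in caseload composition do not drive the comparison. A minimal sketch (the stratum names, rates, and mix below are hypothetical):

    ```python
    def standardized_rate(stratum_rates, reference_mix):
        """Direct standardization: weight each stratum's outcome rate by
        the reference population's share of that stratum.
        reference_mix values should sum to 1."""
        return sum(reference_mix[s] * r for s, r in stratum_rates.items())

    # An agency whose caseload-specific rates are 10% (low-risk clients)
    # and 40% (high-risk clients), standardized to a 50/50 reference mix
    rate = standardized_rate({"low": 0.10, "high": 0.40},
                             {"low": 0.50, "high": 0.50})
    ```

    Two agencies with identical stratum-specific rates then receive identical standardized rates regardless of how many low- versus high-risk clients each serves.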

  7. Environmental Risk Assessment of antimicrobials applied in veterinary medicine-A field study and laboratory approach.

    PubMed

    Slana, Marko; Dolenc, Marija Sollner

    2013-01-01

    The fate and environmental risk of antimicrobial compounds from different groups of veterinary medicinal products (VMPs) have been compared. The aim was to demonstrate a correlation between the physical and chemical properties of active compounds and their metabolism in target animals, as well as their fate in the environment. In addition, the importance of manure management techniques and agricultural practice, and their influence on the fate of active compounds, is discussed. The selected active compounds are shown to be susceptible to at least one environmental factor (sunlight, water, bacterial or fungal degradation) to which they are exposed during their life cycle, which contributes to their degradation. Degradation under a combination of environmental factors also has to be considered as information complementary to that observed under the limited conditions of laboratory studies and in Environmental Risk Assessment calculations. PMID:23274419

  8. Applying an Environmental Model to Address High-Risk Drinking: A Town/Gown Case Study

    ERIC Educational Resources Information Center

    Bishop, John B.; Downs, Tracy T.; Cohen, Deborah

    2008-01-01

    This article provides a case study of a project by the University of Delaware and the City of Newark to apply an environmental model to address the excessive use of alcohol by college students. Data about changes in the behavior and experiences of students over a 10-year period are compared. The authors discuss some of the practical implications…

  9. [How to apply follow-up in relation to risk group].

    PubMed

    Montanaro, Vittorino; Di Girolamo, Antonio; Ferro, Matteo; Altieri, Vincenzo

    2013-01-01

    The term 'non-muscle-invasive bladder cancer' identifies a heterogeneous disease, owing to the different natural histories of its various presentations. The T1 stage represents an unpredictable population, which might respond to non-operative treatment strategies or require more aggressive treatment in order to avoid progression to invasive, and possibly metastatic, stages. In the first year following transurethral resection of the bladder (TURB), tumor recurrence is seen in up to 45% of the population; of this, 15% may progress to muscle-invasive or metastatic disease, or both. In order to control recurrence and progression and to identify invasive tumors at the earliest possible stage, it is necessary to define follow-up according to individual patient risk assessment. To obtain exact staging, besides a proper transurethral resection of the bladder, a restaging transurethral resection should be performed in T1 patients. Data from the literature support the immediate postoperative intravesical instillation of different chemotherapeutic agents in low-risk patients. Multifocal papillary lesions might require a more intensive adjuvant regimen, whereas intravesical immunotherapy using Bacillus Calmette-Guérin is recommended in patients at high risk of progression. Early cystectomy should be considered in patients with recurrent T1 tumors or refractory carcinoma in situ to avoid unfavorable tumor progression.

  10. Cumulative Benefit Analysis for Ranking Risk Reduction Actions

    SciTech Connect

    Leverenz, Fred L.; Aysa Jimenez, Julio

    2007-04-25

The Hazard and Operability (HAZOP) study approach, and other similar methods, are very effective ways to qualitatively identify a comprehensive set of accident scenarios for a facility. If these analyses are modified to incorporate a simple system for evaluating relative risk, such as an order-of-magnitude scoring system, the resultant study can be a very powerful input to developing risk reduction strategies. By adding Risk Reduction Worth (RRW) evaluations for all accident Causes, Safeguards, and proposed Action Items, an analyst can formulate a strategy that selects the minimal set of risk reduction actions maximizing risk reduction. One such strategy, termed Cumulative Risk Benefit Analysis, involves iteratively postulating risk reduction actions and re-evaluating RRW until the residual risk reaches a tolerable value. This concept was developed for the evaluation of a set of pipeline pumping stations, and provided valuable insight into how to reduce risk in a sensible, prioritized fashion.
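
    The selection strategy described — postulate actions, evaluate Risk Reduction Worth, and iterate until residual risk is tolerable — can be sketched as a greedy loop over order-of-magnitude risk scores. A minimal sketch; the scenario names, scores, and action effects are hypothetical, not from the study.

```python
def select_actions(scenario_risk, actions, tolerable_risk):
    """Greedily pick risk-reduction actions until residual risk is tolerable.

    scenario_risk: dict scenario -> risk score (e.g. order-of-magnitude 10**n)
    actions: dict action -> {scenario: reduced risk score if action taken}
    """
    residual = dict(scenario_risk)
    chosen = []
    while sum(residual.values()) > tolerable_risk:
        # Risk Reduction Worth of an action: total risk it would remove now
        best = max(
            (a for a in actions if a not in chosen),
            key=lambda a: sum(residual[s] - min(residual[s], r)
                              for s, r in actions[a].items()),
            default=None,
        )
        if best is None:
            break  # no further reduction possible
        for s, r in actions[best].items():
            residual[s] = min(residual[s], r)
        chosen.append(best)
    return chosen, residual
```

    Re-evaluating RRW inside the loop matters: once one action has driven a scenario's risk down, a second action addressing the same scenario is worth less and may no longer be selected.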

  11. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Credit risk premium analysis. 260.17 Section 260.17 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are...

  12. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Credit risk premium analysis. 260.17 Section 260.17 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are...

  13. Virtues and Limitations of Risk Analysis

    ERIC Educational Resources Information Center

    Weatherwax, Robert K.

    1975-01-01

After summarizing the Rasmussen Report, the author reviews the probabilistic portion of the report from the perspectives of engineering utility and risk assessment uncertainty. The author shows that the report may represent both a significant step forward in the assurance of reactor safety and an imperfect measure of actual reactor risk. (BT)

  14. Risk analysis for worker exposure to benzene

    NASA Astrophysics Data System (ADS)

    Hallenbeck, William H.; Flowers, Roxanne E.

    1992-05-01

Cancer risk factors (characterized by route, dose, dose rate per kilogram, fraction of lifetime exposed, species, and sex) were derived for workers exposed to benzene via inhalation or ingestion. Exposures at the current Occupational Safety and Health Administration (OSHA) permissible exposure limit (PEL) and at leaking underground storage tank (LUST) sites were evaluated. At the current PEL of 1 ppm, the theoretical lifetime excess risk of cancer from benzene inhalation is 10 per 1000. The theoretical lifetime excess risk for worker inhalation exposure at LUST sites ranged from 10 to 40 per 1000. These results indicate that personal protection should be required. The theoretical lifetime excess risk due to soil ingestion is five to seven orders of magnitude less than the inhalation risks.
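
    The quoted figures imply a roughly linear scaling of theoretical lifetime excess risk with airborne concentration. A minimal sketch of that proportionality, assuming linearity: the unit-risk value is back-calculated from the 10-per-1000 figure at the 1 ppm PEL, and the 4 ppm LUST-site concentration is a hypothetical input chosen to reproduce the quoted upper bound, not a value from the paper.

```python
def lifetime_excess_risk(concentration_ppm, unit_risk_per_ppm=0.01):
    """Theoretical lifetime excess cancer risk under a linear
    (no-threshold) dose-response assumption.

    unit_risk_per_ppm is back-calculated from the abstract's figure of
    10 per 1000 at the 1 ppm OSHA PEL; it is illustrative only."""
    return concentration_ppm * unit_risk_per_ppm

pel_risk = lifetime_excess_risk(1.0)   # 0.01, i.e. 10 per 1000 at the PEL
lust_risk = lifetime_excess_risk(4.0)  # 0.04, i.e. 40 per 1000 (hypothetical 4 ppm)
```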

  15. Risk Management in Coastal Engineering - Applied Research Projects for the German Wadden Sea

    NASA Astrophysics Data System (ADS)

    Woeffler, T.; Grimm, C.; Bachmann, D.; Jensen, J.; Mudersbach, C.; Froehle, P.; Thorenz, F.; Schuettrumpf, H.

    2012-04-01

Several islands in the North Frisian part of the UNESCO World Natural Heritage Wadden Sea are exposed to extreme storm surges intensified by climate change and sea level rise. Existing coastal protection measures in this area do not consider the future sea state and are based mainly on tradition and expert knowledge. The two projects HoRisK and ZukunftHallig (supported by the German Coastal Engineering Research Council) focus on this area and implement the requirements defined in Directive 2007/60/EC on the assessment and management of flood risk. The main objectives of the projects are the design and evaluation of new coastal protection techniques for the investigation area. Hydrological parameters are investigated with numerical simulations in order to design new coastal protection and management strategies. The decision support system PROMAIDES (Protection Measure against Inundation Decision Support), developed at the Institute of Hydraulic Engineering and Water Resources Management of RWTH Aachen University, analyzes the effects and reliability of new coastal protection techniques and evaluates inundation areas and economic damages for different hydrological boundary conditions. As a result, flood risk and hazard maps are presented in this work. Furthermore, sensitivity analyses explore possible variations in future storm surges and illustrate the differences in significant wave heights for varying wind climates. The risk-based approach of both projects is a suitable way to ensure that life on these islands remains possible for future generations under sustainable ecological and economic conditions. Acknowledgments: this work was supported by the KFKI (German Coastal Engineering Research Council) and the German Federal Ministry of Education and Research (BMBF) (Project Nos. 03KIS094 and 03KIS078).

  16. Viral metagenomics applied to blood donors and recipients at high risk for blood-borne infections

    PubMed Central

    Sauvage, Virginie; Laperche, Syria; Cheval, Justine; Muth, Erika; Dubois, Myriam; Boizeau, Laure; Hébert, Charles; Lionnet, François; Lefrère, Jean-Jacques; Eloit, Marc

    2016-01-01

Background Characterisation of human-associated viral communities is essential for epidemiological surveillance and to anticipate new potential threats to blood transfusion safety. In high-resource countries, the risk of blood-borne transmission of well-known viruses (HBV, HCV, HIV and HTLV) is currently considered to be under control. However, other unknown or unsuspected viruses may be transmitted to recipients by blood-derived products. To investigate this, the virome of plasma from individuals at high risk for parenterally and sexually transmitted infections was analysed by high-throughput sequencing (HTS). Materials and methods Purified nucleic acids from two pools of 50 samples from recipients of multiple transfusions, and three pools containing seven plasma samples from either HBV-, HCV- or HIV-infected blood donors, were submitted to HTS. Results Sequences from resident anelloviruses and HPgV were evidenced in all pools. HBV and HCV sequences were detected in pools containing 3.8×10³ IU/mL of HBV-DNA and 1.7×10⁵ IU/mL of HCV-RNA, respectively, whereas no HIV sequence was found in a pool with 150 copies/mL of HIV-RNA. This suggests a lack of sensitivity of HTS in detecting low levels of virus. In addition, this study identified other issues, including laboratory contaminants and the uncertainty of taxonomic assignment of short sequences. No sequence suggestive of a new viral species was identified. Discussion This study did not identify any new blood-borne virus in high-risk individuals. However, rare viruses and/or viruses present at very low titre could have escaped our protocol. Our results demonstrate the positive contribution of HTS to the detection of viral sequences in blood donations. PMID:27136432

  17. EC Transmission Line Risk Identification and Analysis

    SciTech Connect

    Bigelow, Tim S

    2012-04-01

The purpose of this document is to assist in evaluating and planning for the cost, schedule, and technical project risks associated with the delivery and operation of the EC (electron cyclotron) transmission line system. In general, the major risks anticipated during the project delivery phase of the Procurement Arrangement for the EC transmission line system are: (1) undefined or changing requirements (e.g., functional or regulatory requirements); (2) underperformance of prototype, first-unit, or production components during testing; (3) unavailability of qualified vendors for critical components. Technical risks associated with the design and operation of the system are also identified.

  18. Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.

    PubMed

    Xin, Cao; Chongshi, Gu

    2016-01-01

Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. A stability failure risk ratio described jointly by probability and possibility falls short in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. Stability of a gravity dam is viewed as a hybrid event, considering both the fuzziness and the randomness of the failure criterion, design parameters, and measured data. A credibility distribution function is introduced as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to risk calculation for both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing the stability failure risk of gravity dams. The risk assessment obtained can reflect the influence of both sorts of uncertainty, and is suitable as an index value.
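
    Setting the fuzzy (credibility) component aside, the Monte Carlo core of such a stability risk calculation can be sketched for a sliding limit state: sample the uncertain resistance parameters and count the samples in which resisting force falls below driving force. The limit-state form, the distributions, and all load values below are illustrative assumptions, not the paper's.

```python
import random

def sliding_failure_ratio(n_samples=100_000, seed=1):
    """Monte Carlo estimate of failure probability for a sliding limit state.

    Illustrative limit state: resisting force f*(W - U) + c*A versus a
    driving force H. Friction coefficient f and cohesion c are sampled
    from assumed normal distributions; the loads are held fixed.
    """
    rng = random.Random(seed)
    W, U, A, H = 5000.0, 1200.0, 80.0, 2200.0  # weight, uplift, area, thrust (assumed)
    failures = 0
    for _ in range(n_samples):
        f = rng.gauss(0.65, 0.08)  # friction coefficient (assumed distribution)
        c = rng.gauss(8.0, 2.0)    # cohesion, kPa (assumed distribution)
        if f * (W - U) + c * A < H:
            failures += 1
    return failures / n_samples
```

    In the paper's hybrid setting, the sampled inputs would instead be drawn according to credibility distributions so that both randomness and fuzziness enter the estimated ratio.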

  20. Point pattern analysis with spatially varying covariate effects, applied to the study of cerebrovascular deaths.

    PubMed

    Pinto Junior, Jony Arrais; Gamerman, Dani; Paez, Marina Silva; Fonseca Alves, Regina Helena

    2015-03-30

This article proposes a modeling approach for handling spatial heterogeneity present in the study of the geographical pattern of deaths due to cerebrovascular disease. The framework involves a point pattern analysis with components exhibiting spatial variation. Preliminary studies indicate that mortality of this disease and the effect of relevant covariates do not exhibit uniform geographic distribution. Our model extends a previously proposed model in the literature that uses spatial and non-spatial variables by allowing for spatial variation of the effect of non-spatial covariates. A number of relative risk indicators are derived by comparing different covariate levels, different geographic locations, or both. The methodology is applied to the study of the geographical death pattern of cerebrovascular deaths in the city of Rio de Janeiro. The results compare well against existing alternatives, including fixed covariate effects. Our model is able to capture and highlight important data information that would not be noticed otherwise, providing information that is required for appropriate health decision-making.

  1. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Hanson, J. M.; Beard, B. B.

    2010-01-01

    This Technical Publication (TP) is meant to address a number of topics related to the application of Monte Carlo simulation to launch vehicle design and requirements analysis. Although the focus is on a launch vehicle application, the methods may be applied to other complex systems as well. The TP is organized so that all the important topics are covered in the main text, and detailed derivations are in the appendices. The TP first introduces Monte Carlo simulation and the major topics to be discussed, including discussion of the input distributions for Monte Carlo runs, testing the simulation, how many runs are necessary for verification of requirements, what to do if results are desired for events that happen only rarely, and postprocessing, including analyzing any failed runs, examples of useful output products, and statistical information for generating desired results from the output data. Topics in the appendices include some tables for requirements verification, derivation of the number of runs required and generation of output probabilistic data with consumer risk included, derivation of launch vehicle models to include possible variations of assembled vehicles, minimization of a consumable to achieve a two-dimensional statistical result, recontact probability during staging, ensuring duplicated Monte Carlo random variations, and importance sampling.
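
    One topic the TP covers — how many runs are necessary for verification of requirements at a given consumer risk — has a standard zero-failure form that can be sketched independently of the TP's own derivation (which may differ): the smallest N of all-successful runs such that reliability**N ≤ 1 − confidence.

```python
import math

def runs_required(reliability, confidence):
    """Smallest N such that N successes in N Monte Carlo runs demonstrates
    the given success probability at the given confidence level
    (zero-failure binomial bound): reliability**N <= 1 - confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))
```

    For example, demonstrating a 90% success probability at 90% confidence needs 22 all-successful runs, and a 99.7% success probability at 90% confidence needs several hundred; allowing observed failures, as the TP does, requires the fuller binomial treatment.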

  2. Applying Transactional Analysis and Personality Assessment to Improve Patient Counseling and Communication Skills

    PubMed Central

    Lawrence, Lesa

    2007-01-01

    Objective To teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling to improve communication. Design A lecture series for a required pharmacy communications class was developed to teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling. Students were asked to apply these techniques and to report their experiences. A personality self-assessment was also conducted. Assessment After attending the lecture series, students were able to apply the techniques and demonstrated an understanding of the psychological factors that may affect patient communication, an appreciation for the diversity created by different personality types, the ability to engage patients based on adult-to-adult interaction cues, and the ability to adapt the interactive patient counseling model to different personality traits. Conclusion Students gained a greater awareness of transactional analysis and personality assessment by applying these concepts. This understanding will help students communicate more effectively with patients. PMID:17786269

  3. Capability for Integrated Systems Risk-Reduction Analysis

    NASA Technical Reports Server (NTRS)

    Mindock, J.; Lumpkins, S.; Shelhamer, M.

    2016-01-01

NASA's Human Research Program (HRP) is working to increase the likelihood of human health and performance success during long-duration missions, and of long-term crew health afterwards. To achieve these goals, there is a need to develop an integrated understanding of how the complex human physiological-socio-technical mission system behaves in spaceflight. This understanding will allow HRP to provide cross-disciplinary spaceflight countermeasures while minimizing resources such as mass, power, and volume. This understanding will also allow development of tools to assess the state of and enhance the resilience of individual crewmembers, teams, and the integrated mission system. We will discuss a set of risk-reduction questions that has been identified to guide the systems approach necessary to meet these needs. In addition, a framework of factors influencing human health and performance in space, called the Contributing Factor Map (CFM), is being applied as the backbone for incorporating information addressing these questions from sources throughout HRP. Using the common language of the CFM, information from sources such as the Human System Risk Board summaries, Integrated Research Plan, and HRP-funded publications has been combined and visualized in ways that allow insight into cross-disciplinary interconnections in a systematic, standardized fashion. We will show examples of these visualizations. We will also discuss applications of the resulting analysis capability that can inform science portfolio decisions, such as areas in which cross-disciplinary solicitations or countermeasure development will potentially be fruitful.

  4. Fire behavior and risk analysis in spacecraft

    NASA Technical Reports Server (NTRS)

    Friedman, Robert; Sacksteder, Kurt R.

    1988-01-01

    Practical risk management for present and future spacecraft, including space stations, involves the optimization of residual risks balanced by the spacecraft operational, technological, and economic limitations. Spacecraft fire safety is approached through three strategies, in order of risk: (1) control of fire-causing elements, through exclusion of flammable materials for example; (2) response to incipient fires through detection and alarm; and (3) recovery of normal conditions through extinguishment and cleanup. Present understanding of combustion in low gravity is that, compared to normal gravity behavior, fire hazards may be reduced by the absence of buoyant gas flows yet at the same time increased by ventilation flows and hot particle expulsion. This paper discusses the application of low-gravity combustion knowledge and appropriate aircraft analogies to fire detection, fire fighting, and fire-safety decisions for eventual fire-risk management and optimization in spacecraft.

  5. Research in progress in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1990-01-01

Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem-solving capabilities in science and engineering, particularly in aeronautics and space.

  6. Risk analysis of an RTG on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Frank, Michael V.

    1991-01-01

As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermoelectric Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties in the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release in the calculated launch and deployment accident scenarios is low.

  7. Elder Abuse by Adult Children: An Applied Ecological Framework for Understanding Contextual Risk Factors and the Intergenerational Character of Quality of Life.

    ERIC Educational Resources Information Center

    Schiamberg, Lawrence B.; Gans, Daphna

    2000-01-01

    Using an applied ecological model, this study focuses on contextual risk factors of elder abuse. Five levels of environment were used to interpret existing research on risk factors. Configuration of risk factors provides a framework for understanding the intergenerational character of quality of life for older adults, developing recommendations…

  8. Environmental risk assessment of replication competent viral vectors applied in clinical trials: potential effects of inserted sequences.

    PubMed

    van den Akker, Eric; van der Vlugt, Cecile J B; Bleijs, Diederik A; Bergmans, Hans E

    2013-12-01

    Risk assessments of clinical applications involving genetically modified viral vectors are carried out according to general principles that are implemented in many national and regional legislations, e.g., in Directive 2001/18/EC of the European Union. Recent developments in vector design have a large impact on the concepts that underpin the risk assessments of viral vectors that are used in clinical trials. The use of (conditionally) replication competent viral vectors (RCVVs) may increase the likelihood of the exposure of the environment around the patient, compared to replication defective viral vectors. Based on this assumption we have developed a methodology for the environmental risk assessment of replication competent viral vectors, which is presented in this review. Furthermore, the increased likelihood of exposure leads to a reevaluation of what would constitute a hazardous gene product in viral vector therapies, and a keen interest in new developments in the inserts used. One of the trends is the use of inserts produced by synthetic biology. In this review the implications of these developments for the environmental risk assessment of RCVVs are highlighted, with examples from current clinical trials. The conclusion is drawn that RCVVs, notwithstanding their replication competency, can be applied in an environmentally safe way, in particular if adequate built-in safeties are incorporated, like conditional replication competency, as mitigating factors to reduce adverse environmental effects that could occur.

  9. Applying Ecodevelopmental Theory and the Theory of Reasoned Action to Understand HIV Risk Behaviors Among Hispanic Adolescents.

    PubMed

    Ortega, Johis; Huang, Shi; Prado, Guillermo

    2012-01-01

HIV/AIDS is listed among the top 10 causes of death for Hispanics between the ages of 15 and 54 in the United States. This cross-sectional, descriptive secondary study proposed that using both the systemic (ecodevelopmental) and the individually focused (theory of reasoned action) theories together would lead to an increased understanding of the risk and protective factors that influence HIV risk behaviors in this population. The sample consisted of 493 Hispanic adolescent 7th and 8th graders and their immigrant parents living in Miami, Florida. Structural equation modeling (SEM) was used for the data analysis. Family functioning emerged as the heart of the model, embedded within a web of direct and mediated relationships. The data support the idea that the family can play a central role in the prevention of Hispanic adolescents' risk behaviors.

  10. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    SciTech Connect

    Kolotilina, L.; Nikishin, A.; Yeremin, A.

    1994-12-31

The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector lengths and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial-class finite element analysis of structures. The construction and application of high quality preconditioners constitutes a high percentage of the total solution time, and parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are implicit and explicit. Implicit preconditioners (e.g., incomplete factorizations of several types) are generally of high quality but require the solution of lower and upper triangular systems of equations at each iteration, which is difficult to parallelize without deteriorating the convergence rate. Explicit preconditioners (e.g., polynomial or Jacobi-like preconditioners) require only sparse matrix-vector multiplications and can be parallelized, but their preconditioning quality is less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction, without increasing the serial complexity.
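
    The abstract's key contrast — implicit preconditioners need triangular solves per iteration, explicit ones need only matrix-vector products — can be illustrated with a preconditioned conjugate gradient sketch in which the preconditioner is applied as a plain matvec. The diagonal approximate inverse below is the simplest (Jacobi-like) special case of an explicit preconditioner, not the authors' FSAI; the matrix and right-hand side are arbitrary illustrative values.

```python
def pcg(A, b, apply_Minv, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradients for a dense SPD matrix A
    (list of lists). apply_Minv(r) applies the approximate inverse as
    a plain matrix-vector product -- the parallelism-friendly property
    of explicit preconditioners such as FSAI."""
    n = len(b)
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(x * y for x, y in zip(u, v))
    x = [0.0] * n
    r = b[:]
    z = apply_Minv(r)
    p = z[:]
    rz = dot(r, z)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rz / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if dot(r, r) ** 0.5 < tol:
            break
        z = apply_Minv(r)
        rz_new = dot(r, z)
        beta = rz_new / rz
        p = [zi + beta * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

# Diagonal approximate inverse: a Jacobi-like explicit preconditioner,
# applied with elementwise products only (no triangular solves).
A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
Minv_diag = [1.0 / A[i][i] for i in range(3)]
x = pcg(A, b, lambda r: [d * ri for d, ri in zip(Minv_diag, r)])
```

    A real FSAI preconditioner replaces the diagonal with a sparse triangular factor G chosen so that GᵀG approximates A⁻¹, keeping the matvec-only application while greatly improving quality.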

  11. Comparing risk in conventional and organic dairy farming in the Netherlands: an empirical analysis.

    PubMed

    Berentsen, P B M; Kovacs, K; van Asseldonk, M A P M

    2012-07-01

    This study was undertaken to contribute to the understanding of why most dairy farmers do not convert to organic farming. Therefore, the objective of this research was to assess and compare risks for conventional and organic farming in the Netherlands with respect to gross margin and the underlying price and production variables. To investigate the risk factors a farm accountancy database was used containing panel data from both conventional and organic representative Dutch dairy farms (2001-2007). Variables with regard to price and production risk were identified using a gross margin analysis scheme. Price risk variables were milk price and concentrate price. The main production risk variables were milk yield per cow, roughage yield per hectare, and veterinary costs per cow. To assess risk, an error component implicit detrending method was applied and the resulting detrended standard deviations were compared between conventional and organic farms. Results indicate that the risk included in the gross margin per cow is significantly higher in organic farming. This is caused by both higher price and production risks. Price risks are significantly higher in organic farming for both milk price and concentrate price. With regard to production risk, only milk yield per cow poses a significantly higher risk in organic farming.
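
    The paper's risk measure is a detrended standard deviation. A minimal version can be sketched with ordinary least-squares detrending as a stand-in for the error-component implicit detrending method (the substitution is mine, not the paper's): fit a linear time trend to each series and take the standard deviation of the residuals.

```python
def detrended_std(series):
    """Standard deviation of residuals after removing a linear time
    trend by ordinary least squares. Measuring variability around the
    trend, rather than raw variability, avoids counting a steady price
    or yield trend as risk."""
    n = len(series)
    t_mean = (n - 1) / 2.0
    y_mean = sum(series) / n
    sxx = sum((t - t_mean) ** 2 for t in range(n))
    sxy = sum((t - t_mean) * (y - y_mean) for t, y in zip(range(n), series))
    slope = sxy / sxx
    resid = [y - (y_mean + slope * (t - t_mean)) for t, y in enumerate(series)]
    return (sum(e * e for e in resid) / n) ** 0.5
```

    A steadily rising milk price then shows near-zero detrended risk, while a fluctuating one does not; comparing these detrended deviations between the conventional and organic panels mirrors the paper's comparison.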

  12. [Disinfection of water: on the need for analysis and solution of fundamental and applied problems].

    PubMed

    Mokienko, A V

    2014-01-01

This paper presents an analysis of the hygienic, medical, and environmental aspects of water disinfection, exemplified by chlorine and chlorine dioxide (CD). The concept of persistent multivariate risk for aquatic pathogens is proposed, together with an original view of the mechanism by which bacteria form chlorine resistance under the influence of biocides, based on a two-step process of informational and spatial interaction between receptor and substrate, and a hypothesis of a hormetic stimulating effect of residual active chlorine (in combination with other factors) on the growth of aquatic pathogens. The growing significance of halogen-containing compounds (HCC) as byproducts of water chlorination, in terms of their potential danger as toxicants and carcinogens, is substantiated. Analysis of the hygienic, medical, and environmental aspects of the use of chlorine dioxide as a means of water disinfection allows its biocidal chemistry to be justified, along with the mechanisms of its bactericidal, virucidal, protozoocidal, sporicidal, and algicidal actions, removal of biofilms, and formation of disinfection byproducts. Chlorine dioxide was shown both to provide the epidemic safety of drinking water, owing to its high virucidal, bactericidal, and mycocidal action, and to be toxicologically harmless with respect to its influence on laboratory animals as well as on aquatic organisms receiving discharged disinfected wastewater. The necessity of a close relationship between fundamental and applied research is demonstrated: the former for in-depth study of the microbiological, molecular-genetic, and epidemiological problems of water disinfection (chlorination), and the latter for the introduction of alternative, including combined, technologies for water treatment and disinfection. PMID:24749274

  14. System Analysis Applied to Autonomy: Application to High-Altitude Long-Endurance Remotely Operated Aircraft

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.

    2006-01-01

    Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.

  15. Risk assessment methodology applied to counter IED research & development portfolio prioritization

    SciTech Connect

    Shevitz, Daniel W; O' Brien, David A; Zerkle, David K; Key, Brian P; Chavez, Gregory M

    2009-01-01

    In an effort to protect the United States from the ever-increasing threat of domestic terrorism, the Department of Homeland Security, Science and Technology Directorate (DHS S&T) has significantly increased research activities to counter the terrorist use of explosives. Moreover, DHS S&T has established a robust Counter-Improvised Explosive Device (C-IED) Program to Deter, Predict, Detect, Defeat, and Mitigate this imminent threat to the Homeland. The DHS S&T portfolio is complicated and changing. In order to provide the "best answer" for the available resources, DHS S&T would like a "risk-based" process for making funding decisions. There is a definite need for a methodology to compare very different types of technologies on a common basis. A methodology was developed that allows users to evaluate a new "quad chart" and rank it against all other quad charts across S&T divisions. It couples a logic model with an evidential reasoning model, using an Excel spreadsheet containing weights for the subjective merits of different technologies, and produces an Excel spreadsheet containing the aggregate rankings of the different technologies. It uses Extensible Logic Modeling (ELM) for logic models combined with LANL software called INFTree for evidential reasoning.
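    The weight-based aggregation at the core of such a ranking can be sketched as follows. The mission-area weights, proposal names, and scores below are hypothetical stand-ins, and this is a plain weighted sum rather than the ELM/INFTree evidential-reasoning machinery itself:

```python
# Hypothetical mission-area weights and proposal scores (0-10); a plain
# weighted sum standing in for the evidential-reasoning aggregation step.
WEIGHTS = {"deter": 0.15, "predict": 0.20, "detect": 0.30,
           "defeat": 0.20, "mitigate": 0.15}

proposals = {
    "standoff detector": {"deter": 3, "predict": 2, "detect": 9,
                          "defeat": 1, "mitigate": 2},
    "blast mitigation":  {"deter": 1, "predict": 1, "detect": 1,
                          "defeat": 4, "mitigate": 9},
    "network analysis":  {"deter": 4, "predict": 8, "detect": 3,
                          "defeat": 2, "mitigate": 1},
}

def aggregate(scores):
    """Weighted merit of one proposal's quad chart."""
    return sum(WEIGHTS[k] * v for k, v in scores.items())

ranking = sorted(proposals, key=lambda p: aggregate(proposals[p]), reverse=True)
print(ranking)  # ['standoff detector', 'network analysis', 'blast mitigation']
```

    The point of the common weight set is that very different technologies end up on one comparable scale.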

  16. Exposure-driven risk assessment: applying exposure-based waiving of toxicity tests under REACH.

    PubMed

    Rowbotham, Anna L; Gibson, Rosemary M

    2011-08-01

    The REACH Regulation 1907/2006/EC aims to improve knowledge of the potential risks to humans and the environment of the large number of chemicals produced and used in the EU. The testing requirements are likely to trigger numerous toxicological studies, potentially involving millions of experimental animals, despite the professed goal of REACH to reduce vertebrate testing. It may therefore be necessary to shift emphasis away from animal studies towards more pragmatic strategies, reserving animal tests for the substances of greatest concern. One approach is to waive certain tests based on levels of exposure to the substance. This review explores the application of 'Exposure-Based Waiving' (EBW) of toxicity studies, with a particular focus on inhalation where possible, considering the potential qualitative and quantitative supporting arguments that might be made, including the use of thresholds of toxicological concern. Incorporating EBW into intelligent testing strategies for substance registration could advance the goals of REACH and the 3Rs (reduction, replacement and refinement of animals in research) by reducing the use of animals in toxicity tests, whilst maintaining appropriate protection of human health and the environment. However, greater regulatory evaluation, acceptance and guidance are required for EBW to achieve its full impact.

  17. Risk analysis of Finnish peacekeeping in Kosovo.

    PubMed

    Lehtomäki, Kyösti; Pääkkönen, Rauno J; Rantanen, Jorma

    2005-04-01

    The research team interviewed over 90 Finnish battalion members in Kosovo, visited 22 units or posts, registered its observations, and made any necessary measurements. Key persons were asked to list the most important risks for occupational safety and health in their area of responsibility. Altogether, 106 accidents and 40 cases of disease resulted in compensation claims in 2000. The risks to the peacekeeping force were about twice those of the permanent staff of military trainees in Finland. Altogether, 21 accidents or cases of disease resulted in sick leave for at least 3 months after service. One permanent injury resulted from an explosion. Biological, chemical, and physical factors caused 8 to 9 occupational illnesses each. Traffic accidents, operational factors, and munitions and mines were evaluated to be the three most important risk factors, followed by occupational hygiene, living conditions (mold, fungi, dust), and general hygiene. Possible fatal risks, such as traffic accidents and munitions and explosives, received a high ranking in both the subjective and the objective evaluations. One permanent injury resulted from an explosion, and two traffic accidents involved a fatality, although not of a peacekeeper. The reduction of sports and military training accidents, risk-control programs, and, for some tasks, better personal protection is considered a development challenge for the near future. PMID:15876212

  18. Risk analysis and its link with standards of the World Organisation for Animal Health.

    PubMed

    Sugiura, K; Murray, N

    2011-04-01

    Among the agreements included in the treaty that created the World Trade Organization (WTO) in January 1995 is the Agreement on the Application of Sanitary and Phytosanitary Measures (SPS Agreement) that sets out the basic rules for food safety and animal and plant health standards. The SPS Agreement designates the World Organisation for Animal Health (OIE) as the organisation responsible for developing international standards for animal health and zoonoses. The SPS Agreement requires that the sanitary measures that WTO members apply should be based on science and encourages them to either apply measures based on the OIE standards or, if they choose to adopt a higher level of protection than that provided by these standards, apply measures based on a science-based risk assessment. The OIE also provides a procedural framework for risk analysis for its Member Countries to use. Despite the inevitable challenges that arise in carrying out a risk analysis of the international trade in animals and animal products, the OIE risk analysis framework provides a structured approach that facilitates the identification, assessment, management and communication of these risks.

  19. Oil shale health and environmental risk analysis

    SciTech Connect

    Gratt, L.B.

    1983-04-01

    The potential human health and environmental risks of a hypothetical one-million-barrels-per-day oil shale industry have been analyzed to serve as an aid in the formulation and management of a program of environmental research. The largest uncertainties for expected fatalities are in the public sector from air pollutants, although the occupational sector is estimated to have 60% more expected fatalities than the public sector. Occupational safety and illness have been analyzed for the oil shale fuel cycle from extraction to delivery of products for end use. Pneumoconiosis from the dust environment is the worker disease resulting in the greatest number of fatalities, followed by chronic bronchitis, internal cancer, and skin cancers, respectively. Research recommendations are presented for reducing the uncertainties in the risks analyzed and for filling data gaps to estimate other risks.

  20. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation on B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.

  1. Risk Analysis for Environmental Health Triage

    SciTech Connect

    Bogen, K T

    2005-11-18

    The Homeland Security Act mandates development of a national, risk-based system to support planning for, response to and recovery from emergency situations involving large-scale toxic exposures. To prepare for and manage consequences effectively, planners and responders need not only to identify zones of potentially elevated individual risk, but also to predict expected casualties. Emergency response support systems now define ''consequences'' by mapping areas in which toxic chemical concentrations do or may exceed Acute Exposure Guideline Levels (AEGLs) or similar guidelines. However, because AEGLs do not estimate expected risks, current unqualified claims that such maps support consequence management are misleading. Intentionally protective, AEGLs incorporate various safety/uncertainty factors depending on scope and quality of chemical-specific toxicity data. Some of these factors are irrelevant, and others need to be modified, whenever resource constraints or exposure-scenario complexities require responders to make critical trade-off (triage) decisions in order to minimize expected casualties. AEGL-exceedance zones cannot consistently be aggregated, compared, or used to calculate expected casualties, and so may seriously misguide emergency response triage decisions. Methods and tools well established and readily available to support environmental health protection are not yet developed for chemically related environmental health triage. Effective triage decisions involving chemical risks require a new assessment approach that focuses on best estimates of likely casualties, rather than on upper plausible bounds of individual risk. If risk-based consequence management is to become a reality, federal agencies tasked with supporting emergency response must actively coordinate to foster new methods that can support effective environmental health triage.
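    The distinction drawn here, between counting people inside a guideline-exceedance zone and estimating expected casualties, can be illustrated with a toy calculation. The log-probit dose-response parameters, the guideline level, and the zone populations below are all hypothetical:

```python
import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fatality_probability(conc_ppm, lc50_ppm=2000.0, probit_slope=2.0):
    """Toy log-probit dose-response; both parameters are hypothetical."""
    if conc_ppm <= 0:
        return 0.0
    return normal_cdf(probit_slope * math.log10(conc_ppm / lc50_ppm))

# Hypothetical mapped zones: (concentration in ppm, population exposed)
zones = [(50.0, 10_000), (150.0, 3_000), (600.0, 500)]
GUIDELINE_PPM = 100.0  # AEGL-like threshold, illustrative value

# Threshold view: everyone in an exceedance zone counts the same.
people_in_exceedance = sum(pop for c, pop in zones if c > GUIDELINE_PPM)

# Best-estimate view: expected casualties sum individual probabilities.
expected_casualties = sum(pop * fatality_probability(c) for c, pop in zones)

print(people_in_exceedance)        # 3500
print(round(expected_casualties))  # a much smaller best estimate
```

    The exceedance head count ignores both the steepness of the dose-response and the many people below the line, which is why the two numbers cannot be aggregated or compared for triage.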

  2. Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults

    ERIC Educational Resources Information Center

    Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.

    2007-01-01

    Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…

  3. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    SciTech Connect

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.; Helms, Jovana; Imbro, Dennis Raymond; Sumner, Matthew C.

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  4. Integrated Hybrid System Architecture for Risk Analysis

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.; Fonseca, Daniel J.; Ray, Paul S.

    2010-01-01

    A conceptual design of an expert-system computer program, and the development of a prototype of the program, have been announced; the program is intended for use as a project-management tool. It integrates schedule and risk data for the purpose of determining the schedule implications of safety risks and, somewhat conversely, the effects of schedule changes on safety. The design has been delivered to a NASA client, and it is planned to disclose the design in a conference presentation.

  5. A review of the technology and process on integrated circuits failure analysis applied in communications products

    NASA Astrophysics Data System (ADS)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    The failure analysis of integrated circuits plays a very important role in improving the reliability of communications products. This paper introduces the failure analysis technologies and process for integrated circuits used in communications products. Many techniques are available for failure analysis, including optical microscopy, infrared microscopy, scanning acoustic microscopy, liquid-crystal hot-spot detection, microanalysis, electrical measurement, microprobing, chemical etching, and ion etching. Integrated-circuit failure analysis depends on accurately confirming and analyzing the chip failure mode, searching for the root cause of the failure, summarizing the failure mechanism, and implementing improvement measures. Through failure analysis, the reliability of integrated circuits and the yield of good products can be improved.

  6. Analysis of driver casualty risk for different work zone types.

    PubMed

    Weng, Jinxian; Meng, Qiang

    2011-09-01

    Using driver casualty data from the Fatality Analysis Report System, this study examines driver casualty risk and investigates the risk contributing factors in the construction, maintenance and utility work zones. The multiple t-tests results show that the driver casualty risk is statistically different depending on the work zone type. Moreover, construction work zones have the largest driver casualty risk, followed by maintenance and utility work zones. Three separate logistic regression models are developed to predict driver casualty risk for the three work zone types because of their unique features. Finally, the effects of risk factors on driver casualty risk for each work zone type are examined and compared. For all three work zone types, five significant risk factors including road alignment, truck involvement, most harmful event, vehicle age and notification time are associated with increased driver casualty risk while traffic control devices and restraint use are associated with reduced driver casualty risk. However, one finding is that three risk factors (light condition, gender and day of week) exhibit opposing effects on the driver casualty risk in different types of work zones. This may largely be due to different work zone features and driver behavior in different types of work zones.
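    How a fitted logistic model of this kind converts risk-factor indicators into a casualty probability can be sketched as follows. The coefficients are hypothetical illustrations of the signs reported (restraint use and traffic control devices protective, the others risk-increasing), not the study's fitted values:

```python
import math

# Hypothetical coefficients chosen only to illustrate the reported signs;
# they are NOT the fitted values from the study.
COEF = {
    "intercept": -2.0,
    "curved_alignment": 0.6,
    "truck_involved": 0.8,
    "older_vehicle": 0.3,
    "restraint_used": -1.2,
    "traffic_control_present": -0.5,
}

def casualty_probability(features):
    """Logistic model: p = 1 / (1 + exp(-(b0 + sum_i b_i * x_i)))."""
    z = COEF["intercept"] + sum(COEF[k] * x for k, x in features.items())
    return 1.0 / (1.0 + math.exp(-z))

risky = casualty_probability({"curved_alignment": 1, "truck_involved": 1})
protected = casualty_probability({"curved_alignment": 1, "truck_involved": 1,
                                  "restraint_used": 1})
print(risky > protected)  # True: restraint use lowers the predicted risk
```

    Fitting one such model per work zone type, as the study does, lets the same factor carry a different coefficient, and even a different sign, in each zone.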

  7. The value of integrating information from multiple hazards for flood risk analysis and management

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, J. T.; Escuder-Bueno, I.; Altarejos-García, L.; Serrano-Lombillo, A.

    2014-02-01

    This article presents a methodology for estimating flood risk in urban areas integrating pluvial flooding, river flooding and failure of both small and large dams. The first part includes a review of basic concepts on flood risk analysis, evaluation and management. Flood risk analyses may be developed at local, regional and national level, however a general methodology to perform a quantitative flood risk analysis including different flood hazards is still required. The second part describes the proposed methodology, which presents an integrated approach - combining pluvial, river flooding and flooding from dam failure, as applied to a case study: an urban area located downstream of a dam under construction. The methodology enhances the approach developed within the SUFRI project ("Sustainable Strategies of Urban Flood Risk Management to cope with the residual risk", 2009-2011). This article also shows how outcomes from flood risk analysis provide better and more complete information to inform authorities, local entities and the stakeholders involved in decision-making with regard to flood risk management.

  8. HVAC fault tree analysis for WIPP integrated risk assessment

    SciTech Connect

    Kirby, P.; Iacovino, J.

    1990-01-01

    In order to evaluate the public health risk from operation of the Waste Isolation Pilot Plant (WIPP) due to potential radioactive releases, a probabilistic risk assessment of waste handling operations was conducted. One major aspect of this risk assessment involved fault tree analysis of the plant heating, ventilation, and air conditioning (HVAC) systems, which comprise the final barrier between waste handling operations and the environment. 1 refs., 1 tab.

  9. Approaches for derivation of environmental quality criteria for substances applied in risk assessment of discharges from offshore drilling operations.

    PubMed

    Altin, Dag; Frost, Tone Karin; Nilssen, Ingunn

    2008-04-01

    In order to achieve the offshore petroleum industry's "zero harm" goal for the environment, the environmental impact factor for drilling discharges was developed as a tool to identify and quantify the environmental risks associated with disposal of drilling discharges to the marine environment. As an initial step in this work, the main categories of substances associated with drilling discharges and assumed to contribute to toxic or nontoxic stress were identified and evaluated for inclusion in the risk assessment. The selection was based on the known toxicological properties of the substances, or on the total amount discharged together with their potential for accumulation in the water column or sediments to levels that could be expected to cause toxic or nontoxic stress to the biota. Based on these criteria, three categories of chemicals were identified for risk assessment in the water column and sediments: natural organic substances, metals, and drilling fluid chemicals. Several approaches for deriving environmentally safe threshold concentrations, as predicted no effect concentrations, were evaluated in the process. For the water column, consensus was reached on using the species sensitivity distribution approach for metals and the assessment factor approach for natural organic substances and added drilling chemicals. For the sediments, the equilibrium partitioning approach was selected for all three categories of chemicals. The theoretically derived sediment quality criteria were compared to field-derived threshold effect values based on statistical approaches applied to sediment monitoring data from the Norwegian Continental Shelf. The basis for derivation of predicted no effect concentration values for drilling discharges should be consistent with the principles of environmental risk assessment as described in the Technical Guidance Document on Risk Assessment issued by the European Union.
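    The two water-column derivation routes mentioned can be sketched as follows. The toxicity values are invented, the factor of 1000 is a conventional assessment factor for acute-only datasets, and the log-normal SSD fit is an illustrative assumption rather than the study's procedure:

```python
import math
import statistics

# Toy acute EC50 values (mg/L) for six species; illustrative only.
ec50s = [1.2, 3.5, 0.8, 10.0, 2.2, 5.6]

# Assessment-factor route: lowest acute EC50 divided by a large factor.
pnec_af = min(ec50s) / 1000.0

# Species-sensitivity-distribution route: fit a log-normal distribution
# to the toxicity data and take its 5th percentile (HC5).
logs = [math.log10(v) for v in ec50s]
mu, sigma = statistics.mean(logs), statistics.stdev(logs)
Z05 = -1.6449  # 5th percentile of the standard normal
hc5 = 10 ** (mu + Z05 * sigma)

print(pnec_af < hc5)  # True: the assessment factor is more conservative here
```

    The SSD route uses the whole dataset rather than only the most sensitive species, which is why it is generally preferred when enough species are available.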

  11. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1986-01-01

    The risks, values, and costs of the SETI project are evaluated and compared with those of the Viking project. Examination of the scientific values, side benefits, and costs of the two projects reveals that both projects provide equal benefits at equal costs. The probability of scientific and technical success is analyzed.

  12. Dynamic taxonomies applied to a web-based relational database for geo-hydrological risk mitigation

    NASA Astrophysics Data System (ADS)

    Sacco, G. M.; Nigrelli, G.; Bosio, A.; Chiarle, M.; Luino, F.

    2012-02-01

    In its 40 years of activity, the Research Institute for Geo-hydrological Protection of the Italian National Research Council has amassed a vast and varied collection of historical documentation on landslides, muddy-debris flows, and floods in northern Italy from 1600 to the present. Since 2008, the archive resources have been maintained through a relational database management system. The database is used for routine study and research purposes as well as for providing support during geo-hydrological emergencies, when data need to be quickly and accurately retrieved. Retrieval speed and accuracy are the main objectives of an implementation based on a dynamic taxonomies model. Dynamic taxonomies are a general knowledge management model for configuring complex, heterogeneous information bases that support exploratory searching. At each stage of the process, the user can explore or browse the database in a guided yet unconstrained way by selecting the alternatives suggested for further refining the search. Dynamic taxonomies have been successfully applied to such diverse and apparently unrelated domains as e-commerce and medical diagnosis. Here, we describe the application of dynamic taxonomies to our database and compare it to traditional relational database query methods. The dynamic taxonomy interface, essentially a point-and-click interface, is considerably faster and less error-prone than traditional form-based query interfaces that require the user to remember and type in the "right" search keywords. Finally, dynamic taxonomy users have confirmed that one of the principal benefits of this approach is the confidence of having considered all the relevant information. Dynamic taxonomies and relational databases work in synergy to provide fast and precise searching: one of the most important factors in timely response to emergencies.
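    The guided-narrowing idea behind dynamic taxonomies can be sketched in a few lines. The event records and facet names below are invented for illustration:

```python
# Invented event records with three facets, purely for illustration.
records = [
    {"type": "flood", "region": "Piedmont", "century": "20th"},
    {"type": "landslide", "region": "Piedmont", "century": "19th"},
    {"type": "flood", "region": "Liguria", "century": "20th"},
]

def facet_counts(items, facet):
    """Counts shown next to each facet value in the interface."""
    counts = {}
    for r in items:
        counts[r[facet]] = counts.get(r[facet], 0) + 1
    return counts

# Step 1: the user clicks "flood"; the taxonomy re-computes and offers
# only facet values that still match, so a dead-end query is impossible.
step1 = [r for r in records if r["type"] == "flood"]
print(facet_counts(step1, "region"))  # {'Piedmont': 1, 'Liguria': 1}
```

    This is why the approach is less error-prone than a form-based query: the user never has to guess a keyword that returns zero results.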

  13. Import risk analysis: the experience of Italy.

    PubMed

    Caporale, V; Giovannini, A; Calistri, P; Conte, A

    1999-12-01

    The authors propose a contribution to the possible revision of Chapters 1.4.1. and 1.4.2. of the International Animal Health Code (Code) of the Office International des Epizooties (OIE). In particular, data are presented to illustrate some of the inadequacies of both the rationale and the results of the method for risk assessment reported in the Code. The method suggested by the Code for risk assessment is based on the calculation of the 'probability of the occurrence of at least one outbreak' of a given disease following the importation of a given quantity of either live animals or animal products (unrestricted risk estimate). This is usually undertaken when dealing with rare events. For a country such as Italy, this method may not be particularly useful, as the frequency of disease outbreaks is what should be estimated, so as to provide decision makers with appropriate and relevant information. Practical use of risk information generated by the use of the OIE risk assessment method for swine vesicular disease (SVD) would have encouraged the Chief Veterinary Officer of Italy to prohibit all imports of swine from the Netherlands and Belgium for at least two years in the early 1990s, with consequent heavy economic losses for both Italy and the exporting countries. On the contrary, the number of actual outbreaks of the disease due to direct imports of swine from Member States of the European Union (EU), which occurred in Italy in 1992, 1993 and 1994 was very low (two to five outbreaks due to direct imports of swine from the Netherlands and one to two from Belgium). An example of a method for assessing the risks associated with high volumes of trade in commodities is also described. This method is based on the Monte Carlo simulation and provides the information required to evaluate the costs of the strategies compared.
The method can be used to predict the number of outbreaks which are likely to occur following importation and enables a comparison to be made of
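    The two risk measures contrasted in this record can be sketched as follows. The per-animal prevalence and import volume are hypothetical, and the short Monte Carlo loop stands in for the more detailed simulation the authors describe:

```python
import random

def prob_at_least_one_outbreak(p_per_animal, n_animals):
    """Unrestricted risk estimate in the style of the Code: probability of
    at least one outbreak when importing n animals, each carrying an
    independent per-animal risk p."""
    return 1.0 - (1.0 - p_per_animal) ** n_animals

def expected_outbreaks_mc(p_per_animal, n_animals, n_trials=2000, seed=42):
    """Monte Carlo stand-in: estimate the expected *number* of outbreaks
    per import volume, the quantity the authors argue decision makers
    actually need."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_trials):
        total += sum(rng.random() < p_per_animal for _ in range(n_animals))
    return total / n_trials

# Hypothetical prevalence and import volume, for illustration only:
p, n = 1e-3, 500
print(prob_at_least_one_outbreak(p, n))  # ≈ 0.39
print(expected_outbreaks_mc(p, n))       # ≈ n * p = 0.5
```

    The contrast is the record's point: a 39% chance of "at least one outbreak" sounds alarming, while an expected half an outbreak per import volume supports a very different trade decision.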

  14. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years because of the increasingly severe disasters that have occurred in the process industries. Owing to the computational complexity involved, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions or concentration fluctuations; this is quite different from the real situation in a chemical process plant. All these models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. A more rigorous model, such as a computational fluid dynamics (CFD) model, can resolve the previous limitations, but it cannot cope with the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method is proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual-risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but will also be extended to aerial, submarine, or space risk analyses in the near future.
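    The post-processing step can be sketched on a toy grid: sum, over accident scenarios, frequency times the conditional fatality probability at each point, then threshold at an iso-level. In the paper the probabilities come from CFD consequence simulations; here they are simple invented functions of position:

```python
import itertools

# Toy scenario set: each scenario contributes frequency (per year) times
# an invented conditional-fatality field; real fields would come from CFD.
NX = NY = NZ = 4
scenarios = [
    {"freq_per_yr": 1e-4,
     "p_fatal": lambda x, y, z: 0.9 if x + y + z <= 3 else 0.0},
    {"freq_per_yr": 1e-3,
     "p_fatal": lambda x, y, z: 0.2 if z == 0 else 0.0},
]

# Individual risk at each grid point = sum over scenarios.
risk = {
    (x, y, z): sum(s["freq_per_yr"] * s["p_fatal"](x, y, z) for s in scenarios)
    for x, y, z in itertools.product(range(NX), range(NY), range(NZ))
}

# The 3D iso-surface at a given level bounds the set of points at or
# above that level; here we just collect those points.
iso_level = 1e-4
inside = [pt for pt, r in risk.items() if r >= iso_level]
print(len(inside) > 0)  # True
```

    Rendering the boundary of that point set as a surface gives the 3D individual-risk iso-surfaces the method produces.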

  15. Economic Risk Analysis: Using Analytical and Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    O'Donnell, Brendan R.; Hickner, Michael A.; Barna, Bruce A.

    2002-01-01

    Describes the development and instructional use of a Microsoft Excel spreadsheet template that facilitates analytical and Monte Carlo risk analysis of investment decisions. Discusses a variety of risk assessment methods followed by applications of the analytical and Monte Carlo methods. Uses a case study to illustrate use of the spreadsheet tool…
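    What such a spreadsheet template computes can be sketched outside Excel as well. The cash-flow figures and the triangular revenue distribution below are illustrative assumptions, not values from the article:

```python
import random
import statistics

def npv(rate, cashflows):
    """Net present value: cashflows[0] is the time-0 investment,
    later entries are end-of-year flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def monte_carlo_npv(n_trials=20_000, seed=7):
    """Sample the NPV distribution with uncertain yearly revenue drawn
    from a triangular(low, high, mode) distribution."""
    rng = random.Random(seed)
    sims = []
    for _ in range(n_trials):
        flows = [-1000.0]  # initial investment
        flows += [rng.triangular(150.0, 450.0, 300.0) for _ in range(5)]
        sims.append(npv(0.10, flows))
    return sims

sims = monte_carlo_npv()
point_estimate = npv(0.10, [-1000.0] + [300.0] * 5)  # analytical, mean flows
print(round(point_estimate, 1))             # 137.2
print(round(statistics.mean(sims), 1))      # close to the point estimate
print(sum(x < 0 for x in sims) / len(sims)) # downside risk: P(NPV < 0)
```

    The analytical route gives a single NPV; the Monte Carlo route adds the probability that the investment loses money, which is the risk measure the analytical point estimate hides.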

  16. Student Choices: Using a Competing Risks Model of Survival Analysis.

    ERIC Educational Resources Information Center

    Denson, Katy; Schumacker, Randall E.

    By using a competing risks model, survival analysis methods can be extended to predict which of several mutually exclusive outcomes students will choose based on predictor variables, thereby ascertaining if the profile of risk differs across groups. The paper begins with a brief introduction to logistic regression and some of the basic concepts of…
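    A minimal simulation illustrates the competing-risks idea the paper builds on. The per-term hazards and outcome labels are invented for illustration:

```python
import random

def simulate_cohort(n=20_000, h_grad=0.05, h_drop=0.03, max_terms=12, seed=1):
    """Each term a student either graduates, drops out, or continues;
    whichever competing event fires first determines the outcome, and
    students with no event by max_terms are censored."""
    rng = random.Random(seed)
    counts = {"graduate": 0, "dropout": 0, "censored": 0}
    for _ in range(n):
        for _ in range(max_terms):
            u = rng.random()
            if u < h_grad:
                counts["graduate"] += 1
                break
            elif u < h_grad + h_drop:
                counts["dropout"] += 1
                break
        else:
            counts["censored"] += 1
    return counts

c = simulate_cohort()
# Each cause's cumulative incidence reflects its hazard relative to the
# total hazard among students who experience any event.
print(c["graduate"] > c["dropout"])  # True
```

    A competing-risks model fit to such data estimates each cause-specific hazard from covariates, which is what lets it predict which of the mutually exclusive outcomes a student is likely to choose.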

  17. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    NASA Technical Reports Server (NTRS)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  18. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    ERIC Educational Resources Information Center

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  19. Applied Behavior Analysis and the Imprisoned Adult Felon Project 1: The Cellblock Token Economy.

    ERIC Educational Resources Information Center

    Milan, Michael A.; And Others

    This report provides a technical-level analysis, discussion, and summary of five experiments in applied behavior analysis. Experiment 1 examined the token economy as a basis for motivating inmate behavior; Experiment 2 examined the relationship between magnitude of token reinforcement and level of inmate performance; Experiment 3 introduced a…

  20. Sociosexuality Education for Persons with Autism Spectrum Disorders Using Principles of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Wolfe, Pamela S.; Condo, Bethany; Hardaway, Emily

    2009-01-01

    Applied behavior analysis (ABA) has emerged as one of the most effective empirically based strategies for instructing individuals with autism spectrum disorders (ASD). Four ABA-based strategies that have been found effective are video modeling, visual strategies, social script fading, and task analysis. Individuals with ASD often struggle with…

  1. Causal Modeling--Path Analysis a New Trend in Research in Applied Linguistics

    ERIC Educational Resources Information Center

    Rastegar, Mina

    2006-01-01

    This article aims at discussing a new statistical trend in research in applied linguistics. This rather new statistical procedure is causal modeling--path analysis. The article demonstrates that causal modeling--path analysis is the best statistical option to use when the effects of a multitude of L2 learners' variables on language achievement are…

  2. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  3. Leakage risk assessment of the In Salah CO2 storage project: Applying the Certification Framework in a dynamic context.

    SciTech Connect

    Oldenburg, C.M.; Jordan, P.D.; Nicot, J.-P.; Mazzoldi, A.; Gupta, A.K.; Bryant, S.L.

    2010-08-01

    The Certification Framework (CF) is a simple risk assessment approach for evaluating CO2 and brine leakage risk at geologic carbon sequestration (GCS) sites. In the In Salah CO2 storage project assessed here, five wells at Krechba produce natural gas from the Carboniferous C10.2 reservoir with 1.7-2% CO2 that is delivered to the Krechba gas processing plant, which also receives high-CO2 natural gas (~10% by mole fraction) from additional deeper gas reservoirs and fields to the south. The gas processing plant strips CO2 from the natural gas, and the stripped CO2 is then injected through three long horizontal wells into the water leg of the Carboniferous gas reservoir at a depth of approximately 1,800 m. This injection process has operated successfully since 2004. The stored CO2 has been monitored over the last five years by a Joint Industry Project (JIP), a collaboration of BP, Sonatrach, and Statoil with co-funding from the US DOE and EU DG Research. Over the years the JIP has carried out extensive analyses of the Krechba system, including two risk assessment efforts: one before injection started, and one carried out by URS Corporation in September 2008. The long history of injection at Krechba, and the accompanying characterization, modeling, and performance data, provide a unique opportunity to test and evaluate risk assessment approaches. We apply the CF to the In Salah CO2 storage project at two different stages in the state of knowledge of the project: (1) at the pre-injection stage, using data available just prior to injection around mid-2004; and (2) after four years of injection (September 2008), to be comparable to the other risk assessments. The main risk drivers for the project are CO2 leakage into potable groundwater and into the natural gas cap. Both well leakage and fault/fracture leakage are likely under some conditions, but overall the risk is low due to ongoing mitigation and monitoring activities. Results of

  4. American Airlines Propeller STOL Transport Economic Risk Analysis

    NASA Technical Reports Server (NTRS)

    Ransone, B.

    1972-01-01

    A Monte Carlo risk analysis on the economics of STOL transports in air passenger traffic established the probability of making the expected internal rate of financial return, or better, in a hypothetical regular Washington/New York intercity operation.
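    As an illustrative sketch of the kind of Monte Carlo calculation described above (not the study's actual model): for a conventional investment, the probability of achieving at least a target internal rate of return equals the probability that net present value at the target rate is non-negative. All cash-flow figures below are invented.

    ```python
    import random

    def npv(rate, cashflows):
        """Net present value of a cash-flow sequence (index = year, t = 0, 1, ...)."""
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

    def prob_return_met(target_rate, n_trials=20000, seed=42):
        """Monte Carlo estimate of P(IRR >= target) for a conventional investment
        (one initial outlay, then uncertain annual inflows). For that cash-flow
        shape, IRR >= target exactly when NPV at the target rate is >= 0.
        All dollar figures are invented for illustration."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n_trials):
            flows = [-100.0]                                     # initial outlay
            flows += [rng.gauss(18.0, 4.0) for _ in range(10)]   # 10 years of uncertain net revenue
            if npv(target_rate, flows) >= 0.0:
                hits += 1
        return hits / n_trials

    p = prob_return_met(0.10)   # chance of earning at least a 10% internal rate of return
    ```

    The estimate converges at roughly 1/sqrt(n_trials), so a few tens of thousands of trials suffice for a two-decimal probability.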

  5. DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)

    SciTech Connect

    Young, K. R.; Augustine, C.; Anderson, A.

    2010-02-01

    This presentation summarizes the methodology and results for a probabilistic risk analysis of research, development, and demonstration work-primarily for enhanced geothermal systems (EGS)-sponsored by the U.S. Department of Energy Geothermal Technologies Program.

  6. Applying under-sampling techniques and cost-sensitive learning methods on risk assessment of breast cancer.

    PubMed

    Hsu, Jia-Lien; Hung, Ping-Cheng; Lin, Hung-Yen; Hsieh, Chung-Ho

    2015-04-01

    Breast cancer is one of the most common causes of cancer mortality. Early detection through mammography screening could significantly reduce mortality from breast cancer. However, most screening methods consume a large amount of resources. We propose a computational model, based solely on personal health information, for breast cancer risk assessment. Our model can serve as a pre-screening program in low-cost settings. In our study, the data set, consisting of 3976 records, was collected from Taipei City Hospital between 2008.1.1 and 2008.12.31. Based on the dataset, we first apply sampling techniques and a dimension reduction method to preprocess the testing data. Then, we construct various kinds of classifiers (including basic classifiers, ensemble methods, and cost-sensitive methods) to predict the risk. The cost-sensitive method with a random forest classifier achieves a recall (sensitivity) of 100%. At 100% recall, the precision (positive predictive value, PPV) and specificity of the cost-sensitive method with the random forest classifier were 2.9% and 14.87%, respectively. In our study, we build a breast cancer risk assessment model using data mining techniques. Our model has the potential to serve as an assisting tool in breast cancer screening.
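    The under-sampling step can be illustrated with a minimal, dependency-free sketch that balances classes before training a classifier; this shows the general technique, not the paper's exact preprocessing pipeline, and the class counts below are an invented split of a 3976-record set.

    ```python
    import random
    from collections import Counter

    def undersample(records, labels, seed=0):
        """Random under-sampling: shrink every class to the size of the smallest
        class so a downstream classifier is trained on balanced data.
        (Illustrative of the general technique, not the paper's exact pipeline.)"""
        rng = random.Random(seed)
        by_class = {}
        for rec, lab in zip(records, labels):
            by_class.setdefault(lab, []).append(rec)
        n_min = min(len(recs) for recs in by_class.values())
        out_recs, out_labs = [], []
        for lab, recs in by_class.items():
            for rec in rng.sample(recs, n_min):   # keep a random subset of each class
                out_recs.append(rec)
                out_labs.append(lab)
        return out_recs, out_labs

    # e.g. an invented 76 positives vs. 3900 negatives -> 76 of each after under-sampling
    X = list(range(3976))
    y = [1] * 76 + [0] * 3900
    Xb, yb = undersample(X, y)
    print(Counter(yb))   # both classes reduced to 76 records
    ```

    Cost-sensitive learning attacks the same imbalance from the other side, by weighting misclassified positives more heavily in the loss instead of discarding negatives.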

  7. Probabilistic risk analysis toward cost-effective 3S (safety, safeguards, security) implementation

    NASA Astrophysics Data System (ADS)

    Suzuki, Mitsutoshi; Mochiji, Toshiro

    2014-09-01

    Probabilistic Risk Analysis (PRA) has been used in safety for several decades, and countries with advanced nuclear programs have already incorporated this methodology into their regulatory systems. However, PRA has not been developed for safeguards and security so far because of the inherent difficulty of modeling intentional and malicious acts. In this paper, probabilistic proliferation and risk analysis based on a random process is applied to a hypothetical reprocessing process and the physical protection system of a nuclear reactor, using the Markov model originally developed by the Proliferation Resistance and Physical Protection Working Group (PRPPWG) of the Generation IV International Forum (GIF). Through the challenge of quantifying security risk as a frequency in this model, an integrated risk notion across the 3S disciplines, aimed at cost-effective installation of countermeasures, is discussed in a heuristic manner.
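    The Markov approach mentioned above can be illustrated with a toy absorbing chain: transient states model stages of an adversary pathway, and absorbing states capture "detected" and "goal reached". All transition probabilities below are invented for the example.

    ```python
    def absorption_probability(P, start, absorbing, n_steps=1000):
        """Probability mass that ends in each absorbing state of a Markov chain
        with row-stochastic transition matrix P, found by propagating the state
        distribution for n_steps. Toy illustration of the Markov approach to
        pathway analysis; all transition probabilities here are invented."""
        n = len(P)
        dist = [0.0] * n
        dist[start] = 1.0
        for _ in range(n_steps):
            new = [0.0] * n
            for i, p in enumerate(dist):
                if p:
                    for j in range(n):
                        new[j] += p * P[i][j]
            dist = new
        return {s: dist[s] for s in absorbing}

    # states: 0 = attempt, 1 = intermediate stage, 2 = detected (absorbing), 3 = goal reached (absorbing)
    P = [
        [0.0, 0.6, 0.4, 0.0],   # attempt -> intermediate or detected
        [0.0, 0.0, 0.7, 0.3],   # intermediate -> detected or goal
        [0.0, 0.0, 1.0, 0.0],   # detected stays detected
        [0.0, 0.0, 0.0, 1.0],   # goal stays goal
    ]
    result = absorption_probability(P, start=0, absorbing=[2, 3])
    # detection probability = 0.4 + 0.6*0.7 = 0.82; success probability = 0.18
    ```

    Multiplying the success probability by an assumed attempt frequency is one way to express security risk as a frequency, as the abstract describes.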

  9. Environmental risk assessment in GMO analysis.

    PubMed

    Pirondini, Andrea; Marmiroli, Nelson

    2008-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture to express traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage among countries is related to their different positions concerning the labelling of GMO products: based either on the principle of substantial equivalence or on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs into the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislative requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm to human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity. PMID:19048472

  10. Environmental risk assessment in GMO analysis.

    PubMed

    Pirondini, Andrea; Marmiroli, Nelson

    2010-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture to express traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage among countries is related to their different positions concerning the labelling of GMO products: based either on the principle of substantial equivalence or on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs into the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislative requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm to human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity.

  13. Confirmation of standard error analysis techniques applied to EXAFS using simulations

    SciTech Connect

    Booth, Corwin H; Hu, Yung-Jin

    2009-12-14

    Systematic uncertainties, such as those in calculated backscattering amplitudes, crystal glitches, etc., not only limit the ultimate accuracy of the EXAFS technique, but also affect the covariance-matrix representation of real parameter errors in typical fitting routines. Despite major advances in EXAFS analysis and in understanding all potential uncertainties, these methods are not routinely applied by all EXAFS users. Consequently, reported parameter errors are not reliable in many EXAFS studies in the literature. This situation has made many EXAFS practitioners leery of conventional error analysis applied to EXAFS data. However, conventional error analysis, if properly applied, can teach us more about our data, and even about the power and limitations of the EXAFS technique. Here, we describe the proper application of conventional error analysis to r-space fitting of EXAFS data. Using simulations, we demonstrate the veracity of this analysis by, for instance, showing that the number of independent data points from Stern's rule is balanced by the degrees of freedom obtained from a χ2 statistical analysis. By applying such analysis to real data, we determine the quantitative effect of systematic errors. In short, this study is intended to remind the EXAFS community about the role of fundamental noise distributions in interpreting our final results.
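    The balance between independent data points and fit parameters mentioned above can be sketched numerically. Stern's rule gives N_idp = 2ΔkΔR/π + 2 for a fit over k-range Δk and R-range ΔR, and the degrees of freedom are N_idp minus the number of fitted parameters; the ranges and parameter count below are illustrative, not the paper's.

    ```python
    import math

    def independent_points(k_min, k_max, r_min, r_max):
        """Stern's rule for the number of independent data points in an EXAFS
        fit over k-range [k_min, k_max] (Angstrom^-1) and R-range
        [r_min, r_max] (Angstrom): N_idp = 2 * dk * dR / pi + 2."""
        return 2.0 * (k_max - k_min) * (r_max - r_min) / math.pi + 2.0

    def degrees_of_freedom(n_idp, n_params):
        """Degrees of freedom available to the chi-squared statistic of the fit."""
        return n_idp - n_params

    n_idp = independent_points(3.0, 12.0, 1.0, 3.0)   # illustrative k- and R-ranges
    nu = degrees_of_freedom(n_idp, 8)                 # e.g. 8 fitted parameters
    ```

    A fit whose parameter count approaches N_idp leaves too few degrees of freedom for a meaningful χ2 test, which is the bookkeeping the abstract alludes to.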

  14. Spherical harmonic decomposition applied to spatial-temporal analysis of human high-density electroencephalogram

    NASA Astrophysics Data System (ADS)

    Wingeier, B. M.; Nunez, P. L.; Silberstein, R. B.

    2001-11-01

    We demonstrate an application of spherical harmonic decomposition to the analysis of the human electroencephalogram (EEG). We implement two methods and discuss issues specific to the analysis of hemispherical, irregularly sampled data. Spatial sampling requirements and performance of the methods are quantified using simulated data. The analysis is applied to experimental EEG data, confirming earlier reports of an approximate frequency-wave-number relationship in some bands.

  15. Economic Risk Analysis of Experimental Cropping Systems Using the SMART Risk Tool

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Recently, a variant of stochastic dominance called stochastic efficiency with respect to a function (SERF) has been developed and applied. Unlike traditional stochastic dominance approaches, SERF uses the concept of certainty equivalents (CEs) to rank a set of risk-efficient alternatives instead of ...
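    A minimal sketch of the SERF idea described above, assuming a negative-exponential utility U(x) = -exp(-r·x): each alternative's certainty equivalent is computed across a sweep of risk-aversion coefficients r, and alternatives are ranked by CE rather than by stochastic dominance. The cropping-system returns below are invented.

    ```python
    import math

    def certainty_equivalent(outcomes, risk_aversion):
        """Certainty equivalent of equally likely monetary outcomes under the
        negative-exponential utility U(x) = -exp(-r*x). As r -> 0 this approaches
        the plain mean; larger r penalizes downside variability more heavily."""
        r = risk_aversion
        if abs(r) < 1e-12:
            return sum(outcomes) / len(outcomes)
        mean_util = sum(math.exp(-r * x) for x in outcomes) / len(outcomes)
        return -math.log(mean_util) / r

    # two hypothetical cropping systems: net returns ($/acre) across simulated years
    systems = {
        "stable":   [120, 130, 125, 135, 128],
        "volatile": [60, 210, 40, 230, 100],
    }
    for r in (0.0, 0.01, 0.05):   # sweep of absolute risk-aversion coefficients
        ranked = sorted(systems, key=lambda s: certainty_equivalent(systems[s], r),
                        reverse=True)
        print(r, ranked)
    ```

    A risk-neutral decision maker (r near 0) slightly prefers the volatile system here because its mean is higher, while any appreciable risk aversion flips the ranking toward the stable one, which is exactly the kind of crossover SERF is designed to expose.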

  16. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
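    A minimal example of the kind of inference such a guideline document covers is the conjugate Beta-Binomial update for a demand-failure probability; the Jeffreys prior and the data below are illustrative, not taken from the handbook.

    ```python
    def beta_binomial_update(alpha_prior, beta_prior, failures, trials):
        """Conjugate Bayesian update for a demand-failure probability: with a
        Beta(alpha, beta) prior and `failures` observed in `trials` demands, the
        posterior is Beta(alpha + failures, beta + trials - failures).
        A standard textbook example; the prior and data below are invented."""
        alpha_post = alpha_prior + failures
        beta_post = beta_prior + trials - failures
        posterior_mean = alpha_post / (alpha_post + beta_post)
        return alpha_post, beta_post, posterior_mean

    # Jeffreys prior Beta(0.5, 0.5); 2 failures observed in 100 demands
    a, b, p_mean = beta_binomial_update(0.5, 0.5, failures=2, trials=100)
    ```

    The posterior mean (alpha + x) / (alpha + beta + n) shows how the prior acts like pseudo-counts that the observed data gradually overwhelm.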

  17. The semantic distinction between "risk" and "danger": a linguistic analysis.

    PubMed

    Boholm, Max

    2012-02-01

    The analysis combines frame semantic and corpus linguistic approaches in analyzing the role of agency and decision making in the semantics of the words "risk" and "danger" (both nominal and verbal uses). In frame semantics, the meanings of "risk" and of related words, such as "danger," are analyzed against the background of a specific cognitive-semantic structure (a frame) comprising frame elements such as Protagonist, Bad Outcome, Decision, Possession, and Source. Empirical data derive from the British National Corpus (100 million words). Results indicate both similarities and differences in use. First, both "risk" and "danger" are commonly used to represent situations having potential negative consequences as the result of agency. Second, "risk" and "danger," especially their verbal uses (to risk, to endanger), differ in agent-victim structure, i.e., "risk" is used to express that a person affected by an action is also the agent of the action, while "endanger" is used to express that the one affected is not the agent. Third, "risk," but not "danger," tends to be used to represent rational and goal-directed action. The results therefore to some extent confirm the analysis of "risk" and "danger" suggested by German sociologist Niklas Luhmann. As a point of discussion, the present findings arguably have implications for risk communication.

  18. Risk Analysis for Unintentional Slide Deployment During Airline Operations.

    PubMed

    Ayra, Eduardo S; Insua, David Ríos; Castellanos, María Eugenia; Larbi, Lydia

    2015-09-01

    We present a risk analysis undertaken to mitigate problems relating to the unintended deployment of slides under normal operations within a commercial airline. This type of incident entails significant costs for the airline industry. After assessing the likelihood and severity of its consequences, we conclude that such risks need to be managed. We then evaluate the effectiveness of various countermeasures, describing and justifying the chosen ones. We also discuss several issues faced when implementing and communicating the proposed measures, thus fully illustrating the risk analysis process.

  19. Environmental risk analysis of hazardous material rail transportation.

    PubMed

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials.
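    The per-route arithmetic behind such a risk estimate can be sketched as exposure, times accident likelihood, times conditional release probability, times consequence. This is a deliberate simplification of the HMTECM-based analysis, with every number invented for illustration.

    ```python
    def annual_route_risk(car_miles, accident_rate_per_car_mile,
                          p_release_given_accident, expected_cleanup_cost):
        """Expected annual environmental risk cost (dollars/year) for one product
        on one route: traffic exposure x accident likelihood x conditional
        release probability x expected clean-up cost. A simplified stand-in for
        the HMTECM-based analysis; every number below is invented."""
        expected_releases = (car_miles * accident_rate_per_car_mile
                             * p_release_given_accident)
        return expected_releases * expected_cleanup_cost

    risk = annual_route_risk(car_miles=2_000_000,
                             accident_rate_per_car_mile=1e-6,
                             p_release_given_accident=0.05,
                             expected_cleanup_cost=1_500_000)
    risk_per_car_mile = risk / 2_000_000   # normalized for comparison across products
    ```

    Normalizing by car-miles or ton-miles, as the abstract describes, is what makes risk comparable across chemicals with very different shipment volumes.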

  20. Dynamic optimization of ISR sensors using a risk-based reward function applied to ground and space surveillance scenarios

    NASA Astrophysics Data System (ADS)

    DeSena, J. T.; Martin, S. R.; Clarke, J. C.; Dutrow, D. A.; Newman, A. J.

    2012-06-01

    As the number and diversity of sensing assets available for intelligence, surveillance and reconnaissance (ISR) operations continues to expand, the limited ability of human operators to effectively manage, control and exploit the ISR ensemble is exceeded, leading to reduced operational effectiveness. Automated support both in the processing of voluminous sensor data and sensor asset control can relieve the burden of human operators to support operation of larger ISR ensembles. In dynamic environments it is essential to react quickly to current information to avoid stale, sub-optimal plans. Our approach is to apply the principles of feedback control to ISR operations, "closing the loop" from the sensor collections through automated processing to ISR asset control. Previous work by the authors demonstrated non-myopic multiple platform trajectory control using a receding horizon controller in a closed feedback loop with a multiple hypothesis tracker applied to multi-target search and track simulation scenarios in the ground and space domains. This paper presents extensions in both size and scope of the previous work, demonstrating closed-loop control, involving both platform routing and sensor pointing, of a multi-sensor, multi-platform ISR ensemble tasked with providing situational awareness and performing search, track and classification of multiple moving ground targets in irregular warfare scenarios. The closed-loop ISR system is fully realized using distributed, asynchronous components that communicate over a network. The closed-loop ISR system has been exercised via a networked simulation test bed against a scenario in the Afghanistan theater implemented using high-fidelity terrain and imagery data. In addition, the system has been applied to space surveillance scenarios requiring tracking of space objects where current deliberative, manually intensive processes for managing sensor assets are insufficiently responsive. Simulation experiment results are presented.

  1. Selenium Exposure and Cancer Risk: an Updated Meta-analysis and Meta-regression

    PubMed Central

    Cai, Xianlei; Wang, Chen; Yu, Wanqi; Fan, Wenjie; Wang, Shan; Shen, Ning; Wu, Pengcheng; Li, Xiuyang; Wang, Fudi

    2016-01-01

    The objective of this study was to investigate the associations between selenium exposure and cancer risk. We identified 69 studies and applied meta-analysis, meta-regression and dose-response analysis to the available evidence. The results indicated that high selenium exposure had a protective effect on cancer risk (pooled OR = 0.78; 95%CI: 0.73–0.83). The results of linear and nonlinear dose-response analysis indicated that high serum/plasma selenium and toenail selenium were effective for cancer prevention. However, we did not find a protective effect of selenium supplementation. High selenium exposure may have different effects on specific types of cancer. It decreased the risk of breast cancer, lung cancer, esophageal cancer, gastric cancer, and prostate cancer, but it was not associated with colorectal cancer, bladder cancer, and skin cancer. PMID:26786590
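    A pooled odds ratio of the kind reported above is conventionally obtained by inverse-variance weighting of study log odds ratios. The sketch below shows the fixed-effect version with invented study data; the actual meta-analysis may have used a random-effects model, which additionally inflates each variance by a between-study component.

    ```python
    import math

    def pooled_odds_ratio(odds_ratios, ci_lowers, ci_uppers):
        """Fixed-effect inverse-variance pooling of study odds ratios.
        Each study's weight is 1/SE^2, with SE recovered from its 95% CI:
        SE = (ln(upper) - ln(lower)) / (2 * 1.96). Returns the pooled OR and
        its 95% CI. Illustrative sketch only; the study numbers below are
        invented, not those of the cited meta-analysis."""
        log_ors, weights = [], []
        for or_, lo, hi in zip(odds_ratios, ci_lowers, ci_uppers):
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
            log_ors.append(math.log(or_))
            weights.append(1.0 / se ** 2)
        pooled_log = sum(w * l for w, l in zip(weights, log_ors)) / sum(weights)
        pooled_se = math.sqrt(1.0 / sum(weights))
        return (math.exp(pooled_log),
                math.exp(pooled_log - 1.96 * pooled_se),
                math.exp(pooled_log + 1.96 * pooled_se))

    # three hypothetical studies of high vs. low selenium exposure
    or_hat, lo95, hi95 = pooled_odds_ratio([0.75, 0.85, 0.70],
                                           ci_lowers=[0.60, 0.65, 0.55],
                                           ci_uppers=[0.95, 1.10, 0.90])
    ```

    Working on the log scale keeps the pooling symmetric, since odds ratios multiply rather than add.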

  2. Ecological risk assessment of the antibiotic enrofloxacin applied to Pangasius catfish farms in the Mekong Delta, Vietnam.

    PubMed

    Andrieu, Margot; Rico, Andreu; Phu, Tran Minh; Huong, Do Thi Thanh; Phuong, Nguyen Thanh; Van den Brink, Paul J

    2015-01-01

    Antibiotics applied in aquaculture production may be released into the environment and contribute to the deterioration of surrounding aquatic ecosystems. In the present study, we assessed the ecological risks posed by the use of the antibiotic enrofloxacin (ENR), and its main metabolite ciprofloxacin (CIP), in a Pangasius catfish farm in the Mekong Delta region, Vietnam. Water and sediment samples were collected in a stream receiving effluents from a Pangasius catfish farm that had applied ENR. The toxicity of ENR and CIP was assessed on three tropical aquatic species: the green alga Chlorella sp. (72 h - growth inhibition test), the micro-invertebrate Moina macrocopa (48 h - immobilization test), and the Nile tilapia (Oreochromis niloticus). The toxic effects on O. niloticus were evaluated by measuring the cholinesterase (ChE) and catalase (CAT) activities in the fish brain and muscles, respectively, and by considering feed exposure and water exposure separately. Ecological risks were assessed by comparing maximum exposure concentrations with predicted no-effect concentrations for cyanobacteria, green algae, invertebrates and fish derived with available toxicity data. The results of this study showed that maximum antibiotic concentrations in Pangasius catfish farm effluents were 0.68 μg L(-1) for ENR and 0.25 μg L(-1) for CIP (dissolved water concentrations). Antibiotics accumulated in sediments downstream of the effluent discharge point at concentrations up to 2590 μg kg(-1) d.w. and 592 μg kg(-1) d.w. for ENR and CIP, respectively. The calculated EC50 values for ENR and CIP were 111000 and 23000 μg L(-1) for Chlorella sp., and 69000 and 71000 μg L(-1) for M. macrocopa, respectively. Significant effects on the ChE and CAT enzymatic activities of O. niloticus were observed at 5 g kg(-1) feed and 400-50000 μg L(-1) for both antibiotics. The results of the ecological risk assessment performed in this study indicated only minor risks for cyanobacteria

  3. Risk Assessment of Infrastructure System of Systems with Precursor Analysis.

    PubMed

    Guo, Zhenyu; Haimes, Yacov Y

    2016-08-01

    Physical infrastructure systems are commonly composed of interconnected and interdependent subsystems, which in their essence constitute system of systems (S-o-S). System owners and policy researchers need tools to foresee potential emergent forced changes and to understand their impact so that effective risk management strategies can be developed. We develop a systemic framework for precursor analysis to support the design of an effective and efficient precursor monitoring and decision support system with the ability to (i) identify and prioritize indicators of evolving risks of system failure; and (ii) evaluate uncertainties in precursor analysis to support informed and rational decision making. This integrated precursor analysis framework is comprised of three processes: precursor identification, prioritization, and evaluation. We use an example of a highway bridge S-o-S to demonstrate the theories and methodologies of the framework. Bridge maintenance processes involve many interconnected and interdependent functional subsystems and decision-making entities and bridge failure can have broad social and economic consequences. The precursor analysis framework, which constitutes an essential part of risk analysis, examines the impact of various bridge inspection and maintenance scenarios. It enables policy researchers and analysts who are seeking a risk perspective on bridge infrastructure in a policy setting to develop more risk informed policies and create guidelines to efficiently allocate limited risk management resources and mitigate severe consequences resulting from bridge failures. PMID:27575259

  5. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming

    2008-01-01

    This paper describes a systems engineering approach to resource planning by integrating mathematical modeling and constrained optimization, empirical simulation, and theoretical analysis techniques to generate an optimal task plan in the presence of uncertainties.

  6. Walking the line: Understanding pedestrian behaviour and risk at rail level crossings with cognitive work analysis.

    PubMed

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Stanton, Neville A

    2016-03-01

    Pedestrian fatalities at rail level crossings (RLXs) are a public safety concern for governments worldwide. There is little literature examining pedestrian behaviour at RLXs and no previous studies have adopted a formative approach to understanding behaviour in this context. In this article, cognitive work analysis is applied to understand the constraints that shape pedestrian behaviour at RLXs in Melbourne, Australia. The five phases of cognitive work analysis were developed using data gathered via document analysis, behavioural observation, walk-throughs and critical decision method interviews. The analysis demonstrates the complex nature of pedestrian decision making at RLXs and the findings are synthesised to provide a model illustrating the influences on pedestrian decision making in this context (i.e. time, effort and social pressures). Further, the CWA outputs are used to inform an analysis of the risks to safety associated with pedestrian behaviour at RLXs and the identification of potential interventions to reduce risk.

  7. Risk analysis of tyramine concentration in food production

    NASA Astrophysics Data System (ADS)

    Doudová, L.; Buňka, F.; Michálek, J.; Sedlačík, M.; Buňková, L.

    2013-10-01

    The contribution is focused on risk analysis in food microbiology. This paper evaluates the effect of selected factors on tyramine production in bacterial strains of the Lactococcus genus which were assigned as tyramine producers. Tyramine is a biogenic amine synthesized from an amino acid called tyrosine. It can be found in certain foodstuffs (often in cheese), and can cause a pseudo-response in sensitive individuals. The above-mentioned bacteria are commonly used in the biotechnological process of cheese production as starter cultures. The levels of factors were chosen with respect to the conditions which can occur in this technological process. To describe and compare tyramine production in the chosen microorganisms, generalized regression models were applied. Tyramine production was modelled by Gompertz curves according to the selected factors (lactose concentration of 0-1% w/v, NaCl 0-2% w/v and aero/anaerobiosis) for 3 different types of bacterial cultivation. Moreover, estimates of model parameters were calculated and tested; multiple comparisons were discussed as well. The aim of this paper is to find a combination of factors leading to a similar tyramine production level.
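
    The Gompertz modelling step can be sketched with a least-squares fit. The cultivation data below are invented for illustration, and the Zwietering parameterization is one common choice in food microbiology, not necessarily the authors':

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, mu, lam):
    """Zwietering parameterization of the Gompertz growth curve:
    A = asymptote, mu = maximum production rate, lam = lag time."""
    return A * np.exp(-np.exp(mu * np.e / A * (lam - t) + 1.0))

# Hypothetical cultivation data (hours vs. tyramine, mg/L) -- illustrative
# values only, not the paper's measurements
t = np.array([0, 6, 12, 18, 24, 36, 48, 72], dtype=float)
y = gompertz(t, 180.0, 8.0, 10.0) + np.array([0.5, -1.5, 2.0, -1.0, 1.5, -2.0, 1.0, -0.5])

popt, _ = curve_fit(gompertz, t, y, p0=[150.0, 5.0, 5.0], maxfev=10000)
A_hat, mu_hat, lam_hat = popt  # fitted asymptote, rate, and lag
```

Comparing fitted parameters across factor combinations is then what enables the multiple comparisons the abstract describes.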

  8. Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool

    SciTech Connect

    Unwin, Stephen D.; Seiple, Timothy E.

    2009-05-28

    This report describes the process and methodology associated with a rapid prototype tool for integrating project risk analysis and health & safety risk analysis for decontamination and decommissioning projects.

  9. Uncertainty propagation analysis applied to volcanic ash dispersal at Mt. Etna by using a Lagrangian model

    NASA Astrophysics Data System (ADS)

    de'Michieli Vitturi, Mattia; Pardini, Federica; Spanu, Antonio; Neri, Augusto; Vittoria Salvetti, Maria

    2015-04-01

    Volcanic ash clouds represent a major hazard for populations living near volcanic centers, producing a risk for humans and a potential threat to crops, ground infrastructures, and aviation traffic. Lagrangian particle dispersal models are commonly used for tracking ash particles emitted from volcanic plumes and transported under the action of atmospheric wind fields. In this work, we present the results of an uncertainty propagation analysis applied to volcanic ash dispersal from weak plumes with specific focus on the uncertainties related to the grain-size distribution of the mixture. To this aim, the Eulerian fully compressible mesoscale non-hydrostatic model WRF was used to generate the driving wind, representative of the atmospheric conditions occurring during the event of November 24, 2006 at Mt. Etna. Then, the Lagrangian particle model LPAC (de' Michieli Vitturi et al., JGR 2010) was used to simulate the transport of mass particles under the action of atmospheric conditions. The particle motion equations were derived by expressing the Lagrangian particle acceleration as the sum of the forces acting along its trajectory, with drag forces calculated as a function of particle diameter, density, shape and Reynolds number. The simulations were representative of weak plume events of Mt. Etna and aimed to quantify the effect on the dispersal process of the uncertainty in the particle sphericity and in the mean and variance of a log-normal distribution function describing the grain-size of ash particles released from the eruptive column. In order to analyze the sensitivity of particle dispersal to these uncertain parameters with a reasonable number of simulations, and therefore with affordable computational costs, response surfaces in the parameter space were built by using the generalized polynomial chaos technique. The uncertainty analysis allowed us to quantify the most probable values, and their pdfs, of the number of particles as well as of the mean and
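
    A one-dimensional sketch of the generalized polynomial chaos idea, assuming a single standardized uncertain input and a cheap stand-in for the expensive dispersal model (the model, degree, and sample sizes are illustrative):

```python
import numpy as np
# probabilists' Hermite basis, orthogonal under the standard normal weight
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(0)

def expensive_model(xi):
    # Stand-in for a costly Lagrangian dispersal run: a smooth response to
    # one standardized uncertain input (e.g. the mean of the grain-size pdf)
    return np.exp(0.3 * xi) + 0.1 * xi ** 2

# Build the response surface from a small number of "simulations"
xi_train = rng.standard_normal(50)
coeffs = He.hermefit(xi_train, expensive_model(xi_train), deg=3)

# The cheap surrogate then propagates the input uncertainty to the output pdf
xi_mc = rng.standard_normal(100_000)
surrogate = He.hermeval(xi_mc, coeffs)
mean_est = float(surrogate.mean())
```

The point of the technique is exactly this split: a handful of full model runs to fit the polynomial surface, then inexpensive Monte Carlo on the surrogate to obtain pdfs of the outputs.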

  10. A predictive Bayesian approach to risk analysis in health care

    PubMed Central

    Aven, Terje; Eidesen, Karianne

    2007-01-01

    Background The Bayesian approach is now widely recognised as a proper framework for analysing risk in health care. However, the traditional text-book Bayesian approach is in many cases difficult to implement, as it is based on abstract concepts and modelling. Methods The essential points of risk analyses conducted according to the predictive Bayesian approach are identification of observable quantities, prediction and uncertainty assessments of these quantities, using all the relevant information. The risk analysis summarizes the knowledge and lack of knowledge concerning critical operations and other activities, giving in this way a basis for making rational decisions. Results It is shown that Bayesian risk analysis can be significantly simplified and made more accessible compared to the traditional text-book Bayesian approach by focusing on predictions of observable quantities and performing uncertainty assessments of these quantities using subjective probabilities. Conclusion The predictive Bayesian approach provides a framework for ensuring quality of risk analysis. The approach acknowledges that risk cannot be adequately described and evaluated simply by reference to summarising probabilities. Risk is defined by the combination of possible consequences and associated uncertainties. PMID:17714597
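
    One concrete way to predict an observable quantity with subjective probabilities, in the spirit of the predictive approach, is a beta-binomial predictive distribution; the prior and counts below are invented for illustration:

```python
from math import comb, lgamma, exp

def log_beta(x, y):
    return lgamma(x) + lgamma(y) - lgamma(x + y)

def beta_binomial_pmf(k, n, a, b):
    """Predictive probability of k adverse events in n future operations,
    given a Beta(a, b) subjective uncertainty assessment of the event chance."""
    return comb(n, k) * exp(log_beta(k + a, n - k + b) - log_beta(a, b))

# Illustrative numbers (not from the paper): prior Beta(1, 9), then 2 events
# observed in 50 comparable operations -> updated assessment Beta(3, 57)
a, b = 1 + 2, 9 + 48
p_zero = beta_binomial_pmf(0, 10, a, b)  # P(no adverse events in the next 10 operations)
```

The prediction is stated directly about the observable count of events, with the Beta distribution expressing the analyst's uncertainty about the unknown chance.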

  11. Contract Negotiations Supported Through Risk Analysis

    NASA Astrophysics Data System (ADS)

    Rodrigues, Sérgio A.; Vaz, Marco A.; Souza, Jano M.

    Many clients view software as a commodity, so it is critical that IT sellers know how to build value into their offerings to differentiate their services from all the others. Clients sometimes refuse to contract software development due to a lack of technical understanding or simply because they are afraid of IT contractual commitments. The IT negotiators who recognize the importance of this issue and the reason why it is a problem will be able to work to reach the commercial terms they want. Therefore, this chapter aims to stimulate IT professionals to improve their negotiation skills and presents a computational tool that supports managers in getting the best out of software negotiations through the identification of contract risks.

  12. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures have become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems for public sector decision makers arise. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other hand: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System adapted to run with a tool developed to manage risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land-use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a

  13. Geographic Information Systems and Applied Spatial Statistics Are Efficient Tools to Study Hansen's Disease (Leprosy) and to Determine Areas of Greater Risk of Disease

    PubMed Central

    Queiroz, José Wilton; Dias, Gutemberg H.; Nobre, Maurício Lisboa; De Sousa Dias, Márcia C.; Araújo, Sérgio F.; Barbosa, James D.; da Trindade-Neto, Pedro Bezerra; Blackwell, Jenefer M.; Jeronimo, Selma M. B.

    2010-01-01

    Applied spatial statistics used in conjunction with geographic information systems (GIS) provide an efficient tool for the surveillance of diseases. Here, using these tools, we analyzed the spatial distribution of Hansen's disease in an endemic area in Brazil. A sample of 808 cases, selected from a universe of 1,293, was geocoded in Mossoró, Rio Grande do Norte, Brazil. Hansen's disease cases were not distributed randomly within the neighborhoods, with higher detection rates found in more populated districts. Cluster analysis identified two areas of high risk, one with a relative risk of 5.9 (P = 0.001) and the other 6.5 (P = 0.001). A significant relationship between the geographic distribution of disease and the socioeconomic variables indicative of poverty was observed. Our study shows that the combination of GIS and spatial analysis can identify clustering of transmissible disease, such as Hansen's disease, pointing to areas where intervention efforts can be targeted to control disease. PMID:20134009
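
    The cluster relative risk reported above is, at its core, a ratio of detection rates inside and outside a candidate cluster; a minimal sketch with hypothetical counts (not the study's data):

```python
def relative_risk(cases_in, pop_in, cases_total, pop_total):
    """Relative risk of a candidate cluster: detection rate inside the
    cluster divided by the rate in the rest of the study area."""
    rate_in = cases_in / pop_in
    rate_out = (cases_total - cases_in) / (pop_total - pop_in)
    return rate_in / rate_out

# Hypothetical district: 120 cases among 10,000 people inside the cluster,
# with 808 cases among 350,000 people in the study area overall
rr = relative_risk(120, 10_000, 808, 350_000)
```

Scan statistics (e.g. Kulldorff's spatial scan) evaluate many such candidate circles and attach significance to the most extreme ratios.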

  14. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    2009-01-18

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program (The Program). The analysis is a task by Princeton Energy Resources International, LLC, in support of the National Renewable Energy Laboratory on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE).
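
    How R&D uncertainty translates into the LCOE metric can be sketched with a simplified LCOE formula plus a Monte Carlo spread on one input; the plant figures and the capex spread are hypothetical:

```python
import numpy as np

def lcoe(capex, fcr, annual_om, annual_mwh):
    """Simplified levelized cost of energy ($/MWh): annualized capital
    (capex times a fixed charge rate) plus O&M, over annual generation."""
    return (capex * fcr + annual_om) / annual_mwh

# Hypothetical 50 MW geothermal plant at a 90% capacity factor
annual_mwh = 50 * 8760 * 0.90
base = lcoe(200e6, 0.10, 8e6, annual_mwh)  # deterministic point estimate

# Technical risk enters as uncertainty in inputs -- here a capex spread
# standing in for drilling-success uncertainty
rng = np.random.default_rng(1)
capex_draws = rng.normal(200e6, 25e6, 10_000)
p90 = float(np.percentile(lcoe(capex_draws, 0.10, 8e6, annual_mwh), 90))
```

A portfolio analysis like the one described would repeat this with distributions over every technically uncertain input and compare LCOE percentiles across R&D outcomes.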

  15. Theoretically Motivated Interventions for Reducing Sexual Risk Taking in Adolescence: A Randomized Controlled Experiment Applying Fuzzy-trace Theory

    PubMed Central

    Reyna, Valerie F.; Mills, Britain A.

    2014-01-01

    Fuzzy-trace theory is a theory of memory, judgment, and decision-making, and their development. We applied advances in this theory to increase the efficacy and durability of a multicomponent intervention to promote risk reduction and avoidance of premature pregnancy and STIs. 734 adolescents from high schools and youth programs in three states (Arizona, Texas, and New York) were randomly assigned to one of three curriculum groups: RTR (Reducing the Risk), RTR+ (a modified version of RTR using fuzzy-trace theory), and a control group. We report effects of curriculum on self-reported behaviors and behavioral intentions plus psychosocial mediators of those effects, namely, attitudes and norms, motives to have sex or get pregnant, self-efficacy and behavioral control, and gist/verbatim constructs. Among 26 outcomes, 19 showed an effect of at least one curriculum relative to the control group: RTR+ produced improvements for 17 outcomes and RTR produced improvements for 12 outcomes. For RTR+, two differences (for perceived parental norms and global benefit perception) were confined to age, gender, or racial/ethnic subgroups. Effects of RTR+ on sexual initiation emerged six months after the intervention, when many adolescents became sexually active. Effects of RTR+ were greater than RTR for nine outcomes, and remained significantly greater than controls at one-year follow-up for 12 outcomes. Consistent with fuzzy-trace theory, results suggest that, by emphasizing gist representations, which are preserved over long time periods and are key memories used in decision-making, the enhanced intervention produced larger and more sustained effects on behavioral outcomes and psychosocial mediators of adolescent risk-taking. PMID:24773191

  16. Germany wide seasonal flood risk analysis for agricultural crops

    NASA Astrophysics Data System (ADS)

    Klaus, Stefan; Kreibich, Heidi; Kuhlmann, Bernd; Merz, Bruno; Schröter, Kai

    2016-04-01

    In recent years, large-scale flood risk analysis and mapping have gained attention. Regional to national risk assessments are needed, for example, for national risk policy developments, for large-scale disaster management planning and in the (re-)insurance industry. Despite increasing requests for comprehensive risk assessments, some sectors have not received much scientific attention; one of these is the agricultural sector. In contrast to other sectors, agricultural crop losses depend strongly on the season. Flood probability also shows seasonal variation. Thus, the temporal superposition of high flood susceptibility of crops and high flood probability plays an important role for agricultural flood risk. To investigate this interrelation and provide a large-scale overview of agricultural flood risk in Germany, an agricultural crop loss model is used for crop susceptibility analyses, and Germany-wide seasonal flood-frequency analyses are undertaken to derive seasonal flood patterns. As a result, a Germany-wide map of agricultural flood risk is shown, as well as the crop type most at risk in a specific region. The risk maps may provide guidance for federal state-wide coordinated designation of retention areas.

  17. How does scientific risk assessment of GM crops fit within the wider risk analysis?

    PubMed

    Johnson, Katy L; Raybould, Alan F; Hudson, Malcolm D; Poppy, Guy M

    2007-01-01

    The debate concerning genetically modified crops illustrates confusion between the role of scientists and that of wider society in regulatory decision making. We identify two fundamental misunderstandings, which, if rectified, would allow progress with confidence. First, scientific risk assessment needs to test well-defined hypotheses, not simply collect data. Second, risk assessments need to be placed in the wider context of risk analysis to enable the wider 'non-scientific' questions to be considered in regulatory decision making. Such integration and understanding are urgently required because the challenges to regulation will escalate as scientific progress advances.

  18. Design and Analysis of a Thrust Vector Mechanism Applied in a Flying Wing

    NASA Astrophysics Data System (ADS)

    Zhu, Yanhe; Gao, Liang; Wang, Hongwei; Zhao, Jie

    This paper presents the design and analysis of a thrust vector mechanism applied in a flying wing. A thrust vector mechanism driven by two servos is developed. An analysis of the dynamic differences in minimum hovering radius between a conventional flying wing and one equipped with the thrust vector mechanism is given and validated with simulation. It is shown that thrust vectoring has clear advantages over the conventional flying wing, including a smaller hovering radius and a reduced roll angle. These benefits should improve maneuverability and agility.

  19. Geotechnical risk analysis by flat dilatometer (DMT)

    NASA Astrophysics Data System (ADS)

    Amoroso, Sara; Monaco, Paola

    2015-04-01

    In recent decades we have witnessed a massive migration from laboratory testing to in situ testing, to the point that, today, in situ testing often forms the major part of a geotechnical investigation. The state of the art indicates that direct-push in situ tests, such as the Cone Penetration Test (CPT) and the Flat Dilatometer Test (DMT), are fast and convenient in situ tests for routine site investigation. In most cases the DMT-estimated parameters, in particular the undrained shear strength su and the constrained modulus M, are used with the common design methods of geotechnical engineering for evaluating bearing capacity, settlements, etc. The paper focuses on the prediction of settlements of shallow foundations, probably the No. 1 application of the DMT, especially in sands, where undisturbed samples cannot be retrieved, and on the risk associated with their design. A compilation of documented case histories comparing DMT-predicted vs. observed settlements was collected by Monaco et al. (2006), indicating that, in general, the constrained modulus M can be considered a reasonable "operative modulus" (relevant to foundations in "working conditions") for settlement predictions based on the traditional linear elastic approach. Indeed, the use of a site investigation method, such as the DMT, that improves the accuracy of design parameters reduces risk, and the design can then center on the site's true soil variability without parasitic test variability. In this respect, Failmezger et al. (1999, 2015) suggested introducing the Beta probability distribution, which provides a realistic and useful description of variability for geotechnical design problems. The paper estimates Beta probability distributions at research sites where DMT tests and observed settlements are available. References: Failmezger, R.A., Rom, D., Ziegler, S.R. (1999). "SPT? A better approach of characterizing residual soils using other in-situ tests", Behavioral Characteristics of Residual Soils, B
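
    A minimal sketch of the Beta-distribution step, with invented predicted/observed settlement ratios standing in for a real case-history compilation:

```python
from scipy import stats

# Hypothetical predicted/observed settlement ratios from DMT case histories
# (illustrative values, not Monaco et al.'s compilation)
ratios = [0.85, 0.92, 1.02, 0.78, 1.10, 0.95, 1.05, 0.88, 0.99, 1.15]

# Fit a Beta distribution on the bounded support [0.5, 1.5]; the bounded
# support is what makes Beta attractive for geotechnical variability
a, b, loc, scale = stats.beta.fit(ratios, floc=0.5, fscale=1.0)

# Probability that a ratio falls below 1, i.e. that the DMT prediction
# underestimates the observed settlement
p_under = float(stats.beta.cdf(1.0, a, b, loc=loc, scale=scale))
```

Design decisions can then be stated in probabilistic terms (e.g. a settlement value exceeded with a chosen small probability) rather than as a single deterministic prediction.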

  20. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching.

    PubMed

    Joyce, B; Moxley, R A

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis.

  2. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.

  3. Using fire tests for quantitative risk analysis

    SciTech Connect

    Ling, W.C.T.; Williamson, R.B.

    1980-03-01

    Fires can be considered a causal chain-of-events in which the growth and spread of fire may cause damage and injury if it is rapid enough to overcome the barriers placed in its way. Fire tests for fire resistance of the barriers can be used in a quantitative risk assessment. The fire growth and spread is modelled in a State Transition Model (STM). The fire barriers are presented as part of the Fire Protection Model (FPM) which is based on a portion of the NFPA Decision Tree. An Emergency Equivalent Network is introduced to couple the Fire Growth Model (FGM) and the FPM so that the spread of fire beyond the room-of-origin can be computed. An example is presented in which a specific building floor plan is analyzed to obtain the shortest expected time for fire to spread between two points. To obtain the probability and time for each link in the network, data from the results of fire tests were used. These results were found to be lacking and new standards giving better data are advocated.
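
    Computing the shortest expected time for fire to spread over such a network is a shortest-path problem; a sketch with a hypothetical floor plan, with edge weights standing in for expected barrier failure times (the STM/FPM coupling itself is not reproduced):

```python
import heapq

def shortest_expected_time(graph, src, dst):
    """Dijkstra's algorithm over a fire-spread network whose edge weights
    are expected barrier failure times (minutes); returns the shortest
    expected time for fire to spread from src to dst."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

# Hypothetical floor plan: rooms as nodes, expected barrier failure times as edges
graph = {
    "origin": [("corridor", 12.0), ("room_b", 30.0)],
    "corridor": [("room_b", 8.0), ("stair", 15.0)],
    "room_b": [("stair", 10.0)],
}
t = shortest_expected_time(graph, "origin", "stair")
```

In the paper's scheme the edge times and probabilities would come from standard fire-resistance test data rather than being assumed.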

  4. Using Applied Behaviour Analysis as Standard Practice in a UK Special Needs School

    ERIC Educational Resources Information Center

    Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan

    2015-01-01

    This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…

  5. Says Who?: Students Apply Their Critical-Analysis Skills to Fight Town Hall

    ERIC Educational Resources Information Center

    Trimarchi, Ruth

    2002-01-01

    For some time the author looked for a tool to let students apply what they are learning about critical analysis in the science classroom to a relevant life experience. The opportunity occurred when a proposal to use environmentally friendly cleaning products in town buildings appeared on the local town meeting agenda. Using a copy of the proposal…

  6. Applied Behavior Analysis in Autism Spectrum Disorders: Recent Developments, Strengths, and Pitfalls

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Rieske, Robert; Tureck, Kimberly; Matson, Michael L.

    2012-01-01

    Autism has become one of the most heavily researched topics in the field of mental health and education. While genetics has been the most studied of all topics, applied behavior analysis (ABA) has also received a great deal of attention, and has arguably yielded the most promising results of any research area to date. The current paper provides a…

  7. Lovaas Model of Applied Behavior Analysis. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2010

    2010-01-01

    The "Lovaas Model of Applied Behavior Analysis" is a type of behavioral therapy that initially focuses on discrete trials: brief periods of one-on-one instruction, during which a teacher cues a behavior, prompts the appropriate response, and provides reinforcement to the child. Children in the program receive an average of 35 to 40 hours of…

  8. Applying Socio-Identity Analysis to Counseling Practice and Preparation: A Review of Four Techniques.

    ERIC Educational Resources Information Center

    Johnson, Samuel D., Jr.

    1990-01-01

    Reviews four training strategies for applying socioidentity analysis to multicultural counseling; the Clarification Group (C Group); the Personal Dimensions of Difference Self-Inventory (PDD); the Multifactor Needs Assessment; and the Cultural Grid. Each highlights a slightly different aspect of the complex matrix of relationships that define the…

  9. A Case Study in the Misrepresentation of Applied Behavior Analysis in Autism: The Gernsbacher Lectures

    PubMed Central

    Morris, Edward K

    2009-01-01

    I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life. (Tolstoy, 1894) This article presents a case study in the misrepresentation of applied behavior analysis for autism based on Morton Ann Gernsbacher's presentation of a lecture titled “The Science of Autism: Beyond the Myths and Misconceptions.” Her misrepresentations involve the characterization of applied behavior analysis, descriptions of practice guidelines, reviews of the treatment literature, presentations of the clinical trials research, and conclusions about those trials (e.g., children's improvements are due to development, not applied behavior analysis). The article also reviews applied behavior analysis' professional endorsements and research support, and addresses issues in professional conduct. It ends by noting the deleterious effects that misrepresenting any research on autism (e.g., biological, developmental, behavioral) have on our understanding and treating it in a transdisciplinary context. PMID:22478522

  10. Applied Behavior Analysis in the Treatment of Severe Psychiatric Disorders: A Bibliography.

    ERIC Educational Resources Information Center

    Scotti, Joseph R.; And Others

    Clinical research in the area of severe psychiatric disorders constituted the major focus for the discipline of applied behavior analysis during the early 1960s. Recently, however, there appears to be a notable lack of a behavioral focus within many inpatient psychiatric settings and a relative dearth of published behavioral treatment studies with…

  11. Graphical and Numerical Descriptive Analysis: Exploratory Tools Applied to Vietnamese Data

    ERIC Educational Resources Information Center

    Haughton, Dominique; Phong, Nguyen

    2004-01-01

    This case study covers several exploratory data analysis ideas, the histogram and boxplot, kernel density estimates, the recently introduced bagplot--a two-dimensional extension of the boxplot--as well as the violin plot, which combines a boxplot with a density shape plot. We apply these ideas and demonstrate how to interpret the output from these…

  12. State of the art in benefit-risk analysis: introduction.

    PubMed

    Verhagen, H; Tijhuis, M J; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken, G; Pohjola, M V; Tuomisto, J T; Ueland, Ø; White, B C; Holm, F

    2012-01-01

    Risk-taking is normal in everyday life if there are associated (perceived) benefits. Benefit-Risk Analysis (BRA) compares the risk of a situation to its related benefits and addresses the acceptability of the risk. Over the past years BRA in relation to food and food ingredients has gained attention. Food, and even the same food ingredient, may confer both beneficial and adverse effects. Measures directed at food safety may lead to suboptimal or insufficient levels of ingredients from a benefit perspective. In BRA, benefits and risks of food (ingredients) are assessed in one go and may conditionally be expressed into one currency. This allows the comparison of adverse and beneficial effects to be qualitative and quantitative. A BRA should help policy-makers to make more informed and balanced benefit-risk management decisions. Not allowing food benefits to occur in order to guarantee food safety is a risk management decision much the same as accepting some risk in order to achieve more benefits. BRA in food and nutrition is making progress, but difficulties remain. The field may benefit from looking across its borders to learn from other research areas. The BEPRARIBEAN project (Best Practices for Risk-Benefit Analysis: experience from out of food into food; http://en.opasnet.org/w/Bepraribean) aims to do so, by working together with Medicines, Food Microbiology, Environmental Health, Economics & Marketing-Finance and Consumer Perception. All perspectives are reviewed and subsequently integrated to identify opportunities for further development of BRA for food and food ingredients. Interesting issues that emerge are the varying degrees of risk that are deemed acceptable within the areas and the trend towards more open and participatory BRA processes. A set of 6 'state of the art' papers covering the above areas and a paper integrating the separate (re)views are published in this volume. PMID:21679738

  14. Approach to proliferation risk assessment based on multiple objective analysis framework

    SciTech Connect

    Andrianov, A.; Kuptsov, I.

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.
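
    A minimal sketch of the kind of multi-criteria scoring such an assessment can rest on; the criteria names, weights, and scenario scores below are hypothetical illustrations, not values from the paper:

```python
# Weighted-sum multi-criteria scoring: each proliferation scenario is rated
# against several criteria, and a weighted sum gives its "attractiveness".
# All criteria, weights, and scores are hypothetical.

CRITERIA_WEIGHTS = {               # weights must sum to 1.0
    "material_quality": 0.40,
    "technical_difficulty": 0.35,  # higher score = easier for the proliferator
    "detectability": 0.25,         # higher score = harder to detect
}

def attractiveness(scores: dict) -> float:
    """Weighted-sum aggregation over normalized [0, 1] criterion scores."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

scenarios = {
    "clandestine_enrichment": {"material_quality": 0.9,
                               "technical_difficulty": 0.3,
                               "detectability": 0.6},
    "diversion_from_fuel_cycle": {"material_quality": 0.5,
                                  "technical_difficulty": 0.7,
                                  "detectability": 0.4},
}

# Rank scenarios from most to least attractive for a proliferator.
ranked = sorted(scenarios, key=lambda s: attractiveness(scenarios[s]), reverse=True)
for name in ranked:
    print(f"{name}: {attractiveness(scenarios[name]):.3f}")
```

    A real assessment would add many more criteria and, as in the paper, couple this scoring with multi-objective optimization rather than a single weighted sum.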

  15. Credibility analysis of risk classes by generalized linear model

    NASA Astrophysics Data System (ADS)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, the generalized linear model (GLM) and credibility theory, which are frequently used in non-life insurance pricing, are combined for credibility analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the results for the credible risk classes are interpreted.
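
    The limited fluctuation side of this can be sketched as follows; the tolerance k, probability p, claim count, and rates below are hypothetical, and only the classical full-credibility standard and square-root partial-credibility rule are shown:

```python
from statistics import NormalDist

def full_credibility_standard(p=0.90, k=0.05):
    """Expected claim count needed for full credibility of a Poisson claim
    frequency: the observed frequency should lie within +/-k of the true
    mean with probability p (classical limited-fluctuation standard)."""
    z = NormalDist().inv_cdf((1 + p) / 2)
    return (z / k) ** 2

def credibility_factor(n_claims, p=0.90, k=0.05):
    """Square-root rule for partial credibility: Z = min(1, sqrt(n / n_full))."""
    return min(1.0, (n_claims / full_credibility_standard(p, k)) ** 0.5)

n_full = full_credibility_standard()   # roughly 1082 expected claims
Z = credibility_factor(400)            # partial credibility with 400 claims
# Credibility-weighted frequency: blend class experience with the manual rate.
premium = Z * 0.12 + (1 - Z) * 0.10
print(round(n_full), round(Z, 3), round(premium, 4))
```

    In the paper the class frequencies come from a fitted GLM rather than raw averages; the blending step is the same.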

  16. Ecological food web analysis for chemical risk assessment.

    PubMed

    Preziosi, Damian V; Pastorok, Robert A

    2008-12-01

    Food web analysis can be a critical component of ecological risk assessment, yet it has received relatively little attention among risk assessors. Food web data are currently used in modeling bioaccumulation of toxic chemicals and, to a limited extent, in the determination of the ecological significance of risks. Achieving more realism in ecological risk assessments requires new analysis tools and models that incorporate accurate information on key receptors in a food web paradigm. Application of food web analysis in risk assessments demands consideration of: 1) different kinds of food webs; 2) definition of trophic guilds; 3) variation in food webs with habitat, space, and time; and 4) issues for basic sampling design and collection of dietary data. The different kinds of food webs include connectance webs, materials flow webs, and functional (or interaction) webs. These three kinds of webs play different roles throughout various phases of an ecological risk assessment, but risk assessors have failed to distinguish among web types. When modeling food webs, choices must be made regarding the level of complexity for the web, assignment of species to trophic guilds, selection of representative species for guilds, use of average diets, the characterization of variation among individuals or guild members within a web, and the spatial and temporal scales/dynamics of webs. Integrating exposure and effects data in ecological models for risk assessment of toxic chemicals relies on coupling food web analysis with bioaccumulation models (e.g., Gobas-type models for fish and their food webs), wildlife exposure models, dose-response models, and population dynamics models. PMID:18703218

  17. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of the specific country or region. We propose in this study the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, processes and their characteristics, modeling approaches, as well as incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, consisting of ArcGIS as the central base, complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be newly developed; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flows, rockfalls, landslides, avalanches and floods are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future, further processes and scales can be included and the instrument thus adapted to any study site.

  18. Predictive Validity of Pressure Ulcer Risk Assessment Tools for Elderly: A Meta-Analysis.

    PubMed

    Park, Seong-Hi; Lee, Young-Shin; Kwon, Young-Mi

    2016-04-01

    Preventing pressure ulcers is one of the most challenging goals for today's health care providers. Currently used tools that assess the risk of pressure ulcer development are rarely evaluated for predictive accuracy, especially in older adults. The current study aimed at providing a systematic review and meta-analysis of 29 studies using three pressure ulcer risk assessment tools: the Braden, Norton, and Waterlow Scales. The overall predictive validities of pressure ulcer risks in the pooled sensitivity and specificity indicated a similar range with a moderate accuracy level in all three scales, while heterogeneity showed more than 80% variability among studies. The studies applying the Braden Scale used five different cut-off points, representing the primary cause of heterogeneity. Results indicate that commonly used screening tools for pressure ulcer risk have limitations regarding validity and accuracy for use with older adults due to heterogeneity among studies.

  19. Hazmat transport: a methodological framework for the risk analysis of marshalling yards.

    PubMed

    Cozzani, Valerio; Bonvicini, Sarah; Spadoni, Gigliola; Zanelli, Severino

    2007-08-17

    A methodological framework was outlined for the comprehensive risk assessment of marshalling yards in the context of quantified area risk analysis. Three accident typologies were considered for yards: (i) "in-transit-accident-induced" releases; (ii) "shunting-accident-induced" spills; and (iii) "non-accident-induced" leaks. A specific methodology was developed for the assessment of expected release frequencies and equivalent release diameters, based on the application of HazOp and Fault Tree techniques to reference schemes defined for the more common types of railcar vessels used for "hazmat" transportation. The approach was applied to the assessment of an extended case-study. The results evidenced that "non-accident-induced" leaks in marshalling yards represent an important contribution to the overall risk associated to these zones. Furthermore, the results confirmed the considerable role of these fixed installations to the overall risk associated to "hazmat" transportation.
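
    As an illustration of how Fault Tree techniques combine basic-event data into a release frequency, here is a minimal sketch; the gate structure and event values are invented for illustration and do not reproduce the reference schemes of the paper:

```python
# Fault-tree evaluation sketch: a top-event (release) probability is built
# from independent basic events through OR and AND gates. All basic-event
# values are hypothetical, expressed per shunting operation.

def gate_or(*probs):
    """P(A or B or ...) for independent events, exact complement form."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def gate_and(*probs):
    """P(A and B and ...) for independent events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical basic events for a railcar tank vessel:
valve_leak      = 1e-4
gasket_failure  = 5e-5
coupling_impact = 2e-3
operator_error  = 1e-2

# "Non-accident-induced" leak: valve leak OR gasket failure.
leak = gate_or(valve_leak, gasket_failure)
# "Shunting-accident-induced" spill: coupling impact AND operator error.
spill = gate_and(coupling_impact, operator_error)

top_event = gate_or(leak, spill)
print(f"leak={leak:.2e}  spill={spill:.2e}  top={top_event:.2e}")
```

    In the paper these frequencies are derived from HazOp-informed fault trees per vessel type and then combined with equivalent release diameters in the area risk analysis.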

  1. Risk Analysis and Decision Making FY 2013 Milestone Report

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward; Thompson, J.

    2013-06-01

    Risk analysis and decision making is one of the critical objectives of CCSI, which seeks to use information from science-based models with quantified uncertainty to inform decision makers who are making large capital investments. The goal of this task is to develop tools and capabilities to facilitate the development of risk models tailored for carbon capture technologies, quantify the uncertainty of model predictions, and estimate the technical and financial risks associated with the system. This effort aims to reduce costs by identifying smarter demonstrations, which could accelerate development and deployment of the technology by several years.

  2. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    PubMed

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs on these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low risk, moderate risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data was compared to an expert 'map' of risk-perception, to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases. PMID:24402720

  3. Comparative analysis of health risk assessments for municipal waste combustors

    SciTech Connect

    Levin, A.; Fratt, D.B.; Leonard, A.; Bruins, R.J.F.; Fradkin, L.

    1991-01-01

    Quantitative health risk assessments have been performed for a number of proposed municipal waste combustor (MWC) facilities over the past several years. The article presents the results of a comparative analysis of a total of 21 risk assessments, focusing on seven of the most comprehensive methodologies. The analysis concentrates on stack emissions of noncriteria pollutants and is comparative rather than critical in nature. Overall, the risk assessment methodologies used were similar whereas the assumptions and input values used varied from study to study. Some of the variability results directly from differences in site-specific characteristics, but much of it is due to absence of data, lack of field validation, lack of specific guidelines from regulatory agencies, and reliance on professional judgment. The results indicate that carcinogenic risks are more significant than chronic non-carcinogenic risks. In most instances polychlorodibenzodioxins, polychlorodibenzofurans, and cadmium contribute more significantly to the total carcinogenic risk from MWC stack emissions than other contaminants. In addition, the contribution to total risk of all indirect routes of exposure (ingestion and dermal contact) exceeds that of the direct inhalation route for most studies reviewed.

  4. Comparative analysis of health risk assessments for municipal waste combustors.

    PubMed

    Levin, A; Fratt, D B; Leonard, A; Bruins, R J; Fradkin, L

    1991-01-01

    Quantitative health risk assessments have been performed for a number of proposed municipal waste combustor (MWC) facilities over the past several years. This article presents the results of a comparative analysis of a total of 21 risk assessments, focusing on seven of the most comprehensive methodologies. The analysis concentrates on stack emissions of noncriteria pollutants and is comparative rather than critical in nature. Overall, the risk assessment methodologies used were similar whereas the assumptions and input values used varied from study to study. Some of this variability results directly from differences in site-specific characteristics, but much of it is due to absence of data, lack of field validation, lack of specific guidelines from regulatory agencies, and reliance on professional judgment. The results indicate that carcinogenic risks are more significant than chronic non-carcinogenic risks. In most instances polychlorodibenzodioxins, polychlorodibenzofurans, and cadmium contribute more significantly to the total carcinogenic risk from MWC stack emissions than other contaminants. In addition, the contribution to total risk of all indirect routes of exposure (ingestion and dermal contact) exceeds that of the direct inhalation route for most studies reviewed.

  6. Risk analysis. HIV / AIDS country profile: Mozambique.

    PubMed

    1996-12-01

    Mozambique's National STD/AIDS Control Program (NACP) estimates that, at present, about 8% of the population is infected with human immunodeficiency virus (HIV). The epidemic is expected to peak in 1997. By 2001, Mozambique is projected to have 1,650,000 HIV-positive adults 15-49 years of age, of whom 500,000 will have developed acquired immunodeficiency syndrome (AIDS), and 500,000 AIDS orphans. Incidence rates are highest in the country's central region, the transport corridors, and urban centers. The rapid spread of HIV has been facilitated by extreme poverty, the social upheaval and erosion of traditional norms created by years of political conflict and civil war, destruction of the primary health care infrastructure, growth of the commercial sex work trade, and labor migration to and from neighboring countries with high HIV prevalence. Moreover, about 10% of the adult population suffers from sexually transmitted diseases (STDs), including genital ulcers. NACP, created in 1988, is attempting to curb the further spread of HIV through education aimed at changing high-risk behaviors and condom distribution to prevent STD transmission. Theater performances and radio/television programs are used to reach the large illiterate population. The integration of sex education and STD/AIDS information in the curricula of primary and secondary schools and universities has been approved by the Ministry of Education. Several private companies have been persuaded to distribute condoms to their employees. Finally, the confidentiality of HIV patients has been guaranteed. In 1993, the total AIDS budget was US $1.67 million, 50% of which was provided by the European Union. The European Commission seeks to develop a national strategy for managing STDs within the primary health care system.

  7. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated using pilots' experience at the time when they were included into the standard documents. As a result, some of these standards may be overestimated while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is no published evidence, however, for the estimation of the safety level provided by the existing OLS standards. Moreover, the rationale used by the ICAO to establish the existing OLS standards is not readily available in the standard documents. Therefore this study attempts to collect actual flight path data using information provided by air traffic control radars and to construct a methodology to assess the probability of aircraft deviating from their intended/protected path. An extension of the developed methodology can be used to estimate the OLS dimensions that provide an acceptable safety level for aircraft operations. This will be helpful for estimating safe and efficient standard dimensions of the OLS and assessing the risk level posed by objects to aircraft operations around airports. In order to assess the existing standards and show the applications of the methodology, three case studies were conducted using aircraft data collected from the Ottawa (CYOW), Calgary (CYYC) and Edmonton (CYEG) International Airports.
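
    The core of such a methodology, estimating the probability that an aircraft deviates beyond a protected surface from observed track data, can be sketched as follows (the lateral deviations are made-up stand-ins for radar data, and the normal model is an assumption):

```python
from statistics import NormalDist, mean, stdev

# Hypothetical lateral deviations (metres) of departing aircraft from the
# intended track, as would be extracted from air traffic control radar.
deviations_m = [-42, -15, -8, -3, 0, 2, 5, 9, 14, 18, 25, 31, -20, 7, 11]

# Fit a normal model to the observed deviations, then express the risk as
# the probability of penetrating a surface at a given lateral offset.
model = NormalDist(mean(deviations_m), stdev(deviations_m))
surface_offset_m = 90.0  # hypothetical lateral distance of the OLS
p_penetrate = 1 - model.cdf(surface_offset_m)
print(f"P(deviation > {surface_offset_m:g} m) = {p_penetrate:.2e}")
```

    Inverting this calculation, solving for the offset at which the exceedance probability equals a target safety level, is what would yield revised OLS dimensions.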

  9. Hypertension and Risk of Cataract: A Meta-Analysis

    PubMed Central

    Yu, Xiaoning; Lyu, Danni; Dong, Xinran; He, Jiliang; Yao, Ke

    2014-01-01

    Background Cataract is the major cause of blindness across the world. Many epidemiologic studies indicated that hypertension might play an important role in the development of cataract, while others not. We therefore conducted this meta-analysis to determine the relationship between risk of cataract and hypertension. Methods Retrieved studies on the association of hypertension with cataract risk were collected from PubMed, Web of Science and the Cochrane Library during June 2014 and were included into the final analysis according to the definite inclusion criteria. Odds ratio (OR) or risk ratio (RR) were pooled with 95% confidence interval (CI) to evaluate the relationship between hypertension and cataract risk. Subgroup analyses were carried out on the basis of cataract type, race and whether studies were adjusted for main components of metabolic syndrome (MS). Results The final meta-analysis included 25 studies (9 cohort, 5 case-control and 11 cross-sectional) from 23 articles. The pooled results showed that cataract risk in populations with hypertension significantly increased among cohort studies (RR 1.08; 95% CI: 1.05–1.12) and case-control or cross-sectional studies (OR 1.28; 95% CI: 1.12–1.45). This association was proved to be true among both Mongolians and Caucasians, and the significance was not altered by the adjustment of main components of MS. Subgroup analysis on cataract types indicated that an increased incidence of posterior subcapsular cataract (PSC) resulted among cohort studies (RR 1.22; 95% CI: 1.03–1.46) and cross-sectional/case-control studies (OR 1.23; 95% CI: 1.09–1.39). No association of hypertension with risk of nuclear cataract was found. Conclusions The present meta-analysis suggests that hypertension increases the risk of cataract, especially PSC. Further efforts should be made to explore the potential biological mechanisms. PMID:25474403
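
    The pooling step behind such OR/RR summaries can be sketched as a fixed-effect inverse-variance average of log risk ratios; the three studies below are invented numbers, not the 25 studies of this meta-analysis:

```python
import math

def pool_fixed_effect(log_rrs, ses):
    """Inverse-variance fixed-effect pooling of log risk ratios.
    Returns the pooled RR and its 95% confidence interval."""
    weights = [1 / se**2 for se in ses]                      # w_i = 1 / SE_i^2
    pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    se_pooled = (1 / sum(weights)) ** 0.5
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    return math.exp(pooled), (math.exp(lo), math.exp(hi))

# Three hypothetical studies: log(RR) estimates and their standard errors.
rr, (lo, hi) = pool_fixed_effect([0.08, 0.05, 0.12], [0.03, 0.05, 0.06])
print(f"pooled RR = {rr:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

    With the more-than-80% heterogeneity reported above, a random-effects model (e.g. DerSimonian-Laird, which inflates each variance by an estimated between-study component) would be the appropriate refinement.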

  10. Risk analysis of 222Rn gas received from East Anatolian Fault Zone in Turkey

    NASA Astrophysics Data System (ADS)

    Yilmaz, Mucahit; Kulahci, Fatih

    2016-06-01

    In this study, risk analysis and probability distribution methodologies are applied to 222Rn gas data from the Sürgü (Malatya) station located on the East Anatolian Fault Zone (EAFZ). The 222Rn data were recorded between 21.02.2007 and 06.06.2010, and a total of 1151 222Rn measurements are used in the study. Changes in the 222Rn concentration are modeled statistically.
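
    A minimal sketch of this kind of probability-distribution risk analysis, fitting a lognormal to concentration data and reporting an exceedance probability; the simulated data and the 100 Bq/m^3 threshold are assumptions, not the station's measurements:

```python
import math
import random
from statistics import NormalDist, mean, stdev

random.seed(42)
# Simulated radon concentrations (Bq/m^3) standing in for the 1151
# measurements recorded at the station.
data = [random.lognormvariate(3.0, 0.5) for _ in range(1151)]

# Fit a lognormal by estimating the mean and stdev of log-concentrations,
# then express the risk as the probability of exceeding a threshold.
logs = [math.log(x) for x in data]
fitted = NormalDist(mean(logs), stdev(logs))
threshold = 100.0  # hypothetical action level
p_exceed = 1 - fitted.cdf(math.log(threshold))
print(f"P(Rn > {threshold:g} Bq/m^3) = {p_exceed:.4f}")
```

    The lognormal is a common first choice for radon concentrations; a goodness-of-fit check against the empirical distribution would precede any real risk statement.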

  11. INDEPENDENT COMPONENT ANALYSIS (ICA) APPLIED TO LONG BUNCH BEAMS IN THE LOS ALAMOS PROTON STORAGE RING

    SciTech Connect

    Kolski, Jeffrey S.; Macek, Robert J.; McCrady, Rodney C.; Pang, Xiaoying

    2012-05-14

    Independent component analysis (ICA) is a powerful blind source separation (BSS) method. Compared to the typical BSS method, principal component analysis (PCA), which is the BSS foundation of the well known model independent analysis (MIA), ICA is more robust to noise, coupling, and nonlinearity. ICA of turn-by-turn beam position data has been used to measure the transverse betatron phase and amplitude functions, dispersion function, linear coupling, sextupole strength, and nonlinear beam dynamics. We apply ICA in a new way to slices along the bunch and discuss the source signals identified as betatron motion and longitudinal beam structure.

  12. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.
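
    One possible shape for such a semi-quantitative decision-support calculation is to score each risk and benefit item as magnitude times probability and compare the totals; the items and scores below are invented and do not reproduce the paper's maize case study:

```python
# Semi-quantitative risk-benefit sketch: each item is scored on ordinal
# magnitude (1-4) and probability (1-4) scales, and the products are summed
# per side. All items and scores are hypothetical.

def score(items):
    """Sum of magnitude x probability over all items on one side."""
    return sum(mag * prob for mag, prob in items.values())

risks = {
    "gene_flow_to_landraces": (2, 2),   # (magnitude, probability)
    "non_target_insect_harm": (2, 1),
}
benefits = {
    "reduced_insecticide_use": (3, 4),
    "yield_protection":        (3, 3),
}

r, b = score(risks), score(benefits)
verdict = "benefits outweigh risks" if b > r else "risks outweigh benefits"
print(f"risk={r}, benefit={b}: {verdict}")
```

    The appeal of this style of tool for resource-limited regulators is that the inputs are expert judgments on coarse scales, while the arithmetic stays transparent and auditable.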

  14. 2007 Wholesale Power Rate Case Initial Proposal : Risk Analysis Study.

    SciTech Connect

    United States. Bonneville Power Administration.

    2005-11-01

    The Federal Columbia River Power System (FCRPS), operated on behalf of the ratepayers of the PNW by BPA and other Federal agencies, faces many uncertainties during the FY 2007-2009 rate period. Among these uncertainties, the largest revolve around hydro conditions, market prices and river operations for fish recovery. In order to provide a high probability of making its U.S. Treasury payments, BPA performs a Risk Analysis as part of its rate-making process. In this Risk Analysis, BPA identifies key risks, models their relationships, and then analyzes their impacts on net revenues (total revenues less expenses). BPA subsequently evaluates in the ToolKit Model the Treasury Payment Probability (TPP) resulting from the rates, risks, and risk mitigation measures described here and in the Wholesale Power Rate Development Study (WPRDS). If the TPP falls short of BPA's standard, additional risk mitigation revenues, such as PNRR and CRAC revenues, are incorporated in the modeling in ToolKit until the TPP standard is met. Increased wholesale market price volatility and six years of drought have significantly changed the profile of risk and uncertainty facing BPA and its stakeholders. These present new challenges for BPA in its effort to keep its power rates as low as possible while fully meeting its obligations to the U.S. Treasury. As a result, the risk BPA faces in crediting secondary revenues to power rates before those funds are actually received is greater. In addition to market price volatility, BPA also faces uncertainty around the financial impacts of operations for fish programs in FY 2006 and in the FY 2007-2009 rate period. A new Biological Opinion or possible court-ordered change to river operations in FY 2006 through FY 2009 may reduce BPA's net revenues included in the Initial Proposal. Finally, the FY 2007-2009 risk analysis includes new operational risks as well as a more comprehensive analysis of non-operating risks.
Both the operational
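
    The TPP calculation described above can be sketched as a Monte Carlo count of how often simulated net revenue covers the Treasury payment; every distribution and dollar figure below is illustrative, not BPA's actual modeling:

```python
import random

random.seed(1)

def treasury_payment_probability(n_trials=100_000):
    """Fraction of Monte Carlo trials in which net revenue (revenues less
    expenses) covers the Treasury payment. All figures are hypothetical."""
    treasury_payment = 900.0  # $M
    made_payment = 0
    for _ in range(n_trials):
        hydro_revenue = random.gauss(2500.0, 400.0)     # hydro/market uncertainty
        secondary_revenue = random.gauss(600.0, 250.0)  # secondary sales risk
        expenses = random.gauss(2000.0, 150.0)          # incl. fish operations
        if hydro_revenue + secondary_revenue - expenses >= treasury_payment:
            made_payment += 1
    return made_payment / n_trials

tpp = treasury_payment_probability()
print(f"TPP = {tpp:.3f}")
```

    Risk mitigation tools such as PNRR or a CRAC would enter this sketch as additional revenue terms that are switched on in the trials where net revenue falls short, raising the TPP toward the standard.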

  15. [Competitive karate and the risk of HIV infection--review, risk analysis and risk minimizing strategies].

    PubMed

    Müller-Rath, R; Mumme, T; Miltner, O; Skobel, E

    2004-03-01

    Bleeding facial injuries are not uncommon in competitive karate. Nevertheless, the risk of an infection with HIV is extremely low. Guidelines for the prevention of HIV infections are presented. Especially in contact sports and martial arts, athletes, judges and staff have to recognize and apply these recommendations. Bleeding wounds of the hands due to contact with the opponent's teeth can be minimized by fist padding.

  16. Bayes Method Plant Aging Risk Analysis

    1992-03-13

    DORIAN is an integrated package for performing Bayesian aging analysis of reliability data, e.g. for identifying trends in component failure rates and/or outage durations as a function of time. The user must specify several alternative hypothesized aging models (i.e. possible trends) along with prior probabilities indicating the subjective probability that each trend is actually the correct one. DORIAN then uses component failure and/or repair data over time to update these prior probabilities and develop a posterior probability for each aging model, representing the probability that each model is the correct one in light of the observed data rather than a priori. Mean, median, and 5th and 95th percentile trends are also compiled from the posterior probabilities.
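
    The Bayesian update described here, from prior probabilities over hypothesized aging models to posteriors in light of observed failures, can be sketched as follows; the two rate models, the priors, and the failure counts are made up for illustration and are not DORIAN's models or data:

```python
import math

# Two hypothetical aging models for a component failure rate lambda(t):
# "constant" (no aging) and "linear" (rate grows with time).
models = {
    "constant": lambda t: 0.5,
    "linear":   lambda t: 0.2 + 0.1 * t,
}
priors = {"constant": 0.5, "linear": 0.5}

# Observed failure counts in years t = 1..5 (made-up data with a mild trend).
observations = [(1, 0), (2, 1), (3, 1), (4, 1), (5, 2)]

def poisson_loglik(rate_fn):
    """Log-likelihood of the observed counts under a Poisson model
    with yearly rate rate_fn(t)."""
    ll = 0.0
    for t, k in observations:
        lam = rate_fn(t)
        ll += k * math.log(lam) - lam - math.lgamma(k + 1)
    return ll

# Bayes update: posterior proportional to prior x likelihood, then normalize.
unnorm = {m: priors[m] * math.exp(poisson_loglik(f)) for m, f in models.items()}
total = sum(unnorm.values())
posteriors = {m: v / total for m, v in unnorm.items()}
print(posteriors)
```

    With these invented counts the data shift belief toward the aging (linear) model; percentile trends would then be read off the posterior-weighted mixture of models.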

  17. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie Project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g., dams, dikes, and check-dams), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives will lead to risk reduction under different future scenarios. The SDSS was developed based on open source software, following open standards for code as well as for data formats and service interfaces. The architecture of the system is modular: the various parts are loosely coupled, extensible, standards-based for interoperability, flexible, and web-based. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis, with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to

  18. Japanese Encephalitis Risk and Contextual Risk Factors in Southwest China: A Bayesian Hierarchical Spatial and Spatiotemporal Analysis

    PubMed Central

    Zhao, Xing; Cao, Mingqin; Feng, Hai-Huan; Fan, Heng; Chen, Fei; Feng, Zijian; Li, Xiaosong; Zhou, Xiao-Hua

    2014-01-01

    It is valuable to study the spatiotemporal pattern of Japanese encephalitis (JE) and its association with the contextual risk factors in southwest China, which is the most endemic area in China. Using data from 2004 to 2009, we applied GIS mapping and spatial autocorrelation analysis to analyze reported incidence data of JE in 438 counties in southwest China, finding that JE cases were not randomly distributed, and a Bayesian hierarchical spatiotemporal model identified the east part of southwest China as a high risk area. Meanwhile, the Bayesian hierarchical spatial model in 2006 demonstrated a statistically significant association between JE and the agricultural and climatic variables, including the proportion of rural population, the pig-to-human ratio, the monthly precipitation and the monthly mean minimum and maximum temperatures. Particular emphasis was placed on the time-lagged effect for climatic factors. The regression method and the Spearman correlation analysis both identified a two-month lag for the precipitation, while the regression method found a one-month lag for temperature. The results show that the high risk area in the east part of southwest China may be connected to the agricultural and climatic factors. The routine surveillance and the allocation of health resources should be given more attention in this area. Moreover, the meteorological variables might be considered as possible predictors of JE in southwest China. PMID:24739769
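The time-lagged screen described above (correlating month-t climate with month-t+lag incidence) can be sketched with a plain Spearman correlation. The monthly series below are invented for illustration, arranged so that incidence follows precipitation by two months:

```python
def spearman(x, y):
    """Spearman rank correlation (assumes no tied values, for brevity)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

def lagged_correlation(exposure, incidence, lag):
    """Correlate month-t exposure (e.g., precipitation) with
    month-(t+lag) incidence, the screen described in the abstract."""
    return spearman(exposure[:len(exposure) - lag], incidence[lag:])

# Hypothetical monthly series: incidence tracks precipitation 2 months later.
precip = [10, 30, 80, 120, 200, 180, 90, 40, 20, 12, 6, 4]
cases  = [ 2,  3,  5,  15,  40,  60, 95, 85, 45, 18, 8, 4]
best = max(range(4), key=lambda lag: lagged_correlation(precip, cases, lag))
```

Scanning lags 0 through 3, the correlation peaks at a two-month lag for these invented data, mirroring the finding reported for precipitation.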

  19. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  20. DMAICR in an ergonomic risks analysis.

    PubMed

    Santos, E F; Lima, C R C

    2012-01-01

    The DMAICR problem-solving methodology is used throughout this paper to show how to implement ergonomics recommendations. The method consists of six steps, D-M-A-I-C-R, by which ergonomic design problems can be solved. In step D, the project or the situation to be assessed and its guiding objectives, known as the demand, are defined. Step M relates to the work, tasks, and organizational protocols, and also includes the need for measuring. Step A is the analysis itself. Step I is the moment of improving or incrementing. In step C, control, prevention of prospective troublesome situations and implementation of management are the activities controlling the situation. Step R is the report. Some relevant technical and conceptual aspects for the comparison of these methodologies are illustrated in this paper. The steps of DMAICR were taken by a multifunctional (multi-professional and multi-disciplinary) team, termed a focus group, composed of selected members of the company and supported by experts in ergonomics.

  1. Overreliance on a single study: there is no real evidence that applying quality criteria to exposure in asbestos epidemiology affects the estimated risk.

    PubMed

    Berman, D Wayne; Case, Bruce W

    2012-10-01

    A critical need exists for reliable risk management policies and practices that can effectively mitigate asbestos-related health threats, and such policies and practices need to be based on sound science that adequately distinguishes hazardous situations from those that are not. Toward that end, the disparate means by which study quality has been addressed in recent meta-analyses used to establish potency factors (K_L and K_M values) for asbestos cancer risks were compared by conducting additional sensitivity analyses. Results suggest that, other than placing undue emphasis on the influence of the K_L and K_M values reported from a single study, there appears to be little to no evidence of a systematic effect of study quality on K_L or K_M values; none of the findings warrant excluding studies from current or future meta-analyses. Thus, we argue that it is better to include as much of the available data as possible in these analyses while formally addressing uncertainty as part of the analysis itself, rather than sequentially excluding studies based on one type of limitation or another. Throwing out data without clearly proving some type of bias is never a good idea because it will limit both the power to test various hypotheses and the confidence that can be placed in any findings that are derived from the resulting, truncated data set. We also believe that it is better to identify the factors that contribute to variation between studies included in a meta-analysis and, by adjusting for such factors as part of a model, showing that the disparate values from individual studies can be reconciled. If such factors are biologically reasonable (based on other evidence) and, if such a model can be shown to fit the data from all studies in the meta-analysis, the model is likely to be predictive of the parameters being evaluated and can then be applied to new (unstudied) environments.

  2. State of the art in benefit-risk analysis: food and nutrition.

    PubMed

    Tijhuis, M J; de Jong, N; Pohjola, M V; Gunnlaugsdóttir, H; Hendriksen, M; Hoekstra, J; Holm, F; Kalogeras, N; Leino, O; van Leeuwen, F X R; Luteijn, J M; Magnússon, S H; Odekerken, G; Rompelberg, C; Tuomisto, J T; Ueland, Ø; White, B C; Verhagen, H

    2012-01-01

    Benefit-risk assessment in food and nutrition is relatively new. It weighs the beneficial and adverse effects that a food (component) may have, in order to facilitate more informed management decisions regarding public health issues. It is rooted in the recognition that good food and nutrition can improve health and that some risk may be acceptable if benefit is expected to outweigh it. This paper presents an overview of current concepts and practices in benefit-risk analysis for food and nutrition. It aims to facilitate scientists and policy makers in performing, interpreting and evaluating benefit-risk assessments. Historically, the assessments of risks and benefits have been separate processes. Risk assessment is mainly addressed by toxicology, as demanded by regulation. It traditionally assumes that a maximum safe dose can be determined from experimental studies (usually in animals) and that applying appropriate uncertainty factors then defines the 'safe' intake for human populations. There is a minor role for other research traditions in risk assessment, such as epidemiology, which quantifies associations between determinants and health effects in humans. These effects can be both adverse and beneficial. Benefit assessment is newly developing in regulatory terms, but has been the subject of research for a long time within nutrition and epidemiology. The exact scope is yet to be defined. Reductions in risk can be termed benefits, but also states rising above 'the average health' are explored as benefits. In nutrition, current interest is in 'optimal' intake; from a population perspective, but also from a more individualised perspective. In current approaches to combine benefit and risk assessment, benefit assessment mirrors the traditional risk assessment paradigm of hazard identification, hazard characterization, exposure assessment and risk characterization. Benefit-risk comparison can be qualitative and quantitative. In a quantitative comparison, benefits

  3. Clean Air Act Title III accidental emission release risk management program, and how it applies to landfills

    SciTech Connect

    Hibbard, C.S.

    1999-07-01

    On June 20, 1996, EPA promulgated regulations pursuant to Title III of the Clean Air Act (CAA) Amendments of 1990 (Section 112(r)(7) of the CAA). The rule, contained in 40 CFR Part 68, is called Accidental Release Prevention Requirements: Risk Management Programs, and is intended to improve accident prevention and emergency response practices at facilities that store and/or use hazardous substances. Methane is a designated highly hazardous chemical (HHC) under the rule. The rule applies to facilities that have 10,000 pounds of methane or more in any process, roughly equivalent to about 244,000 cubic feet of methane. The US EPA has interpreted this threshold quantity as applying to landfill gas within landfills. This paper presents an overview of the Accidental Release Prevention regulations, and how landfills are affected by the requirements. This paper describes methodologies for calculating the threshold quantity of landfill gas in a landfill. Methane is in landfill gas as a mixture. Because landfill gas can burn readily, down to concentrations of about five percent methane, the entire landfill gas mixture must be treated as the regulated substance, and counts toward the 10,000-pound threshold. It is reasonable to assume that the entire landfill gas collection system, active or passive, is filled with landfill gas, and that a calculation of the volume of the system would be a calculation of the landfill gas present in the process on the site. However, the US EPA has indicated that there are some instances in which pore space gas should be included in this calculation. This paper presents methods available to calculate the amount of pore space gas in a landfill, and how to determine how much of that gas might be available for an explosion. The paper goes through how to conduct the release assessment to determine the worst-case hazard zone around the landfill.
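The threshold-quantity arithmetic (converting the 10,000-pound methane threshold to an equivalent gas volume) follows from the ideal gas law. A hedged sketch: the conditions assumed below (60 degF, 1 atm) are an assumption, and the exact volume depends on them, which is presumably why this calculation gives roughly 237,000 rather than exactly the 244,000 cubic feet quoted above.

```python
# Converting the 10,000-lb methane threshold to a gas volume via the
# ideal gas law. Temperature/pressure conditions are assumed, not taken
# from the rule.
LB_TO_G = 453.59237       # grams per pound
M_CH4 = 16.043            # g/mol, molar mass of methane
R = 0.0831446             # L*bar/(mol*K), ideal gas constant
T_K = 288.71              # 60 degF expressed in kelvin
P_BAR = 1.01325           # 1 atm expressed in bar
L_PER_FT3 = 28.3168       # liters per cubic foot

mass_lb = 10_000.0        # 40 CFR Part 68 threshold quantity for methane
moles = mass_lb * LB_TO_G / M_CH4
volume_ft3 = moles * R * T_K / P_BAR / L_PER_FT3
```

For a landfill, the same volume figure would then be compared against the gas collection system volume (plus, per EPA's interpretation, pore space gas in some cases).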

  4. Addressing challenging behaviour in children with Down syndrome: the use of applied behaviour analysis for assessment and intervention.

    PubMed

    Feeley, Kathleen M; Jones, Emily A

    2006-09-01

    Children with Down syndrome are at an increased risk for engaging in challenging behaviour that may be part of a behavioural phenotype characteristic of Down syndrome. The methodology of applied behaviour analysis has been demonstrated effective with a wide range of challenging behaviours, across various disabilities. Applications to children with Down syndrome and the examination of behaviourally based strategies to specifically address the unique characteristics of children with Down syndrome are limited. However, there are several studies in which a subset of the participants did have Down syndrome. A handful of these studies are reviewed within the context of functional behaviour assessment and Positive Behavioural Supports. Drawing from these studies and the behavioural literature, as well as the authors' clinical experience and research, suggestions regarding early intervention for challenging behaviour with children with Down syndrome are provided.

  5. Key Attributes of the SAPHIRE Risk and Reliability Analysis Software for Risk-Informed Probabilistic Applications

    SciTech Connect

    Curtis Smith; James Knudsen; Kellie Kvarfordt; Ted Wood

    2008-08-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis methods and solution methods build upon pioneering work done 30 to 40 years ago. We contrast this work with the current capabilities of the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  6. Documentary analysis of risk-assessment and safety-planning policies and tools in a mental health context.

    PubMed

    Higgins, Agnes; Doyle, Louise; Morrissey, Jean; Downes, Carmel; Gill, Ailish; Bailey, Sive

    2016-08-01

    Despite the articulated need for policies and processes to guide risk assessment and safety planning, limited guidance exists on the processes or procedures to be used to develop such policies, and there is no body of research that examines the quality or content of the risk-management policies developed. The aim of the present study was to analyse the policies of risk and safety management used to guide mental health nursing practice in Ireland. A documentary analysis was performed on 123 documents received from 22 of the 23 directors of nursing contacted. Findings from the analysis revealed a wide variation in how risk, risk assessment, and risk management were defined. Emphasis within the risk documentation submitted was on risk related to self and others, with minimal attention paid to other types of risks. In addition, there was limited evidence of recovery-focused approaches to positive risk taking that involved service users and their families within the risk-related documentation. Many of the risk-assessment tools had not been validated, and lacked consistency or guidance in relation to how they were to be used or applied. The tick-box approach and absence of space for commentary within documentation have the potential to impact severely on the quality of information collected and documented, and subsequent clinical decision-making. Managers, and those tasked with ensuring safety and quality, need to ensure that policies and processes are, where possible, informed by best evidence and are in line with national mental health policy on recovery.

  7. Applying behavior analysis to clinical problems: review and analysis of habit reversal.

    PubMed Central

    Miltenberger, R G; Fuqua, R W; Woods, D W

    1998-01-01

    This article provides a review and analysis of habit reversal, a multicomponent procedure developed by Azrin and Nunn (1973, 1974) for the treatment of nervous habits, tics, and stuttering. The article starts with a discussion of the behaviors treated with habit reversal, behavioral covariation among habits, and functional analysis and assessment of habits. Research on habit reversal and simplified versions of the procedure is then described. Next the article discusses the limitations of habit reversal and the evidence for its generality. The article concludes with an analysis of the behavioral processes involved in habit reversal and suggestions for future research. PMID:9757583

  8. A Study in the Founding of Applied Behavior Analysis Through Its Publications

    PubMed Central

    Morris, Edward K.; Altus, Deborah E.; Smith, Nathaniel G.

    2013-01-01

    This article reports a study of the founding of applied behavior analysis through its publications. Our methods included (a) hand searches of sources (e.g., journals, reference lists), (b) search terms (i.e., early, applied, behavioral, research, literature), (c) inclusion criteria (e.g., the field's applied dimension), and (d) challenges to their face and content validity. Our results were 36 articles published between 1959 and 1967 that we organized into 4 groups: 12 in 3 programs of research and 24 others. Our discussion addresses (a) limitations in our method (e.g., the completeness of our search), (b) challenges to the validity of our methods and results (e.g., convergent validity), and (c) priority claims about the field's founding. We conclude that the claims are irresolvable because identification of the founding publications depends significantly on methods and because the field's founding was an evolutionary process. We close with suggestions for future research. PMID:25729133

  9. Risk analysis of prospects vs. plays: The play's the thing!

    SciTech Connect

    Rose, P.R.

    1996-09-01

    The most difficult and crucial decision in petroleum exploration is not which prospect to drill, but rather, which new play to enter. Such a decision, whether ultimately profitable or not, commits the organization to years of involvement, expenditures of millions of dollars, and hundreds of man-years of effort. Even though uncertainties and risks are high, organizations commonly make the new-play decision in a disjointed, non-analytic, even superficial way. The economic consequences of a bad play choice can be disastrous. Using established principles of prospect risk analysis, modern petroleum exploration organizations routinely assign economic value to individual prospects, but they actually operate via exploration programs in plays and trends. Accordingly, the prospect is the economic unit of exploration, whereas the play is the operational unit. Plays can be successfully analyzed as full-cycle economic risk ventures, however, using many principles of prospect risk analysis. Economic measures such as Expected Present Value, DCFROR, etc., apply equally to plays or prospects. The predicted field-size distribution of the play is analogous to the forecast prospect reserves distribution. Economic truncation applies to both. Variance of play reserves is usually much greater than for prospect reserves. Geologic chance factors such as P_reservoir, P_generation, etc., must be distinguished as independent or shared among prospects in the play, and they should be defined so as to apply as well to the play as to its constituent prospects. They are analogous to multiple objectives on a prospect, and are handled differently in performing the risk analysis.
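The distinction between shared and independent geologic chance factors can be made concrete with a small Monte Carlo sketch. All probabilities and dollar figures below are hypothetical; the point is only that a shared factor (one draw per play) is resolved once for every prospect, which is what drives the larger variance of play outcomes.

```python
import random

def play_value(n_prospects=10, n_trials=50_000, seed=1):
    """Toy full-cycle play economics: a shared geologic chance factor
    (e.g., source-rock generation) succeeds or fails once for the whole
    play, while reservoir chance is drawn independently per prospect.
    Returns the simulated expected value in hypothetical $MM."""
    rng = random.Random(seed)
    p_generation = 0.5      # shared: one draw per play
    p_reservoir = 0.4       # independent: one draw per prospect
    reserve_value = 30.0    # value of one discovery, $MM
    well_cost = 5.0         # cost per prospect drilled, $MM
    total = 0.0
    for _ in range(n_trials):
        play_works = rng.random() < p_generation
        value = -well_cost * n_prospects
        if play_works:
            for _ in range(n_prospects):
                if rng.random() < p_reservoir:
                    value += reserve_value
        total += value
    return total / n_trials

epv = play_value()
```

With these assumed numbers the expected value is -50 + 0.5 x 10 x 0.4 x 30 = 10 $MM, and the simulation should land near that; half of all trials lose the full drilling budget because the shared factor fails for the entire play at once.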

  10. An improved method for risk evaluation in failure modes and effects analysis of CNC lathe

    NASA Astrophysics Data System (ADS)

    Rachieru, N.; Belu, N.; Anghel, D. C.

    2015-11-01

    Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by the multiplication of crisp values of the risk factors, such as the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O and D. A new risk assessment system based on fuzzy set theory and fuzzy rule base theory is applied to assess and rank risks associated with failure modes that could appear in the functioning of the Turn 55 Lathe CNC. Two case studies are presented to demonstrate the methodology. A parallel is drawn between the results obtained by the traditional method and by fuzzy logic for determining the RPNs. The results show that the proposed approach can reduce duplicated RPN numbers and produce a more accurate, reasonable risk assessment. As a result, the stability of product and process can be assured.
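The tie-breaking benefit of fuzzy ratings over crisp RPNs can be illustrated with a deliberately simplified sketch. The paper's method uses a fuzzy rule base; here, triangular fuzzy numbers are defuzzified by centroid before multiplication, an assumption made purely for brevity.

```python
def crisp_rpn(severity, occurrence, detection):
    """Traditional FMEA risk priority number: RPN = S x O x D (each 1-10)."""
    return severity * occurrence * detection

def triangular_centroid(tfn):
    """Centroid (defuzzified value) of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def fuzzy_rpn(s_tfn, o_tfn, d_tfn):
    """Simplified stand-in for a fuzzy FMEA rating: rate S, O, D as
    triangular fuzzy numbers, defuzzify by centroid, then combine.
    Shows why richer ratings break the ties that crisp RPNs create."""
    return (triangular_centroid(s_tfn)
            * triangular_centroid(o_tfn)
            * triangular_centroid(d_tfn))

# Two failure modes with identical crisp RPNs but distinguishable fuzzy ones.
rpn_a = crisp_rpn(6, 5, 4)
rpn_b = crisp_rpn(4, 5, 6)
frpn_a = fuzzy_rpn((5, 6, 8), (4, 5, 6), (3, 4, 5))
frpn_b = fuzzy_rpn((3, 4, 5), (4, 5, 6), (5, 6, 7))
```

The crisp values tie at 120, so the two modes cannot be ranked; the fuzzy ratings, by capturing asymmetric expert uncertainty around the same nominal scores, separate them.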

  11. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis

    PubMed Central

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can make variant analysis better. There are existing tools like HotNet2 and dmGWAS that can provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f). PMID:27158457

  12. Rocky Flats Plant Live-Fire Range Risk Analysis Report

    SciTech Connect

    Nicolosi, S.L.; Rodriguez, M.A.

    1994-04-01

    The objective of the Live-Fire Range Risk Analysis Report (RAR) is to provide an authorization basis for operation as required by DOE 5480.16. The existing Live-Fire Range does not have a safety analysis-related authorization basis. EG&G Rocky Flats, Inc. has worked with DOE and its representatives to develop a format and content description for development of an RAR for the Live-Fire Range. Development of the RAR is closely aligned with development of the design for a baffle system to control risks from errant projectiles. DOE 5480.16 requires either an RAR or a safety analysis report (SAR) for live-fire ranges. An RAR rather than a SAR was selected in order to gain flexibility to more closely address the safety analysis and conduct of operation needs for a live-fire range in a cost-effective manner.

  13. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    McVeigh, J.; Cohen, J.; Vorum, M.; Porro, G.; Nix, G.

    2007-03-01

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program ('the Program'). The analysis is a task by Princeton Energy Resources International, LLC (PERI), in support of the National Renewable Energy Laboratory (NREL) on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE). This requires both computational development (i.e., creating a spreadsheet-based analysis tool) and a synthesis of judgments by a panel of researchers and experts of the expected results of the Program's R&D.
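The kind of spreadsheet-style probabilistic tool described above (propagating uncertain technology inputs through to an LCOE distribution) can be sketched as follows. The fixed-charge-rate cost formula is a textbook simplification, and every parameter range below is hypothetical, not the Program's data.

```python
import random

def lcoe_distribution(n_trials=20_000, seed=7):
    """Toy probabilistic LCOE analysis: sample uncertain plant parameters,
    push each draw through a simplified fixed-charge-rate LCOE formula,
    and report percentiles of the resulting distribution ($/MWh)."""
    rng = random.Random(seed)
    fcr = 0.10                                     # fixed charge rate, 1/yr
    samples = []
    for _ in range(n_trials):
        capex = rng.triangular(2500, 5000, 3500)   # $/kW (low, high, mode)
        fixed_om = rng.uniform(100, 200)           # $/kW-yr
        capacity_factor = rng.uniform(0.80, 0.95)
        mwh_per_kw_yr = 8760 * capacity_factor / 1000
        samples.append((capex * fcr + fixed_om) / mwh_per_kw_yr)
    samples.sort()
    return {"p10": samples[n_trials // 10],
            "p50": samples[n_trials // 2],
            "p90": samples[9 * n_trials // 10]}

dist = lcoe_distribution()
```

R&D progress would be modeled here by tightening or shifting the input distributions (e.g., lowering the capex mode), and the technical-risk question becomes how much the LCOE percentiles move in response.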

  14. Risk analysis and priority setting for environmental policy

    SciTech Connect

    Travis, C.C.

    1991-01-01

    There is a growing realization that the demand for funding to correct our nation's environmental problems will soon outstrip available resources. In the hazardous waste area alone, the estimated cost of remediating Superfund sites ranges from $32 billion to $80 billion. Numerous other areas are competing for these same financial resources. These include ozone depletion, global warming, the protection of endangered species and wetlands, toxic air pollution, carcinogenic pesticides, and urban smog. In response to this imbalance in the supply and demand for national funds, several political constituencies are calling for the use of risk assessment as a tool in the prioritization of research and budget needs. Comparative risk analysis offers a logical framework in which to organize information about complex environmental problems. Risk analysis allows policy analysts to make resource allocation decisions on the basis of scientific judgement rather than political expediency.

  16. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support.

    PubMed

    Anderson, Cynthia M; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  18. Geospatial analysis applied to epidemiological studies of dengue: a systematic review.

    PubMed

    Oliveira, Maria Aparecida de; Ribeiro, Helena; Castillo-Salgado, Carlos

    2013-12-01

    A systematic review of the geospatial analysis methods used in the dengue fever studies published between January 2001 and March 2011 was undertaken. In accordance with specific selection criteria, thirty-five studies were selected for inclusion in the review. The aim was to assess the types of spatial methods that have been used to analyze dengue transmission. We found twenty-one different methods that had been used in dengue fever epidemiological studies in that period, three of which were most frequently used. The results show that few articles had applied spatial analysis methods in dengue fever studies; however, whenever they were applied, they contributed to a better understanding of dengue fever geospatial diffusion.

  19. A free and open source QGIS plugin for flood risk analysis: FloodRisk

    NASA Astrophysics Data System (ADS)

    Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo

    2016-04-01

    An analysis of global statistics shows a substantial increase in flood damage over the past few decades. Moreover, it is expected that flood risk will continue to rise due to the combined effect of increasing numbers of people and economic assets in risk-prone areas and the effects of climate change. In order to increase the resilience of European economies and societies, the improvement of risk assessment and management has been pursued in recent years. This has resulted in a wide range of flood analysis models of different complexities, with substantial differences in the underlying components needed for their implementation, as geographical, hydrological and social differences demand specific approaches in the different countries. At present, there is an emerging need to promote the creation of open, transparent, reliable and extensible tools for comprehensive, context-specific and applicable flood risk analysis. In this context, the free and open-source Quantum GIS (QGIS) plugin "FloodRisk" is a good starting point to address this objective. The vision of the developers of this free and open source software (FOSS) is to combine the main features of state-of-the-art science, collaboration, transparency and interoperability in an initiative to assess and communicate flood risk worldwide and to assist authorities in facilitating the quality and fairness of flood risk management at multiple scales. Among the scientific community, this type of activity can be labelled as "participatory research", intended as adopting a set of techniques that "are interactive and collaborative" and reproducible, "providing a meaningful research experience that both promotes learning and generates knowledge and research data through a process of guided discovery" (Albano et al., 2015). Moreover, this FOSS geospatial approach can lower the financial barriers to understanding risks at national and sub-national levels through a spatio-temporal domain and can provide better and more complete

  20. Characterization and evaluation of uncertainty in probabilistic risk analysis

    SciTech Connect

    Parry, G.W.; Winter, P.W.

    1981-01-01

    The sources of uncertainty in probabilistic risk analysis are discussed, using the event- and fault-tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which are, at present, unquantifiable using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events, and a short review is given with some discussion on the representation of ignorance.
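The rare-event argument for Bayesian statistics is easy to make concrete with a conjugate Beta-binomial update: with zero observed failures, a classical point estimate collapses to zero, while the posterior mean stays small but positive. The Jeffreys-style Beta(0.5, 0.5) prior below is an assumed choice, not one taken from the paper.

```python
def beta_posterior(failures, demands, a_prior=0.5, b_prior=0.5):
    """Conjugate Bayesian update for a rare-event probability: a Beta
    prior on the per-demand failure probability, updated with binomial
    evidence. Returns the posterior Beta parameters and its mean."""
    a = a_prior + failures
    b = b_prior + demands - failures
    return a, b, a / (a + b)

# Zero failures in 500 demands: the classical point estimate is exactly 0,
# whereas the posterior mean remains nonzero, reflecting residual
# uncertainty about a rare event that simply has not been observed yet.
a, b, post_mean = beta_posterior(failures=0, demands=500)
mle = 0 / 500
```

This is the practical payoff the abstract points to: for rare events, the Bayesian machinery yields a usable, nonzero probability with quantified uncertainty, where the frequentist estimate is degenerate.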