Science.gov

Sample records for applying risk analysis

  1. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    PubMed

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-01-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares four different risk analysis tools developed for confined spaces by applying them to three hazardous scenarios. The tools were: (1) a checklist without risk estimation (Tool A); (2) a checklist with a risk scale (Tool B); (3) a risk calculation without a formal hazard identification stage (Tool C); and (4) a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of the more analytic tools in less time. Their main limitations were a lack of contextual information for the identified hazards and a greater dependency on the user's expertise and ability to tackle hazards of different natures. Tools C and D used more systematic approaches than Tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of (1) its comprehensive structure with respect to the steps suggested in risk management, (2) its dynamic approach to hazard identification, and (3) its use of data resulting from the risk analysis.
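
    The record does not reproduce the tools themselves, but the Tool D pattern (a hazard questionnaire feeding a risk matrix) is easy to illustrate. A minimal sketch, assuming illustrative four-point severity/likelihood scales and risk bands that are not the authors':

```python
# Sketch of a "questionnaire + risk matrix" tool in the style of Tool D.
# The scales, bands, and hazards below are illustrative assumptions.

SEVERITY = {"minor": 1, "serious": 2, "major": 3, "fatal": 4}
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3, "frequent": 4}

def risk_level(severity: str, likelihood: str) -> str:
    """Map a (severity, likelihood) pair to a qualitative risk band."""
    score = SEVERITY[severity] * LIKELIHOOD[likelihood]
    if score >= 12:
        return "intolerable"
    if score >= 6:
        return "to be reduced"
    return "tolerable"

# One questionnaire item per hazard identified in the confined space.
hazards = [
    ("oxygen deficiency", "major", "possible"),
    ("engulfment", "fatal", "rare"),
]
for name, severity, likelihood in hazards:
    print(f"{name}: {risk_level(severity, likelihood)}")
```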

  2. A Hygrothermal Risk Analysis Applied to Residential Unvented Attics

    SciTech Connect

    Pallin, Simon B; Kehrer, Manfred

    2013-01-01

    A residential building constructed with an unvented attic is a common roof assembly in the United States. The expected hygrothermal performance and service life of the roof are difficult to estimate due to a number of varying parameters. Typical parameters expected to vary are the climate, direction, and slope of the roof, as well as the radiation properties of the surface material. Further influential parameters are indoor moisture excess, air leakage through the attic floor, and leakage from the air-handling unit and ventilation ducts. In addition, the choice of building materials, such as the insulation material and closed- or open-cell spray polyurethane foam, will influence the future performance of the roof. Developing a simulation model of the roof assembly enables a risk and sensitivity analysis in which the varying parameters most important to the hygrothermal performance can be determined. The model is designed to perform probabilistic simulations using mathematical and hygrothermal calculation tools. The varying input parameters can be chosen from existing measurements, simulations, or standards. An analysis is applied to determine the risk of consequences such as mold growth, rot, or increased energy demand of the HVAC unit. Furthermore, the future performance of the roof can be simulated in different climates to facilitate the design of an efficient and reliable roof construction with the most suitable technical solution, and to determine the most appropriate building materials for a given climate.
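
    A minimal Monte Carlo sketch of the probabilistic approach the record describes. The input distributions and the simple mold-growth response below are placeholders standing in for the study's measurement-based distributions and hygrothermal solver:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000  # Monte Carlo realisations of the attic assembly

# Varying inputs drawn from assumed distributions (placeholders).
moisture_excess = rng.normal(3.0, 1.0, N)   # indoor moisture excess, g/m3
leakage = rng.lognormal(0.0, 0.5, N)        # attic-floor air leakage factor
absorptivity = rng.uniform(0.6, 0.9, N)     # roof surface solar absorptivity

# Placeholder response: a mold-growth index rising with moisture load and
# leakage, falling with absorptivity (a darker, warmer roof dries faster).
# The actual study would call a hygrothermal calculation tool here.
mold_index = 0.8 * moisture_excess + 1.5 * leakage - 2.0 * absorptivity

threshold = 3.0  # assumed critical mold-growth index
print(f"P(mold risk) = {np.mean(mold_index > threshold):.2%}")

# Crude sensitivity ranking: correlation of each input with the output.
for name, x in [("moisture", moisture_excess), ("leakage", leakage),
                ("absorptivity", absorptivity)]:
    print(name, round(float(np.corrcoef(x, mold_index)[0, 1]), 2))
```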

  3. Downside Risk analysis applied to the Hedge Funds universe

    NASA Astrophysics Data System (ADS)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires appropriate risk metrics. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the simplicity of the CAPM, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures on the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
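
    The record does not list the indicators, but the standard downside deviation and the Sortino-type ratio built on it capture the good-versus-bad-returns distinction it describes. A sketch on synthetic fat-tailed returns; the target return and return distribution are assumptions:

```python
import numpy as np

def downside_risk(returns, target=0.0):
    """Downside deviation: RMS of shortfalls below the investor's goal."""
    shortfall = np.minimum(returns - target, 0.0)
    return np.sqrt(np.mean(shortfall ** 2))

def sortino_ratio(returns, target=0.0):
    """Sharpe-like ratio that penalises only below-target ('bad') returns."""
    return (np.mean(returns) - target) / downside_risk(returns, target)

rng = np.random.default_rng(1)
# Skewed, fat-tailed monthly returns as a stand-in for hedge fund data.
r = 0.01 + 0.02 * rng.standard_t(df=4, size=240) \
    - 0.005 * rng.exponential(size=240)

print(f"downside risk     : {downside_risk(r, target=0.005):.4f}")
print(f"Sortino-type ratio: {sortino_ratio(r, target=0.005):.2f}")
```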

  4. An advanced method for flood risk analysis in river deltas, applied to societal flood fatality risks in the Netherlands

    NASA Astrophysics Data System (ADS)

    de Bruijn, K. M.; Diermanse, F. L. M.; Beckers, J. V. L.

    2014-02-01

    This paper discusses a new method developed to analyse flood risks in river deltas. Risk analysis of river deltas is complex, because both storm surges and river discharges may cause flooding, and because the effect of upstream breaches on downstream water levels and flood risks must be taken into account. A Monte Carlo-based flood risk analysis framework for policy making was developed, which considers both storm surges and river flood waves and includes hydrodynamic interaction effects on flood risks. It was applied to analyse societal flood fatality risks (the probability of events with more than N fatalities) in the Rhine-Meuse delta.

  5. An advanced method for flood risk analysis in river deltas, applied to societal flood fatality risk in the Netherlands

    NASA Astrophysics Data System (ADS)

    de Bruijn, K. M.; Diermanse, F. L. M.; Beckers, J. V. L.

    2014-10-01

    This paper discusses a new method for flood risk assessment in river deltas. Flood risk analysis of river deltas is complex, because both storm surges and river discharges may cause flooding and the effect of upstream breaches on downstream water levels and flood risk must be taken into account. This paper presents a Monte Carlo-based flood risk analysis framework for policy making, which considers both storm surges and river flood waves and includes effects from hydrodynamic interaction on flood risk. It was applied to analyse societal flood fatality risk in the Rhine-Meuse delta.
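
    A minimal sketch of the societal-risk output described in both records: Monte Carlo event years feeding an FN curve, the annual probability of an event with at least N fatalities. The breach probability and fatality distribution are invented placeholders for the framework's hydrodynamic and consequence models:

```python
import numpy as np

rng = np.random.default_rng(2)
years = 100_000  # synthetic Monte Carlo years

# Assumed event model: a breach occurs in a given year with small
# probability, and fatalities then follow a heavy-tailed distribution.
breach = rng.random(years) < 0.01
fatalities = np.where(breach, rng.pareto(1.5, years) * 10.0, 0.0)

# Societal risk (FN) curve: annual probability of >= N fatalities.
for n in [1, 10, 100, 1_000, 10_000]:
    p = float((fatalities >= n).mean())
    print(f"P(>= {n:5d} fatalities) = {p:.2e} per year")
```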

  6. Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas

    NASA Astrophysics Data System (ADS)

    Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.

    In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas beside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. The simulated model results are, of course, affected by errors and uncertainties. Possible sources of uncertainty are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated flood extent to the observed extent, taken from airborne pictures), the uncertainty of the essential input data set (the digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties concerning the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum DEM resolution required for flood simulation and the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. Socio-economic information and monetary transfer functions required for a damage risk analysis in particular show high uncertainty. This study therefore helps to analyse the weak points of the flood risk and damage risk assessment procedure.
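
    A minimal sketch of the DEM-uncertainty step: perturb a (here synthetic) elevation model with an assumed vertical error and take the per-cell flooding probability as the fraction of Monte Carlo runs in which the cell lies below the water level plain:

```python
import numpy as np

rng = np.random.default_rng(3)
dem = rng.uniform(48.0, 55.0, size=(200, 200))  # synthetic elevations, m
water_level = 50.0                              # water level plain, m
sigma = 0.5   # assumed vertical DEM error (m), from DEM quality statistics

runs = 500
flooded_count = np.zeros(dem.shape)
for _ in range(runs):
    noisy = dem + rng.normal(0.0, sigma, dem.shape)
    flooded_count += (noisy < water_level)

flood_probability = flooded_count / runs  # per-cell flooding probability
print("cells flooded with p > 0.5:", int((flood_probability > 0.5).sum()))
```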

  7. Hazard analysis and critical control point systems applied to public health risks: the example of seafood.

    PubMed

    Williams, R A; Zorn, D J

    1997-08-01

    The authors describe the way in which the two components of risk analysis--risk assessment and risk management--can be used in conjunction with the hazard analysis and critical control points concept to determine the allocation of resources at potential critical control points. This approach is examined in the context of risks to human health associated with seafood, and in particular with regard to ciguatera poisoning.

  8. Synthesis of semantic modelling and risk analysis methodology applied to animal welfare.

    PubMed

    Bracke, M B M; Edwards, S A; Metz, J H M; Noordhuizen, J P T M; Algers, B

    2008-07-01

    Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally applied. Recently, a qualitative Risk Assessment approach has been published by the European Food Safety Authority (EFSA) for the first time, concerning the welfare of intensively reared calves. This paper reports on a critical analysis of this Risk Assessment (RA) approach from a semantic-modelling (SM) perspective, emphasizing the importance of several seemingly self-evident principles, including the definition of concepts, application of explicit methodological procedures and specification of how underlying values and scientific information lead to the RA output. In addition, the need to include positive aspects of welfare and overall welfare assessments are emphasized. The analysis shows that the RA approach for animal welfare could benefit from SM methodology to support transparent and science-based decision-making.

  9. Applying Latent Class Analysis to Risk Stratification for Perioperative Mortality in Patients Undergoing Intraabdominal General Surgery.

    PubMed

    Kim, Minjae; Wall, Melanie M; Li, Guohua

    2016-07-01

    Perioperative risk stratification is often performed using individual risk factors without consideration of the syndemic of these risk factors. We used latent class analysis (LCA) to identify the classes of comorbidities and risk factors associated with perioperative mortality in patients presenting for intraabdominal general surgery. The 2005 to 2010 American College of Surgeons National Surgical Quality Improvement Program was used to obtain a cohort of patients undergoing intraabdominal general surgery. Risk factors and comorbidities were entered into LCA models to identify the latent classes, and individuals were assigned to a class based on the highest posterior probability of class membership. Relative risk regression was used to determine the associations between the latent classes and 30-day mortality, with adjustments for procedure. A 9-class model was fit using LCA on 466,177 observations. After combining classes with similar adjusted mortality risks, 5 risk classes were obtained. Compared with the class with average mortality risk (class 4), the risk ratios (95% confidence interval) ranged from 0.020 (0.014-0.027) in the lowest risk class (class 1) to 6.75 (6.46-7.02) in the highest risk class. After adjusting for procedure and ASA physical status, the latent classes remained significantly associated with 30-day mortality. The addition of the risk class variable to a model containing ASA physical status and surgical procedure demonstrated a significant increase in the area under the receiver operator characteristic curve (0.892 vs 0.915; P < 0.0001). Latent classes of risk factors and comorbidities in patients undergoing intraabdominal surgery are predictive of 30-day mortality independent of the ASA physical status and improve risk prediction with the ASA physical status.
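
    Fitting the LCA model itself requires dedicated software, but the assignment step the abstract describes, giving each patient the class with the highest posterior membership probability, is simple to show. The posterior probabilities and outcomes below are synthetic placeholders:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000

# Posterior class-membership probabilities for 5 risk classes, as would be
# produced by a fitted LCA model (random placeholders summing to 1 per row).
posterior = rng.dirichlet(alpha=np.ones(5), size=n)

# Modal assignment: the class with the highest posterior probability.
assigned = posterior.argmax(axis=1)

# Synthetic 30-day mortality rising with class index (illustration only).
died = rng.random(n) < 0.02 * (assigned + 0.1)
for c in range(5):
    mask = assigned == c
    print(f"class {c + 1}: n={int(mask.sum()):4d}, "
          f"mortality={float(died[mask].mean()):.3f}")
```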

  10. Risk-informed criticality analysis as applied to waste packages subject to a subsurface igneous intrusion

    NASA Astrophysics Data System (ADS)

    Kimball, Darby Suzan

    Practitioners of many branches of nuclear facility safety use probabilistic risk assessment (PRA) methodology, which evaluates the reliability of a system along with the consequences of various failure states. One important exception is nuclear criticality safety, which traditionally produces binary results (critical or subcritical, based upon the value of the effective multiplication factor, keff). For complex systems, criticality safety can benefit from application of the more flexible PRA techniques. A new risk-based technique in criticality safety analysis is detailed. In addition to identifying the most reactive configuration(s) and determining subcriticality, it yields more information about the relative reactivity contributions of various factors. By analyzing a more complete system, confidence that the system will remain subcritical is increased and areas where additional safety features would be most effective are indicated. The first step in the method is to create a criticality event tree (a specialized form of event tree where multiple outcomes stemming from a single event are acceptable). The tree lists events that impact reactivity by changing a system parameter. Next, the value of keff is calculated for the end states using traditional methods like the MCNP code. As calculations progress, the criticality event tree is modified; event branches demonstrated to have little effect on reactivity may be collapsed (thus reducing the total number of criticality runs), and branches may be added if more information is needed to characterize the system. When the criticality event tree is mature, critical limits are determined according to traditional validation techniques. Finally, results are evaluated. Criticality for the system is determined by comparing the value of keff for each end state to the critical limit derived for those cases. The relative contributions of various events to criticality are identified by comparing end states resulting from different

  11. Preliminary risk analysis applied to the transmission of Creutzfeldt-Jakob disease.

    PubMed

    Bertrand, E; Schlatter, J

    2011-01-01

    Transmissible spongiform encephalopathy (TSE) is a degenerative disease of the central nervous system. As yet, there is no human screening test and no effective treatment. This disease is invariably fatal. General preventive measures are therefore essential. The objective of this study is to analyze and address on a prioritized basis the risks relating to the transmission of Creutzfeldt-Jakob disease during surgical operations by means of a preliminary risk analysis (PRA). The PRA produces 63 scenarios with maximum risk relating to operational and legal dangers. The study recommends a number of courses of action, such as training and internal controls, in order to reduce the risks identified. A procedure has been drawn up and assessed for each action. This PRA makes it possible to target and significantly reduce the potential dangers for transmission of Creutzfeldt-Jakob disease through the use of medical instruments.

  12. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  13. Risk-based analysis methods applied to nuclear power plant technical specifications

    SciTech Connect

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1989-03-01

    A computer-aided methodology and practical applications of risk-based evaluation of technical specifications are described. The methodology, developed for use by the utility industry, is a part of the overall process of improving nuclear power plant technical specifications. The SOCRATES computer program uses the results of a probabilistic risk assessment or a system-level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at the plant. The methods and computer program are an integral part of a larger decision process aimed at determining benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns and decreasing labor requirements for test and maintenance activities, with no adverse impacts on risk. The methodology and the SOCRATES computer program have been used extensively to evaluate several actual technical specifications in case studies demonstrating the methods. Summaries of these applications demonstrate the types of results achieved and the usefulness of the risk-based evaluation in improving the technical specifications.

  14. Environmental-sanitary risk analysis procedure applied to artificial turf sports fields.

    PubMed

    Ruffino, Barbara; Fiore, Silvia; Zanetti, Maria Chiara

    2013-07-01

    Owing to the extensive use of artificial turfs worldwide, over the past 10 years there has been much discussion about the possible health and environmental problems originating from styrene-butadiene recycled rubber. In this paper, the authors performed a Tier 2 environmental-sanitary risk analysis on five artificial turf sports fields located in the city of Turin (Italy) with the aid of RISC4 software. Two receptors (adult player and child player) and three routes of exposure (direct contact with crumb rubber, contact with rainwater soaking the rubber mat, inhalation of dusts and gases from the artificial turf fields) were considered in the conceptual model. For all the fields and for all the routes, the cumulative carcinogenic risk proved to be lower than 10^-6 and the cumulative non-carcinogenic risk lower than 1. The outdoor inhalation of dusts and gases was the main route of exposure for both carcinogenic and non-carcinogenic substances. The results given by the inhalation pathway were compared with those of a risk assessment carried out on citizens breathing gases and dusts from traffic emissions every day in Turin. For both classes of substances and for both receptors, the inhalation of atmospheric dusts and gases from vehicular traffic gave risk values one order of magnitude higher than those due to playing soccer on an artificial field.

  15. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    SciTech Connect

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.

  16. A review of dendrogeomorphological research applied to flood risk analysis in Spain

    NASA Astrophysics Data System (ADS)

    Díez-Herrero, A.; Ballesteros, J. A.; Ruiz-Villanueva, V.; Bodoque, J. M.

    2013-08-01

    Over the last forty years, applying dendrogeomorphology to palaeoflood analysis has improved estimates of the frequency and magnitude of past floods worldwide. This paper reviews the main results obtained by applying dendrogeomorphology to flood research in several case studies in Central Spain. These dendrogeomorphological studies focused on the following topics: (1) anatomical analysis to understand the physiological response of trees to flood damage and improve sampling efficiency; (2) compiling robust flood chronologies in ungauged mountain streams; (3) determining flow depth and estimating flood discharge using two-dimensional hydraulic modelling, and comparing them with other palaeostage indicators; (4) calibrating hydraulic model parameters (i.e. Manning roughness); and (5) implementing stochastic-based, cost-benefit analysis to select optimal mitigation measures. The progress made in these areas is presented with suggestions for further research to improve the applicability of dendrogeochronology to palaeoflood studies. Further developments will include new methods for better identification of the causes of specific types of flood damage to trees (e.g. tilted trees) or stable isotope analysis of tree rings to identify the climatic conditions associated with periods of increasing flood magnitude or frequency.

  17. How to apply the dependence structure analysis to extreme temperature and precipitation for disaster risk assessment

    NASA Astrophysics Data System (ADS)

    Feng, Jieling; Li, Ning; Zhang, Zhengtao; Chen, Xi

    2017-06-01

    IPCC reports that a changing climate can affect the frequency and the intensity of extreme events. However, the extremes appear in the tail of the probability distribution. In order to understand the relationship between extreme events in the tails of temperature and precipitation, an important but previously unobserved dependence structure is analyzed in this paper. Here, we examine the dependence structure by building a bivariate Gumbel copula joint distribution of temperature and precipitation, using monthly average temperature (T) and monthly precipitation (P) data from the Beijing station in China covering the period 1951-2015, and find that the dependence structure can be divided into two sections: the middle part and the upper tail. We show that T and P have a strong positive correlation in the upper tail section (T > 25.85 °C and P > 171.1 mm) (correlation = 0.66, p < 0.01), while they do not show the same relation in the other section, which suggests that identifying a strong influence of T on extreme P needs help from the dependence structure analysis. We also find that in the upper tail section, every 1 °C increase in T is associated with a 73.45 mm increase in P. Our results suggest that, for precipitation extremes driven by changes in temperature, the dependence structure should be included in disaster risk assessment under future climate change scenarios. A bivariate copula joint probability distribution is useful for this dependence structure analysis.
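
    A sketch of the copula step on synthetic data: rank-transform the margins to pseudo-observations, estimate Kendall's tau, and set the Gumbel parameter through the standard relation theta = 1/(1 - tau). Only the copula formulas are standard; the data generation here is an assumption:

```python
import numpy as np
from scipy.stats import kendalltau, rankdata

rng = np.random.default_rng(5)
# Synthetic stand-ins for monthly mean temperature (degC) and precipitation (mm).
T = rng.normal(12.0, 10.0, 780)                 # about 65 years of months
P = 50.0 + 4.0 * T + rng.gamma(2.0, 20.0, 780)

# Pseudo-observations: transform the margins to (0, 1) by ranks.
u = rankdata(T) / (len(T) + 1)
v = rankdata(P) / (len(P) + 1)

# Gumbel copula parameter from Kendall's tau: theta = 1 / (1 - tau).
tau, _ = kendalltau(T, P)
theta = 1.0 / (1.0 - tau)

def gumbel_cdf(u, v, theta):
    """C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta))."""
    s = (-np.log(u)) ** theta + (-np.log(v)) ** theta
    return np.exp(-s ** (1.0 / theta))

print(f"Kendall tau = {tau:.2f}, Gumbel theta = {theta:.2f}")

# Upper-tail joint exceedance: P(U > a, V > a) = 1 - 2a + C(a, a).
a = 0.95
empirical = float(np.mean((u > a) & (v > a)))
model = 1.0 - 2.0 * a + gumbel_cdf(a, a, theta)
print(f"P(U > {a}, V > {a}): empirical = {empirical:.4f}, model = {model:.4f}")
```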

  18. Risk, Uncertainty, and Decision Analysis Applied to the Management of Aquatic Nuisance Species

    DTIC Science & Technology

    2006-07-01

    ...alternative hypotheses regarding the impact of an ANS on the system... Budgets are usually limited and the number of possible studies that could be... an ecosystem and the decisions made to manage that system based on that understanding. Bayesian analysis, a fundamental concept in "value of...

  19. Applying Multiple Criteria Decision Analysis to Comparative Benefit-Risk Assessment: Choosing among Statins in Primary Prevention.

    PubMed

    Tervonen, Tommi; Naci, Huseyin; van Valkenhoef, Gert; Ades, Anthony E; Angelis, Aris; Hillege, Hans L; Postmus, Douwe

    2015-10-01

    Decision makers in different health care settings need to weigh the benefits and harms of alternative treatment strategies. Such health care decisions include marketing authorization by regulatory agencies, practice guideline formulation by clinical groups, and treatment selection by prescribers and patients in clinical practice. Multiple criteria decision analysis (MCDA) is a family of formal methods that help make explicit the tradeoffs that decision makers accept between the benefit and risk outcomes of different treatment options. Despite the recent interest in MCDA, certain methodological aspects are poorly understood. This paper presents 7 guidelines for applying MCDA in benefit-risk assessment and illustrates their use in the selection of a statin drug for the primary prevention of cardiovascular disease. We provide guidance on the key methodological issues of how to define the decision problem, how to select a set of nonoverlapping evaluation criteria, how to synthesize and summarize the evidence, how to translate relative measures to absolute ones that permit comparisons between the criteria, how to define suitable scale ranges, how to elicit partial preference information from the decision makers, and how to incorporate uncertainty in the analysis. Our example on statins indicates that fluvastatin is likely to be the most preferred drug by our decision maker and that this result is insensitive to the amount of preference information incorporated in the analysis.
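
    The paper's model itself is not reproduced in the record, but a minimal linear additive MCDA sketch shows the mechanics the guidelines cover: partial value functions over assumed scale ranges, elicited weights, and a weighted-sum ranking. The alternatives, criteria, and numbers are illustrative, not the paper's evidence:

```python
import numpy as np

alternatives = ["statin A", "statin B", "statin C"]
# Columns: CVD events avoided (per 1000), myopathy cases (per 1000),
# cost index. Only the first is a benefit (higher is better).
raw = np.array([
    [22.0, 0.4, 0.6],
    [28.0, 1.2, 0.9],
    [25.0, 0.8, 0.7],
])
higher_is_better = np.array([True, False, False])

# Linear partial value functions over the observed scale ranges.
lo, hi = raw.min(axis=0), raw.max(axis=0)
scores = (raw - lo) / (hi - lo)
scores[:, ~higher_is_better] = 1.0 - scores[:, ~higher_is_better]

weights = np.array([0.5, 0.3, 0.2])  # elicited decision-maker tradeoffs
overall = scores @ weights
for name, s in sorted(zip(alternatives, overall), key=lambda t: -t[1]):
    print(f"{name}: {s:.3f}")
```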

  1. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    PubMed

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed.

  2. Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  3. Characterising bias in regulatory risk and decision analysis: An analysis of heuristics applied in health technology appraisal, chemicals regulation, and climate change governance.

    PubMed

    MacGillivray, Brian H

    2017-08-01

    In many environmental and public health domains, heuristic methods of risk and decision analysis must be relied upon, either because problem structures are ambiguous, reliable data is lacking, or decisions are urgent. This introduces an additional source of uncertainty beyond model and measurement error - uncertainty stemming from relying on inexact inference rules. Here we identify and analyse heuristics used to prioritise risk objects, to discriminate between signal and noise, to weight evidence, to construct models, to extrapolate beyond datasets, and to make policy. Some of these heuristics are based on causal generalisations, yet can misfire when these relationships are presumed rather than tested (e.g. surrogates in clinical trials). Others are conventions designed to confer stability to decision analysis, yet which may introduce serious error when applied ritualistically (e.g. significance testing). Some heuristics can be traced back to formal justifications, but only subject to strong assumptions that are often violated in practical applications. Heuristic decision rules (e.g. feasibility rules) in principle act as surrogates for utility maximisation or distributional concerns, yet in practice may neglect costs and benefits, be based on arbitrary thresholds, and be prone to gaming. We highlight the problem of rule-entrenchment, where analytical choices that are in principle contestable are arbitrarily fixed in practice, masking uncertainty and potentially introducing bias. Strategies for making risk and decision analysis more rigorous include: formalising the assumptions and scope conditions under which heuristics should be applied; testing rather than presuming their underlying empirical or theoretical justifications; using sensitivity analysis, simulations, multiple bias analysis, and deductive systems of inference (e.g. directed acyclic graphs) to characterise rule uncertainty and refine heuristics; adopting "recovery schemes" to correct for known biases

  4. Analysis of agreement between cardiac risk stratification protocols applied to participants of a center for cardiac rehabilitation

    PubMed Central

    Santos, Ana A. S.; Silva, Anne K. F.; Vanderlei, Franciele M.; Christofaro, Diego G. D.; Gonçalves, Aline F. L.; Vanderlei, Luiz C. M.

    2016-01-01

    Background: Cardiac risk stratification is related to the risk of the occurrence of events induced by exercise. Despite the existence of several protocols to calculate risk stratification, studies indicating that there is similarity between these protocols are still unknown. Objective: To evaluate the agreement between the existing protocols on cardiac risk rating in cardiac patients. Method: The records of 50 patients from a cardiac rehabilitation program were analyzed, from which the following information was extracted: age, sex, weight, height, clinical diagnosis, medical history, risk factors, associated diseases, and the results from the most recent laboratory and complementary tests performed. This information was used for risk stratification of the patients in the protocols of the American College of Sports Medicine, the Brazilian Society of Cardiology, the American Heart Association, the protocol designed by Frederic J. Pashkow, the American Association of Cardiovascular and Pulmonary Rehabilitation, the Société Française de Cardiologie, and the Sociedad Española de Cardiología. Descriptive statistics were used to characterize the sample, and the agreement between the protocols was calculated using the Kappa coefficient. Differences were considered significant at a level of 5%. Results: Of the 21 analyses of agreement, 12 were considered significant between the protocols used for risk classification, with nine classified as moderate and three as low. No agreements were classified as excellent. Different proportions were observed in each risk category, with significant differences between the protocols for all risk categories. Conclusion: The agreements between the protocols were considered low and moderate, and the risk proportions differed between protocols. PMID:27556385
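
    A minimal sketch of the agreement statistic used: Cohen's kappa between two protocols' risk classifications, comparing observed agreement with chance-expected agreement. The two protocols' labels below are invented for illustration:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two classifications."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2
    return (observed - expected) / (1.0 - expected)

# Risk classes assigned to the same 10 patients by two protocols.
protocol_a = ["low", "low", "moderate", "high", "low",
              "moderate", "high", "low", "moderate", "high"]
protocol_b = ["low", "moderate", "moderate", "high", "low",
              "high", "high", "low", "low", "high"]
print(f"kappa = {cohens_kappa(protocol_a, protocol_b):.2f}")
```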

  5. A New Approach in Applying Systems Engineering Tools and Analysis to Determine Hepatocyte Toxicogenomics Risk Levels to Human Health.

    PubMed

    Gigrich, James; Sarkani, Shahryar; Holzer, Thomas

    2017-03-01

    There is an increasing backlog of potentially toxic compounds that cannot be evaluated with current animal-based approaches in a cost-effective and expeditious manner, thus putting human health at risk. Extrapolation of animal-based test results for human risk assessment often leads to different physiological outcomes. This article introduces the use of quantitative tools and methods from systems engineering to evaluate the risk of toxic compounds through analysis of the amount of stress that human hepatocytes undergo in vitro when metabolizing GW7647 (1) over extended times and concentrations. Hepatocytes are exceedingly connected systems, which makes it challenging to interpret the high-dimensional genomics data needed to determine the risk of exposure. Gene expression data on peroxisome proliferator-activated receptor-α (PPARα) (2) binding were measured over multiple concentrations and varied times of GW7647 exposure, leveraging the Mahalanobis distance to establish toxicity threshold risk levels. The application of these novel systems engineering tools provides new insight into the intricate workings of human hepatocytes to determine risk threshold levels from exposure. This approach is beneficial to decision makers and scientists, and it can help reduce the backlog of untested chemical compounds due to the high cost and inefficiency of animal-based models.
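
    A sketch of the distance-based thresholding idea: squared Mahalanobis distances of exposed expression profiles from a control distribution, flagged against a chi-square cutoff. The data, dimensionality, and 99th-percentile threshold are assumptions; the paper's actual pipeline may differ:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(6)
p = 5  # number of gene-expression features

# Baseline (vehicle-control) profiles and exposure-shifted profiles.
control = rng.normal(0.0, 1.0, size=(200, p))
exposed = rng.normal(1.0, 1.0, size=(20, p))

mu = control.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(control, rowvar=False))

def mahalanobis_sq(x):
    """Squared Mahalanobis distance of x from the control distribution."""
    d = x - mu
    return float(d @ cov_inv @ d)

# Under multivariate normality, in-distribution squared distances follow a
# chi-square with p degrees of freedom; use its 99th percentile as an
# assumed toxicity-risk threshold.
threshold = chi2.ppf(0.99, df=p)
flags = [mahalanobis_sq(x) > threshold for x in exposed]
print(f"samples over threshold: {sum(flags)} / {len(flags)}")
```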

  6. Applied Behavior Analysis in Education.

    ERIC Educational Resources Information Center

    Cooper, John O.

    1982-01-01

    Applied behavioral analysis in education is expanding rapidly. This article describes the dimensions of applied behavior analysis and the contributions this technology offers teachers in the area of systematic applications, direct and daily measurement, and experimental methodology. (CJ)

  7. Municipal solid waste management health risk assessment from air emissions for China by applying life cycle analysis.

    PubMed

    Li, Hua; Nitivattananon, Vilas; Li, Peng

    2015-05-01

    This study quantifies and objectively evaluates the extent of environmental health risks from three waste treatment options suggested by the national municipal solid waste management enhancing strategy (No. [2011] 9 of the State Council, promulgated on 19 April 2011), namely sanitary landfill, waste-to-energy incineration and compost, together with a material recovery facility, through a case study in Zhangqiu City, China. It addresses potential chronic health risks from air emissions to residential receptors in the impacted area. It combines field survey, analogue survey, design documents and life cycle inventory methods in defining the source strength of chemicals of potential concern. The life cycle inventory and air dispersion are modelled via the integrated waste management (IWM)-2 model and the Screening Air Dispersion Model, Version 3.0 (SCREEN3). The health risk assessment is in accordance with the United States Environmental Protection Agency guidance Risk Assessment Guidance for Superfund (RAGS), Volume I: Human Health Evaluation Manual (Part F, Supplemental Guidance for Inhalation Risk Assessment). The exposure concentration is based on long-term exposure to the maximum ground-level contaminant concentration in air under 'reasonable worst situation' emissions, and is then directly compared with the reference concentration and unit risk factor/cancer slope factor derived from the national air quality standard (for a conventional pollutant) and toxicological studies (for a specific pollutant). Results from this study suggest that the option of compost with material recovery facility treatment may pose fewer negative health impacts than the other options; the sensitivity analysis shows that the landfill integrated waste management collection rate has a great influence on the impact results. Further investigation is needed to validate or challenge the findings of this study.
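
    The cited RAGS Part F guidance characterizes inhalation risk from an exposure concentration EC = (CA x ET x EF x ED) / AT, a hazard quotient HQ = EC / RfC for noncancer effects, and an incremental cancer risk = IUR x EC. A sketch with assumed exposure parameters and toxicity values (not the study's data):

```python
def exposure_concentration(ca, et, ef, ed_years, at_years):
    """EC per RAGS Part F: CA*ET*EF*ED / AT, with AT expressed in hours.

    ca: air concentration (ug/m3); et: hours/day; ef: days/year.
    """
    return ca * et * ef * ed_years / (at_years * 365.0 * 24.0)

ca = 0.8  # assumed maximum ground-level concentration, ug/m3

# Noncancer: averaging time equals the exposure duration.
ec_nc = exposure_concentration(ca, et=24, ef=350, ed_years=30, at_years=30)
# Cancer: averaging time is a 70-year lifetime.
ec_ca = exposure_concentration(ca, et=24, ef=350, ed_years=30, at_years=70)

rfc = 2.0e-2  # assumed reference concentration, mg/m3
iur = 3.0e-5  # assumed inhalation unit risk, per ug/m3

hq = (ec_nc / 1000.0) / rfc  # convert ug/m3 to mg/m3 before dividing by RfC
risk = iur * ec_ca
print(f"hazard quotient         = {hq:.3f}  (concern if > 1)")
print(f"incremental cancer risk = {risk:.1e}  (screening level 1e-6)")
```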

  8. Applied Surface Analysis Workshop.

    DTIC Science & Technology

    1979-10-01

    ...field of surface analysis attended the Workshop. The list of participants follows, giving names and affiliations (Case Western Reserve University, the University of Dayton, and others).

  9. Risk analysis and international trade principles applied to the importation into Canada of caprine embryos from South Africa.

    PubMed

    Evans, B; Faul, A; Bielanski, A; Renwick, S; Van Derlinden, I

    1997-04-01

    Between November 1994 and February 1995, over nine thousand Boer goat embryos were imported into Canada from the Republic of South Africa. This substantial international movement of animal genetics via embryos was achieved through the application of the risk analysis principles prescribed in Section 1.4. of the International Animal Health Code of the Office International des Epizooties (OIE). Integral to the development of the health certification procedures was the application of the fundamental principles of non-discrimination, harmonisation, equivalence and transparency defined in the World Trade Organisation Agreement on Sanitary and Phytosanitary measures. Risk mitigation interventions were founded upon full consideration of the potential for disease transmission by animal embryos as espoused by the International Embryo Transfer Society and the relevant standards contained in Appendix 4.2.3.3. of the OIE International Animal Health Code. All the embryos imported into Canada were implanted into synchronised recipients on arrival. Twenty months later, there had been no evidence of disease in either the recipient animals or the resulting animals born in Canada.

  10. Applying the change vector analysis technique to assess the desertification risk in the south-west of Romania in the period 1984-2011.

    PubMed

    Vorovencii, Iosif

    2017-09-26

    The desertification risk affects around 40% of the agricultural land in various regions of Romania. The purpose of this study is to analyse the risk of desertification in the south-west of Romania in the period 1984-2011 using the change vector analysis (CVA) technique and Landsat thematic mapper (TM) satellite images. CVA was applied to combinations of normalised difference vegetation index (NDVI)-albedo, NDVI-bare soil index (BI) and tasselled cap greenness (TCG)-tasselled cap brightness (TCB). The NDVI-albedo combination proved to be the best for assessing the desertification risk, with an overall accuracy of 87.67%, identifying a desertification risk on 25.16% of the studied area. The maps were classified into the following classes: desertification risk, re-growing and persistence. Four degrees of desertification risk and re-growing were used: low, medium, high and extreme. Using the NDVI-albedo combination, 0.53% of the analysed surface was assessed as having an extreme degree of desertification risk, 3.93% a high degree, 8.72% a medium degree and 11.98% a low degree. The driving forces behind the risk of desertification are both anthropogenic and climatic. The anthropogenic causes include the destruction of the irrigation system, deforestation, the destruction of the forest shelterbelts, the fragmentation of agricultural land and its inefficient management. The climatic causes are rising temperatures, frequent and prolonged droughts and a decline in the amount of precipitation.
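
    A minimal CVA sketch on synthetic rasters: the change vector in NDVI-albedo space, with a magnitude threshold separating change from persistence, and the vector direction separating desertification risk (NDVI down, albedo up) from re-growing (NDVI up, albedo down). The no-change threshold is an assumption:

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (100, 100)
ndvi_1984 = rng.uniform(0.2, 0.7, shape)
ndvi_2011 = ndvi_1984 + rng.normal(-0.05, 0.10, shape)
albedo_1984 = rng.uniform(0.10, 0.25, shape)
albedo_2011 = albedo_1984 + rng.normal(0.02, 0.05, shape)

d_ndvi = ndvi_2011 - ndvi_1984
d_albedo = albedo_2011 - albedo_1984

magnitude = np.hypot(d_ndvi, d_albedo)   # change vector magnitude
change = magnitude > 0.05                # assumed no-change threshold
desertification = change & (d_ndvi < 0) & (d_albedo > 0)
regrowing = change & (d_ndvi > 0) & (d_albedo < 0)

print(f"desertification risk: {desertification.mean():.1%} of pixels")
print(f"re-growing:           {regrowing.mean():.1%} of pixels")
print(f"persistence:          {(~change).mean():.1%} of pixels")
```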

  11. Risk analysis before launch

    NASA Astrophysics Data System (ADS)

    Behlert, Rene

    1988-08-01

    A quality methodology is proposed based on risk analysis and observation of technical facts. The procedures for the quantification of a risk are described and examples are given. A closed-loop quality analysis is described, along with overall mission safety goals. The concept of maintenance is developed into evolutionary maintenance. It is shown that a large number of data must be processed to apply the proposed methods, so the use of computer data processing is required.

  12. Exploring Students at Risk for Reading Comprehension Difficulties in South Korea: The RTI Approach Applying Latent Class Growth Analysis

    ERIC Educational Resources Information Center

    Kim, Dongil; Kim, Woori; Koh, Hyejung; Lee, Jaeho; Shin, Jaehyun; Kim, Heeju

    2014-01-01

    The purpose of this study was to identify students at risk of reading comprehension difficulties by using the responsiveness to intervention (RTI) approach. The participants were 177 students in Grades 1-3 in three elementary schools in South Korea. The students received Tier 1 instruction of RTI from March to May 2011, and their performance was…

  13. Women in applied behavior analysis

    PubMed Central

    McSweeney, Frances K.; Donahoe, Patricia; Swindell, Samantha

    2000-01-01

    The status of women in applied behavior analysis was examined by comparing the participation of women in the Journal of Applied Behavior Analysis (JABA) to their participation in three similar journals. For all journals, the percentage of articles with at least one female author, the percentage of authors who are female, and the percentage of articles with a female first author increased from 1978 to 1997. Participation by women in JABA was equal to or greater than participation by women in the comparison journals. However, women appeared as authors on papers in special sections of Behavior Modification substantially more often when the editor was female than when the editor was male. In addition, female membership on the editorial boards of JABA, Behavior Modification, and Behaviour Research and Therapy failed to increase from 1978 to 1997. We conclude that a “glass ceiling” reduces the participation of women at the highest levels of applied behavior analysis and related fields. PMID:22478351

  14. Applying of Decision Tree Analysis to Risk Factors Associated with Pressure Ulcers in Long-Term Care Facilities

    PubMed Central

    Moon, Mikyung

    2017-01-01

    Objectives: The purpose of this study was to use decision tree analysis to explore the factors associated with pressure ulcers (PUs) among elderly people admitted to Korean long-term care facilities. Methods: The data were extracted from the 2014 National Inpatient Sample (NIS), data of the Health Insurance Review and Assessment Service (HIRA). A MapReduce-based program was implemented to join and filter 5 tables of the NIS. The outcome predicted by the decision tree model was the prevalence of PUs as defined by the Korean Standard Classification of Disease-7 (KCD-7; code L89*). Using R 3.3.1, a decision tree was generated with the finalized 15,856 cases and 830 variables. Results: The decision tree displayed 15 subgroups with 8 variables showing 0.804 accuracy, 0.820 sensitivity, and 0.787 specificity. The most significant primary predictor of PUs was length of stay less than 0.5 day. Other predictors were the presence of an infectious wound dressing, followed by having diagnoses numbering less than 3.5 and the presence of a simple dressing. Among diagnoses, "injuries to the hip and thigh" was the top predictor, ranking 5th overall. Total hospital cost exceeding 2,200,000 Korean won (US $2,000) rounded out the top 7. Conclusions: These results support previous studies that showed length of stay, comorbidity, and total hospital cost were associated with PUs. Moreover, wound dressings were commonly used to treat PUs. They also show that machine learning, such as a decision tree, could effectively predict PUs using big data. PMID:28261530
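
    The study itself used R 3.3.1 on NIS claims data; a stand-in sketch with scikit-learn shows the same workflow (fit a shallow decision tree, report accuracy, sensitivity, and specificity) on synthetic data loosely mimicking a few of the reported predictors:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import recall_score

rng = np.random.default_rng(8)
n = 15_856  # same order as the study's final sample

# Synthetic stand-ins for a few of the study's predictors.
length_of_stay = rng.exponential(5.0, n)          # days
n_diagnoses = rng.poisson(3.0, n)
infectious_dressing = (rng.random(n) < 0.1).astype(float)
X = np.column_stack([length_of_stay, n_diagnoses, infectious_dressing])

# Synthetic pressure-ulcer outcome loosely tied to the predictors.
logit = -3.0 + 0.05 * length_of_stay + 0.3 * n_diagnoses \
        + 1.5 * infectious_dressing
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)
tree = DecisionTreeClassifier(max_depth=4, class_weight="balanced",
                              random_state=0)
tree.fit(X_tr, y_tr)

pred = tree.predict(X_te)
print("accuracy   :", round(float((pred == y_te).mean()), 3))
print("sensitivity:", round(float(recall_score(y_te, pred)), 3))
print("specificity:", round(float(recall_score(y_te, pred, pos_label=0)), 3))
```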

  15. The Relative Importance of the Vadose Zone in Multimedia Risk Assessment Modeling Applied at a National Scale: An Analysis of Benzene Using 3MRA

    NASA Astrophysics Data System (ADS)

    Babendreier, J. E.

    2002-05-01

    Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidable one, particularly in regulatory settings applied on a national scale. Quantitative assessment of uncertainty and sensitivity within integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a systematic, comparative approach coupled with sufficient computational power. The Multimedia, Multipathway, and Multireceptor Risk Assessment Model (3MRA) is an important code being developed by the United States Environmental Protection Agency for use in site-scale risk assessment (e.g. hazardous waste management facilities). The model currently entails over 700 variables, 185 of which are explicitly stochastic. The 3MRA can start with a chemical concentration in a waste management unit (WMU). It estimates the release and transport of the chemical throughout the environment, and predicts associated exposure and risk. The 3MRA simulates multimedia (air, water, soil, sediments), pollutant fate and transport, multipathway exposure routes (food ingestion, water ingestion, soil ingestion, air inhalation, etc.), multireceptor exposures (resident, gardener, farmer, fisher, ecological habitats and populations), and resulting risk (human cancer and non-cancer effects, ecological population and community effects). The 3MRA collates the output for an overall national risk assessment, offering a probabilistic strategy as a basis for regulatory decisions. To facilitate model execution of 3MRA for purposes of conducting uncertainty and sensitivity analysis, a PC-based supercomputer cluster was constructed. Design of SuperMUSE, a 125 GHz Windows-based Supercomputer for Model Uncertainty and Sensitivity Evaluation is described

  16. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  17. Risk Analysis

    NASA Technical Reports Server (NTRS)

    Morring, Frank, Jr.

    2004-01-01

    A National Academies panel says the Hubble Space Telescope is too valuable for gambling on a long-shot robotic mission to extend its service life, and urges NASA to fly a shuttle servicing mission instead. Directly contradicting Administrator Sean O'Keefe, who killed a planned fifth shuttle servicing mission to the telescope on grounds it was too dangerous for a human crew in the post-Columbia environment, the expert committee found that upgrades to shuttle safety actually should make it less hazardous to fly to the telescope than it was before Columbia was lost. Risks of a telescope-servicing mission are only marginally greater than those of the planned missions to the International Space Station (ISS) O'Keefe has authorized, the panel found. After comparing those risks to the dangers inherent in trying to develop a complex space robot in the 39 months remaining in the Hubble's estimated service life, the panel opted for the human mission to save "one of the major achievements of the American space program," in the words of Louis J. Lanzerotti, its chairman.

  18. Simulation Assisted Risk Assessment Applied to Launch Vehicle Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Go, Susie; Gee, Ken; Lawrence, Scott

    2008-01-01

    A simulation-based risk assessment approach is presented and is applied to the analysis of abort during the ascent phase of a space exploration mission. The approach utilizes groupings of launch vehicle failures, referred to as failure bins, which are mapped to corresponding failure environments. Physical models are used to characterize the failure environments in terms of the risk due to blast overpressure, resulting debris field, and the thermal radiation due to a fireball. The resulting risk to the crew is dynamically modeled by combining the likelihood of each failure, the severity of the failure environments as a function of initiator and time of the failure, the robustness of the crew module, and the warning time available due to early detection. The approach is shown to support the launch vehicle design process by characterizing the risk drivers and identifying regions where failure detection would significantly reduce the risk to the crew.
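
    A minimal sketch of the aggregation described: crew risk summed over failure bins, combining the likelihood of each failure, the severity of its environment, and the chance that detection and abort fail to save the crew. All bin names and numbers are illustrative assumptions:

```python
# (likelihood of failure per flight,
#  P(crew loss | failure environment, no effective abort),
#  abort effectiveness given the available warning time)
failure_bins = [
    ("booster case breach", 1e-3, 0.90, 0.60),
    ("upper-stage engine",  3e-3, 0.40, 0.85),
    ("loss of control",     2e-3, 0.70, 0.75),
]

total = 0.0
for name, p_fail, p_loss, abort_eff in failure_bins:
    # Crew is lost if the failure occurs and the abort does not save them.
    contribution = p_fail * p_loss * (1.0 - abort_eff)
    total += contribution
    print(f"{name:20s} risk contribution: {contribution:.2e}")
print(f"total P(loss of crew): {total:.2e}")
```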

  19. Arctic Risk Management (ARMNet) Network: Linking Risk Management Practitioners and Researchers Across the Arctic Regions of Canada and Alaska To Improve Risk, Emergency and Disaster Preparedness and Mitigation Through Comparative Analysis and Applied Research

    NASA Astrophysics Data System (ADS)

    Garland, A.

    2015-12-01

    The Arctic Risk Management Network (ARMNet) was conceived as a trans-disciplinary hub to encourage and facilitate greater cooperation, communication and exchange among American and Canadian academics and practitioners actively engaged in the research, management and mitigation of risks, emergencies and disasters in the Arctic regions. Its aim is to assist regional decision-makers through the sharing of applied research and best practices and to support greater inter-operability and bilateral collaboration through improved networking, joint exercises, workshops, teleconferences, radio programs, and virtual communications (e.g., webinars). Most importantly, ARMNet is a clearinghouse for all information related to the management of the frequent hazards of Arctic climate and geography in North America, including new and emerging challenges arising from climate change, increased maritime polar traffic and expanding economic development in the region. ARMNet is an outcome of the Arctic Observing Network (AON) for Long Term Observations, Governance, and Management Discussions, www.arcus.org/search-program. The AON goals continue with CRIOS (www.ariesnonprofit.com/ARIESprojects.php) and coastal erosion research (www.ariesnonprofit.com/webinarCoastalErosion.php) led by the North Slope Borough Risk Management Office with assistance from ARIES (Applied Research in Environmental Sciences Nonprofit, Inc.). The constituency for ARMNet will include all northern academics and researchers, Arctic-based corporations, First Responders (FRs), Emergency Management Offices (EMOs) and Risk Management Offices (RMOs), military, Coast Guard, northern police forces, Search and Rescue (SAR) associations, boroughs, territories and communities throughout the Arctic. This presentation will be of interest to all those engaged in Arctic affairs; it describes the genesis of ARMNet and presents the results of stakeholder meetings and webinars designed to guide the next stages of the project.

  1. FOOD RISK ANALYSIS

    USDA-ARS?s Scientific Manuscript database

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  2. Applied climate science: the risk management route

    NASA Astrophysics Data System (ADS)

    Harrison, Stephan

    2010-05-01

    While several Masters programmes on climate science exist in the British Isles, until recently there has been little regard for the requirements of business, local government and the insurance market in the provision of postgraduate taught programmes. This talk will discuss the issue of climate change risk management as a way of embedding climate science into the decision-making protocols for commercial and governmental organisations. It will outline several issues for business decision-making that have implications for climate change. These include: climate sensitivity, model uncertainty and rapid climate change. The experience of developing and running the MSc in Climate Change and Risk management at Exeter University will be used to highlight these issues.

  3. Budget Risk & Prioritization Analysis Tool

    SciTech Connect

    Carlos Castillo, Jerel Nelson

    2010-12-31

    BRPAtool performs the following:
    • Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate, to meet constrained budgets, based on multiple risk factors
    • Enables analysis of different budget scenarios
    • Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks
    • Real-time analysis
    • Enables managers to determine the multipliers and where funding is best applied
    • Promotes solid budget defense

  4. Multi-hazard risk assessment applied to hydraulic fracturing operations

    NASA Astrophysics Data System (ADS)

    Garcia-Aristizabal, Alexander; Gasparini, Paolo; Russo, Raffaella; Capuano, Paolo

    2017-04-01

    Without exception, the exploitation of any energy resource produces impacts and intrinsically bears risks. Therefore, to make sound decisions about future energy resource exploitation, it is important to clearly understand the potential environmental impacts over the full life cycle of an energy development project, distinguishing between the specific impacts intrinsically related to exploiting a given energy resource and those shared with the exploitation of other energy resources. Technological advances such as directional drilling and hydraulic fracturing have led to a rapid expansion of unconventional resources (UR) exploration and exploitation; as a consequence, both public health and environmental concerns have risen. The main objective of a multi-hazard risk assessment applied to the development of UR is to assess the rate (or the likelihood) of occurrence of incidents and the relative potential impacts on the surrounding environment, considering different hazards and their interactions. Such analyses have to be performed considering the different stages of development of a project; however, the discussion in this paper is mainly focused on the analysis applied to the hydraulic fracturing stage of a UR development project. The multi-hazard risk assessment applied to the development of UR poses a number of challenges, making this a particularly complex problem. First, a number of external hazards might be considered as potential triggering mechanisms; such hazards can be either of natural origin or anthropogenic events caused by the same industrial activities. Second, failures might propagate through the industrial elements, leading to complex scenarios according to the layout of the industrial site. Third, there is a number of potential risk receptors, ranging from environmental elements (the air, soil, surface water, or groundwater) to local communities and ecosystems. The multi-hazard risk approach for this problem is set by considering multiple hazards

  5. SU-E-T-128: Applying Failure Modes and Effects Analysis to a Risk-Based Quality Management for Stereotactic Radiosurgery in Brazil

    SciTech Connect

    Teixeira, F; Almeida, C de; Huq, M

    2015-06-15

    Purpose: The goal of the present work was to evaluate the process maps for stereotactic radiosurgery (SRS) treatment at three radiotherapy centers in Brazil and to apply the FMEA technique to evaluate similarities and differences, if any, in the hazards and risks associated with these processes. Methods: A team consisting of professionals from different disciplines involved in SRS treatment was formed at each center. Each team was responsible for developing the process map and performing the FMEA and FTA. A facilitator knowledgeable in these techniques led the work at each center. The TG100 recommended scales were used for the evaluation of hazard and severity of each step of the major process "treatment planning". Results: The hazard index given by the Risk Priority Number (RPN) was found to range from 4 to 270 for the various processes, and the severity (S) index was found to range from 1 to 10. RPN values > 100 and severity values ≥ 7 were chosen to flag safety improvement interventions. The numbers of steps with RPN ≥ 100 were found to be 6, 59 and 45 for the three centers; the corresponding numbers of steps with S ≥ 7 were 24, 21 and 25, respectively. The ranges of RPN and S values for each center belong to different process steps and failure modes. Conclusion: These results show that the interventions needed to improve safety are different for each center and are associated with the skill level of the professional team as well as the technology used to provide radiosurgery treatment. The present study will very likely serve as a model for the implementation of a risk-based prospective quality management program for SRS treatment in Brazil, where currently 28 radiotherapy centers perform SRS. A complete FMEA for SRS at these three radiotherapy centers is currently under development.
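
    For readers unfamiliar with the FMEA arithmetic, the sketch below computes RPN = O x S x D on the TG100 1-10 scales and applies the RPN > 100 / S ≥ 7 flags used in the study; the failure modes and scores are hypothetical, not taken from the paper:

        # Hypothetical FMEA ranking using TG100-style scales (O, S, D each 1-10).
        # RPN = occurrence x severity x detectability; failure modes are invented.

        failure_modes = [
            # (process step, failure mode, O, S, D)
            ("Treatment planning", "Wrong CT dataset imported", 3,  9, 4),
            ("Treatment planning", "Dose grid too coarse",      5,  6, 6),
            ("Treatment planning", "Prescription mistyped",     2, 10, 5),
        ]

        for step, mode, o, s, d in failure_modes:
            rpn = o * s * d
            flagged = rpn > 100 or s >= 7   # thresholds chosen by the centers
            print(f"{step:20s} {mode:28s} RPN={rpn:3d} S={s:2d} "
                  f"flag={'YES' if flagged else 'no'}")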

  6. Evaluation of Cardiovascular Risk Scores Applied to NASA's Astronaut Corps

    NASA Technical Reports Server (NTRS)

    Jain, I.; Charvat, J. M.; VanBaalen, M.; Lee, L.; Wear, M. L.

    2014-01-01

    In an effort to improve cardiovascular disease (CVD) risk prediction, this analysis evaluates and compares the applicability of multiple CVD risk scores to the NASA Astronaut Corps, which is extremely healthy at selection.

  7. Risk analysis highly valued.

    PubMed

    Gammelsaeter, Håkon; Ramstad, Jens Eirik; Røv, Ann Solberg; Walseth, Frode; Paulsen, Anne Margrethe

    2003-11-01

    It is felt that risk and vulnerability analysis is an excellent means of assessing and communicating risk and inconvenience related to extensive construction activities. The main reasons for this are: it uncovers the risks and inconveniences involved; risk-reducing and alert measures are identified; preventive action and emergency plans are implemented; it is easy to learn; it is unbureaucratic; it promotes cross-professional communication; and it distributes correct information very effectively.

  8. Applying a weed risk assessment approach to GM crops.

    PubMed

    Keese, Paul K; Robold, Andrea V; Myers, Ruth C; Weisman, Sarah; Smith, Joe

    2014-12-01

    Current approaches to environmental risk assessment of genetically modified (GM) plants are modelled on chemical risk assessment methods, which have a strong focus on toxicity. There are additional types of harms posed by plants that have been extensively studied by weed scientists and incorporated into weed risk assessment methods. Weed risk assessment uses robust, validated methods that are widely applied to regulatory decision-making about potentially problematic plants. They are designed to encompass a broad variety of plant forms and traits in different environments, and can provide reliable conclusions even with limited data. The knowledge and experience that underpin weed risk assessment can be harnessed for environmental risk assessment of GM plants. A case study illustrates the application of the Australian post-border weed risk assessment approach to a representative GM plant. This approach is a valuable tool to identify potential risks from GM plants.

  9. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  10. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  11. The basic importance of applied behavior analysis

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1986-01-01

    We argue that applied behavior analysis is relevant to basic research. Modification studies, and a broad range of investigations that focus on the precipitating and maintaining conditions of socially significant human behavior, have basic importance. Applied behavior analysis may aid basic researchers in the design of externally valid experiments and thereby enhance the theoretical significance of basic research for understanding human behavior. Applied research with humans, directed at culturally-important problems, will help to propagate the science of human behavior. Such a science will also be furthered by analogue experiments that model socially important behavior. Analytical-applied studies and analogue experiments are forms of applied behavior analysis that could suggest new environment-behavior relationships. These relationships could lead to basic research and principles that further the prediction, control, and understanding of behavior. PMID:22478650

  12. Is risk analysis scientific?

    PubMed

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.

  13. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  14. Applied Behavior Analysis in Flying Training Research.

    DTIC Science & Technology

    1980-01-01

    …often referred to as behavior modification) which promotes improvements in human learning through an analysis of the contingencies surrounding a… Company, in press. Bandura, A. Principles of behavior modification. New York: Holt, Rinehart & Winston, 1969. Bostow, D.E., & Bailey, J.S. Modification of… tutors for kindergarten children. Journal of Applied Behavior Analysis, 1974, 7, 223-232. Kazdin, A.E. Behavior modification in applied settings.

  15. How to ensure that the results of climate risk analysis make a difference? - Experience from applied research addressing the challenges of climate change

    NASA Astrophysics Data System (ADS)

    Schneiderbauer, Stefan; Zebisch, Marc; Becker, Daniel; Pedoth, Lydia; Renner, Kathrin; Kienberger, Stefan

    2016-04-01

    Changing climate conditions may have beneficial or adverse effects on the social-ecological systems we live in. In any case, the possible effects result from complex and interlinked physical and social processes embedded in these systems. Traditional research addresses these bio-physical and societal issues separately. Therefore, in general, studies on risks related to climate change are still mono-disciplinary in nature, with an increasing amount of work following a multi-disciplinary approach. The quality and usefulness of the results of such research for policy or decision making in practice may further be limited by study designs that do not appropriately acknowledge the significance of integrating, or at least mixing, qualitative and quantitative information and knowledge. Finally, the acceptance of study results - particularly when they contain some kind of assessment - is often endangered by insufficient and/or late involvement of stakeholders and users. The above limitations have often been brought up in the recent past. However, although a certain consensus has been reached in recent years on the need to tackle these issues, little progress has been made in terms of implementation within the context of (research) studies. This paper elaborates in detail on the reasons that hamper the application of interdisciplinary (i.e., natural and social science), trans-disciplinary (i.e., co-production of knowledge) and integrative (i.e., combining qualitative and quantitative approaches) work. It is based on the experience gained through a number of applied climate change vulnerability studies carried out within the context of various GIZ-financed development cooperation projects, a consultancy project for the German Environment Agency, as well as the workshop series INQUIMUS, which particularly tackles the issues of mixing qualitative and quantitative research approaches. Potentials and constraints of possible attempts for

  16. Applying evolutionary genetics to developmental toxicology and risk assessment.

    PubMed

    Leung, Maxwell C K; Procter, Andrew C; Goldstone, Jared V; Foox, Jonathan; DeSalle, Robert; Mattingly, Carolyn J; Siddall, Mark E; Timme-Laragy, Alicia R

    2017-03-04

    Evolutionary thinking continues to challenge our views on health and disease. Yet, there is a communication gap between evolutionary biologists and toxicologists in recognizing the connections among developmental pathways, high-throughput screening, and birth defects in humans. To increase our capability of identifying potential developmental toxicants in humans, we propose to apply evolutionary genetics to improve experimental design and data interpretation with various in vitro and whole-organism models. We review five molecular systems of stress response and update 18 consensual cell-cell signaling pathways that are hallmarks of early development, organogenesis, and differentiation; and we revisit the principles of teratology in light of recent advances in high-throughput screening, big data techniques, and systems toxicology. Multiscale systems modeling plays an integral role in the evolutionary approach to cross-species extrapolation. Phylogenetic analysis and comparative bioinformatics are both valuable tools in identifying and validating the molecular initiating events that account for adverse developmental outcomes in humans. The discordance of susceptibility between test species and humans (ontogeny) reflects their differences in evolutionary history (phylogeny). This synthesis not only can lead to novel applications in developmental toxicity and risk assessment, but also can pave the way for applying an evo-devo perspective to the study of developmental origins of health and disease.

  17. The Andrews’ Principles of Risk, Need, and Responsivity as Applied in Drug Abuse Treatment Programs: Meta-Analysis of Crime and Drug Use Outcomes

    PubMed Central

    Prendergast, Michael L.; Pearson, Frank S.; Podus, Deborah; Hamilton, Zachary K.; Greenwell, Lisa

    2013-01-01

    Objectives The purpose of the present meta-analysis was to answer the question: Can the Andrews principles of risk, needs, and responsivity, originally developed for programs that treat offenders, be extended to programs that treat drug abusers? Methods Drawing from a dataset that included 243 independent comparisons, we conducted random-effects meta-regression and ANOVA-analog meta-analyses to test the Andrews principles by averaging crime and drug use outcomes over a diverse set of programs for drug abuse problems. Results For crime outcomes, in the meta-regressions the point estimates for each of the principles were substantial, consistent with previous studies of the Andrews principles. There was also a substantial point estimate for programs exhibiting a greater number of the principles. However, almost all of the 95% confidence intervals included the zero point. For drug use outcomes, in the meta-regressions the point estimates for each of the principles were approximately zero; however, the point estimate for programs exhibiting a greater number of the principles was somewhat positive. All of the estimates for the drug use principles had confidence intervals that included the zero point. Conclusions This study supports previous findings from primary research studies targeting the Andrews principles that those principles are effective in reducing crime outcomes, here in meta-analytic research focused on drug treatment programs. By contrast, programs that follow the principles appear to have very little effect on drug use outcomes. Primary research studies that experimentally test the Andrews principles in drug treatment programs are recommended. PMID:24058325
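
    As background, a minimal sketch of the random-effects pooling that underlies meta-analyses of this kind (the DerSimonian-Laird estimator); the effect sizes and variances below are invented, not the study's 243 comparisons:

        import numpy as np

        # DerSimonian-Laird random-effects pooling, a standard building block of
        # meta-analysis. Effect sizes (y) and within-study variances (v) invented.
        y = np.array([0.30, 0.12, 0.45, 0.05, 0.22])   # e.g., crime-outcome effects
        v = np.array([0.02, 0.03, 0.05, 0.01, 0.04])   # within-study variances

        w = 1.0 / v                                    # fixed-effect weights
        y_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - y_fixed) ** 2)             # Cochran's Q heterogeneity
        k = len(y)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

        w_re = 1.0 / (v + tau2)                        # random-effects weights
        y_re = np.sum(w_re * y) / np.sum(w_re)
        se_re = np.sqrt(1.0 / np.sum(w_re))
        print(f"pooled effect = {y_re:.3f}, "
              f"95% CI = ({y_re - 1.96*se_re:.3f}, {y_re + 1.96*se_re:.3f})")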

  18. Translational benchmark risk analysis

    PubMed Central

    Piegorsch, Walter W.

    2010-01-01

    Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283

  19. Caldwell University's Department of Applied Behavior Analysis.

    PubMed

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis.

  1. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.
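
    All of the techniques the handbook covers rest on encoding subjective expert estimates as probability distributions. A minimal sketch of one common encoding, a triangular (low, most likely, high) elicitation propagated by Monte Carlo; the cost figures and threshold are invented:

        import numpy as np

        rng = np.random.default_rng(1987)

        # Encode three expert elicitations as triangular (low, mode, high)
        # distributions, one of the common distribution types such handbooks
        # cover. All cost figures are invented.
        design = rng.triangular(2.0, 3.0, 6.0, size=100_000)   # $M
        test   = rng.triangular(1.0, 1.5, 4.0, size=100_000)
        integr = rng.triangular(0.5, 1.0, 2.5, size=100_000)

        total = design + test + integr
        print(f"mean total cost = {total.mean():.2f} $M")
        print(f"P(total > 8 $M) = {(total > 8.0).mean():.3f}")  # program-risk figure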

  2. Risk analysis and meat hygiene.

    PubMed

    Hathaway, S C

    1993-12-01

    Meat hygiene consists of three major activities: post-mortem inspection; monitoring and surveillance for chemical hazards; and maintenance of good hygienic practice throughout all stages between slaughter and consumption of meat. Risk analysis is an applied science of increasing importance to these activities in the following areas: facilitating the distribution of pre-harvest, harvest and post-harvest inspection resources, proportional to the likelihood of public health and animal health hazards; establishing internationally-harmonized standards and specifications which are consistent and science-based; and improving the safety and wholesomeness of meat and meat products in local and international trade. Risk analysis, in one form or another, is well developed with respect to establishing standards and specifications for chemical hazards; methods for risk analysis of post-mortem meat inspection programmes are beginning to emerge. However, risk analysis of microbiological hazards in meat and meat products presents particular difficulties. All areas of application currently suffer from a lack of international agreement on risk assessment and risk management methodology.

  3. Defining applied behavior analysis: An historical analogy

    PubMed Central

    Deitz, Samuel M.

    1982-01-01

    This article examines two criteria for a definition of applied behavior analysis. The criteria are derived from a 19th century attempt to establish medicine as a scientific field. The first criterion, experimental determinism, specifies the methodological boundaries of an experimental science. The second criterion, philosophic doubt, clarifies the tentative nature of facts and theories derived from those facts. Practices which will advance the science of behavior are commented upon within each criterion. To conclude, the problems of a 19th century form of empiricism in medicine are related to current practices in applied behavior analysis. PMID:22478557

  4. Applied Behavior Analysis as Technological Science.

    ERIC Educational Resources Information Center

    Iwata, Brian A.

    1991-01-01

    To the extent that applied behavior analysis represents a scientific and practical approach to the study of behavior, its technological character is essential. The most serious problem evident in the field is not that the research being done is too technical but that more good research of all types is needed. (JDD)

  5. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    ERIC Educational Resources Information Center

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  6. Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  7. Competing Uses of Underground Systems Related to Energy Supply: Applying Single- and Multiphase Simulations for Site Characterization and Risk-Analysis

    NASA Astrophysics Data System (ADS)

    Kissinger, A.; Walter, L.; Darcis, M.; Flemisch, B.; Class, H.

    2012-04-01

    Global climate change, shortage of resources and the resulting turn towards renewable sources of energy lead to a growing demand for the utilization of subsurface systems. Among these competing uses are Carbon Capture and Storage (CCS), geothermal energy, nuclear waste disposal, "renewable" methane or hydrogen storage, as well as the ongoing production of fossil resources like oil, gas, and coal. Besides competing among themselves, these technologies may also create conflicts with essential public interests like water supply. For example, the injection of CO2 into the underground causes an increase in pressure reaching far beyond the actual radius of influence of the CO2 plume, potentially leading to large amounts of displaced salt water. Finding suitable sites is a demanding task for several reasons. Natural systems, as opposed to technical systems, are always characterized by heterogeneity. Therefore, parameter uncertainty impedes reliable predictions of the capacity and safety of a site. State-of-the-art numerical simulations combined with stochastic approaches need to be used to obtain a more reliable assessment of the involved risks and the radii of influence of the different processes. These simulations may include the modeling of single- and multiphase non-isothermal flow and geo-chemical and geo-mechanical processes in order to describe all relevant physical processes adequately. Stochastic approaches aim to estimate a bandwidth for the key output parameters based on uncertain input parameters. Risks for the different underground uses can then be made comparable with each other. Together with the importance and urgency of the competing processes, this may provide a more profound basis for a decision. Communicating risks to stakeholders and a concerned public is crucial for the success of finding a suitable site for CCS (or other subsurface utilization). We present and discuss first steps towards an approach for addressing the issue of competitive

  8. Risks, scientific uncertainty and the approach of applying precautionary principle.

    PubMed

    Lo, Chang-fa

    2009-03-01

    The paper intends to clarify the nature and aspects of risks and scientific uncertainty and to elaborate an approach to applying the precautionary principle for the purpose of handling risk arising from scientific uncertainty. It explains the relations between risks and the application of the precautionary principle at the international and domestic levels. Both in situations where an international treaty has admitted the precautionary principle and in situations where there is no international treaty admitting the principle or enumerating the conditions for taking measures, the precautionary principle has a role to play. The paper proposes a decision-making tool, containing questions to be asked, to help policymakers apply the principle. It also proposes a "weighing and balancing" procedure to help them decide the contents of the measure to cope with the potential risk and to avoid excessive measures.

  9. DWPF risk analysis summary

    SciTech Connect

    Shedrow, C.B.

    1990-10-01

    This document contains selected risk analysis data from Chapter 9 (Safety Analysis) of the Defense Waste Processing Facility Safety Analysis Report (DWPF SAR) and draft Addendum 1 to the Waste Tank Farms SAR. Although these data may be revised prior to finalization of the draft SAR and the draft addendum, they are presently the best available information and were therefore used in preparing the risk analysis portion of the DWPF Environmental Analysis (DWPF EA). This information has been extracted from those draft documents and approved under separate cover so that it can be used as reference material for the DWPF EA when it is placed in the public reading rooms. 9 refs., 4 tabs.

  10. Quantitative Microbial Risk Assessment Tutorial: Pour Point Analysis of Land-applied Microbial Loadings and Comparison of Simulated and Gaging Station Results

    EPA Science Inventory

    This tutorial demonstrates a pour point analysis:
    • Initiates execution of the SDMPB.
    • Navigates the SDMPB.
    • Chooses a pour point within a watershed, delineates the sub-area that contributes to that pour point, and collects data for it.
    • Considers land applicat...

  11. Assessment of cardiovascular risk in Tunisia: applying the Framingham risk score to national survey data

    PubMed Central

    Saidi, O; Malouche, D; O'Flaherty, M; Ben Mansour, N; A Skhiri, H; Ben Romdhane, H; Bezdah, L

    2016-01-01

    Objective This paper aims to assess the socioeconomic determinants of a high 10-year cardiovascular risk in Tunisia. Setting We used a national population-based cross-sectional survey conducted in 2005 in Tunisia comprising 7780 subjects. We applied the non-laboratory version of the Framingham equation to estimate the 10-year cardiovascular risk. Participants 8007 participants, aged 35–74 years, were included in the sample, but exclusion of individuals with cardiovascular diseases and cancer resulted in 7780 subjects (3326 men and 4454 women) included in the analysis. Results Mean age was 48.7 years. Women accounted for 50.5% of participants. According to the Framingham equation, 18.1% (17.25–18.9%) of the study population had a high risk (≥20% within 10 years). The gender difference was striking and statistically significant: 27.2% (25.7–28.7%) of men had a high risk, threefold higher than women (9.7%; 8.8–10.5%). A higher 10-year global cardiovascular risk was associated with social disadvantage in men and women; thus illiterate and divorced individuals, and adults without a professional activity, had a significantly higher risk of developing a cardiovascular event within 10 years. Illiterate men were at higher risk than those with secondary and higher education (OR=7.01; 5.49 to 9.14). The risk in illiterate women was even more elevated (OR=13.57; 7.58 to 24.31). Those living in an urban area had a higher risk (OR=1.45 (1.19 to 1.76) in men and OR=1.71 (1.35 to 2.18) in women). Conclusions The 10-year global cardiovascular risk in the Tunisian population is already substantially high, affecting almost a third of men and 1 in 10 women, and is concentrated among the more socially disadvantaged. PMID:27903556
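
    The non-laboratory Framingham score has the Cox-model form risk = 1 - S0^exp(L - Lbar), with L a linear combination of log-transformed predictors. The sketch below shows only that structure; every coefficient is a placeholder, not the published sex-specific Framingham values:

        import math

        # Structure of a non-laboratory Framingham-type score:
        # risk = 1 - S0 ** exp(L - L_MEAN), with L a linear combination of
        # ln-transformed predictors. ALL numbers below are placeholders,
        # NOT the published Framingham coefficients.
        BETA = {"ln_age": 2.7, "ln_bmi": 0.5, "ln_sbp": 1.9,
                "smoker": 0.6, "diabetes": 0.9}
        S0, L_MEAN = 0.88, 23.0   # placeholder baseline survival and mean predictor

        def ten_year_risk(age, bmi, sbp, smoker, diabetes):
            L = (BETA["ln_age"] * math.log(age)
                 + BETA["ln_bmi"] * math.log(bmi)
                 + BETA["ln_sbp"] * math.log(sbp)
                 + BETA["smoker"] * smoker
                 + BETA["diabetes"] * diabetes)
            return 1.0 - S0 ** math.exp(L - L_MEAN)

        # A hypothetical 52-year-old smoker; the paper's high-risk cutoff is 20%.
        r = ten_year_risk(age=52, bmi=29.0, sbp=145, smoker=1, diabetes=0)
        print(f"10-year CVD risk = {r:.1%}, high risk: {r >= 0.20}")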

  12. Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.

    PubMed

    Reyes Santos, Joost; Haimes, Yacov Y

    2004-06-01

    The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz a 1992 Nobel Prize in Economics. A typical approach in measuring a portfolio's expected return is based on the historical returns of the assets included in a portfolio. On the other hand, portfolio risk is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility is a major measure of risk owing to its simplicity and validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that on October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attack of September 11, 2001 that led to a four-day suspension of trading on the New York Stock Exchange (NYSE) are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of an extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of the possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm. Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model.
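
    A minimal sketch of the extreme-risk statistic described above: f(4) computed from simulated returns as the conditional expectation over a lower-tail partition. The 5% partition point and the return distribution are illustrative assumptions, not the article's choices:

        import numpy as np

        rng = np.random.default_rng(42)
        # Simulated daily portfolio returns with fat tails (Student-t), invented.
        returns = 0.0004 + 0.01 * rng.standard_t(df=4, size=250_000)

        alpha = 0.05                             # assumed lower-tail partition
        cutoff = np.quantile(returns, alpha)     # partition point on the return axis
        f4 = returns[returns <= cutoff].mean()   # conditional expectation of the tail

        print(f"expected return       = {returns.mean():+.5f}")
        print(f"volatility (std dev)  = {returns.std():.5f}")
        print(f"f4 (mean of worst 5%) = {f4:+.5f}   # the extreme-risk objective")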

  13. Total Risk Approach in Applying PRA to Criticality Safety

    SciTech Connect

    Huang, S T

    2005-03-24

    As the nuclear industry continues to move from expert-based support to more procedure-based support, it is important to revisit the total risk concept in criticality safety. A key objective of criticality safety is to minimize total criticality accident risk. The purpose of this paper is to assess the key constituents of the total risk concept pertaining to criticality safety from an operations support perspective and to suggest a risk-informed means of utilizing criticality safety resources to minimize total risk. A PRA methodology was used to assist this assessment. The criticality accident history was assessed to provide a framework for our evaluation. In supporting operations, the work of criticality safety engineers ranges from knowing the scope and configurations of a proposed operation, performing criticality hazards assessments to derive effective controls, assisting in training operators, responding to floor questions, and conducting surveillance to ensure implementation of criticality controls, to responding to criticality mishaps. In a compliance environment, the resources of criticality safety engineers are increasingly being directed towards tedious documentation effort to meet regulatory requirements, to the effect of weakening floor support for criticality safety. By applying a fault tree model to identify the major contributors to criticality accidents, a total risk picture is obtained that addresses the relative merits of various actions. Overall, human failure is the key culprit in causing criticality accidents. Factors such as failure to follow procedures, lack of training, and lack of expert support at the floor level are the main contributors. Other causes may include lack of effective criticality controls, such as inadequate criticality safety evaluation. Not all of the causes are equally important in contributing to criticality mishaps. Applying the limited resources to strengthen the weak links would reduce risk more than continuing emphasis on the strong links of
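
    To make the fault-tree logic concrete, a toy top-event calculation with an invented event structure and assumed basic-event probabilities (not the paper's actual tree):

        # Toy fault tree for a criticality mishap; structure and numbers invented.
        # AND gate: independent events must all occur; OR gate: at least one occurs.

        def AND(*p):
            out = 1.0
            for x in p:
                out *= x
            return out

        def OR(*p):
            out = 1.0
            for x in p:
                out *= (1.0 - x)
            return 1.0 - out

        p_procedure_violation = 1e-2   # failure to follow procedures (per demand)
        p_training_gap        = 5e-3   # operator lacks adequate training
        p_bad_cse             = 1e-3   # inadequate criticality safety evaluation
        p_control_failure     = 2e-3   # engineered criticality control fails

        human_error = OR(p_procedure_violation, p_training_gap)
        top = AND(OR(human_error, p_bad_cse), p_control_failure)
        print(f"P(human error path) = {human_error:.2e}")
        print(f"P(top event)        = {top:.2e}")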

  14. Proceedings of the 2006 Toxicology and Risk Assessment Conference: Applying Mode of Action in Risk Assessment

    DTIC Science & Technology

    2006-07-01

    …Ecological Risk Assessment; Benchmark Dose Methods; Fundamentals of Risk Assessment; and Multi-Criteria Decision Analysis Tools for Managing Complex… Tomato; I Say Lampshade - Different Methods and Mechanisms for Health Risk Assessment." The program, which is included in this report, gives an… cutting-edge chemical mixture health risk assessment issues, explanation of state-of-the-art methods, and hands-on exercises for several

  15. Wavelet analysis applied to the IRAS cirrus

    NASA Technical Reports Server (NTRS)

    Langer, William D.; Wilson, Robert W.; Anderson, Charles H.

    1994-01-01

    The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.
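
    A minimal sketch of a Laplacian-pyramid decomposition of the kind described, built here by Gaussian smoothing and octave downsampling on a synthetic map; this is a generic illustration, not the authors' code:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def laplacian_pyramid(image, levels=4, sigma=1.0):
            """Decompose an image into band-pass levels plus a low-pass residual."""
            pyramid, current = [], image.astype(float)
            for _ in range(levels):
                smoothed = gaussian_filter(current, sigma)
                pyramid.append(current - smoothed)   # band-pass detail at this scale
                current = smoothed[::2, ::2]         # downsample for the next octave
            pyramid.append(current)                  # coarse low-pass residual
            return pyramid

        # Synthetic stand-in for a 100-micrometer emission map.
        rng = np.random.default_rng(0)
        cirrus = (gaussian_filter(rng.normal(size=(256, 256)), 4)
                  + 0.3 * rng.normal(size=(256, 256)))
        for i, band in enumerate(laplacian_pyramid(cirrus)):
            print(f"level {i}: shape={band.shape}, power={np.var(band):.4f}")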

  16. Positive Behavior Support and Applied Behavior Analysis

    PubMed Central

    Johnston, J.M; Foxx, Richard M; Jacobson, John W; Green, Gina; Mulick, James A

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We also consider the features of PBS that have facilitated its broad dissemination and how ABA might benefit from emulating certain practices of the PBS movement. PMID:22478452

  17. Sneak analysis applied to process systems

    NASA Astrophysics Data System (ADS)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  18. Applying RESRAD-CHEM for chemical risk assessment

    SciTech Connect

    Cheng, J.J.; Yu, C.

    1995-07-01

    RESRAD-CHEM is a multiple-pathway analysis computer code to evaluate chemically contaminated sites; it was developed at Argonne National Laboratory for the US Department of Energy. The code is designed to predict human health risks from exposure to hazardous chemicals and to derive cleanup criteria for chemically contaminated soils. It consists of environmental fate and transport models and is capable of predicting chemical concentrations over time in different environmental media. The methodology used in RESRAD-CHEM for exposure assessment and risk characterization follows the US Environmental Protection Agency's guidance on Human Health Evaluation for Superfund. A user-friendly interface is incorporated for entering data, operating the code, and displaying results. RESRAD-CHEM is easy to use and is a powerful tool to assess chemical risk from environmental exposure.

  1. Applied spectrophotometry: analysis of a biochemical mixture.

    PubMed

    Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining the biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is, however, a significant difference between determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) and determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience, and also to fill a hole that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three-week laboratory experience designed so that students learn to: connect laboratory practice with theory, apply the Beer-Lambert-Bouguer Law to biochemical analyses, demonstrate the utility and limitations of example quantitative colorimetric assays, demonstrate the utility and limitations of UV analyses for biomolecules, develop strategies for analysis of a solution of unknown biomolecular composition, use digital micropipettors to make accurate and precise measurements, and apply graphing software. Copyright © 2013 Wiley Periodicals, Inc.
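
    The mixture analysis rests on the additivity of absorbances: at each wavelength, A = sum over components of eps_i * l * c_i, so measuring at as many wavelengths as components yields a solvable linear system. A sketch with invented molar absorptivities and absorbance readings:

        import numpy as np

        # Two-component mixture: A(lambda) = eps_p * l * c_p + eps_d * l * c_d.
        # Measuring at two wavelengths gives two equations in two unknowns.
        # Molar absorptivity values here are invented placeholders.
        l = 1.0                          # path length, cm
        E = np.array([[ 5.0, 20.0],      # eps at 260 nm: [protein, DNA] (mM^-1 cm^-1)
                      [10.0,  8.0]])     # eps at 280 nm
        A = np.array([0.90, 0.62])       # measured absorbances at 260 and 280 nm

        c = np.linalg.solve(E * l, A)    # concentrations in mM
        print(f"[protein] = {c[0]:.3f} mM, [DNA] = {c[1]:.3f} mM")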

  2. Security risk assessment: applying the concepts of fuzzy logic.

    PubMed

    Bajpai, Shailendra; Sachdeva, Anish; Gupta, J P

    2010-01-15

    Chemical process industries (CPI) handling hazardous chemicals in bulk can be attractive targets for deliberate adversarial actions by terrorists, criminals and disgruntled employees. It is therefore imperative to have a comprehensive security risk management programme, including effective security risk assessment techniques. In earlier work, it has been shown that security risk assessment can be done by conducting threat and vulnerability analysis or by developing a Security Risk Factor Table (SRFT). HAZOP-type vulnerability assessment sheets can be developed that are scenario based. In the SRFT model, important security-risk-bearing factors such as location, ownership, visibility, inventory, etc., have been used. In this paper, the earlier developed SRFT model is modified using the concepts of fuzzy logic. In the modified SRFT model, two linguistic fuzzy scales (three-point and four-point) are devised based on trapezoidal fuzzy numbers. The human subjectivity of the different experts associated with the previous SRFT model is tackled by mapping their scores to the newly devised fuzzy scale. Finally, the fuzzy score thus obtained is defuzzified to get the results. A test case of a refinery is used to explain the method, and the results are compared with the earlier work.
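
    A sketch of the fuzzy machinery described above: linguistic ratings mapped to trapezoidal fuzzy numbers and defuzzified by a centroid rule. The scale anchors and the averaging of expert scores are assumptions for illustration, not the paper's exact values:

        import numpy as np

        # Trapezoidal fuzzy number (a, b, c, d): membership rises a->b, is 1 on
        # b->c, falls c->d. This three-point linguistic scale is an assumed example.
        SCALE = {
            "low":    (0.0, 0.1, 0.2, 0.4),
            "medium": (0.3, 0.45, 0.55, 0.7),
            "high":   (0.6, 0.8, 0.95, 1.0),
        }

        def membership(x, trap):
            a, b, c, d = trap
            return np.select(
                [x < a, x < b, x <= c, x <= d],
                [0.0, (x - a) / (b - a), 1.0, (d - x) / (d - c)],
                default=0.0,
            )

        def centroid_defuzzify(trap, n=2001):
            x = np.linspace(0.0, 1.0, n)
            mu = membership(x, trap)
            return float((mu * x).sum() / mu.sum())   # discrete centroid

        # Three experts rate one risk factor (e.g., visibility) on the fuzzy scale.
        ratings = ["high", "medium", "high"]
        score = np.mean([centroid_defuzzify(SCALE[r]) for r in ratings])
        print(f"defuzzified risk factor score = {score:.3f}")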

  3. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  4. Tissue Microarray Analysis Applied to Bone Diagenesis

    PubMed Central

    Mello, Rafael Barrios; Silva, Maria Regina Regis; Alves, Maria Teresa Seixas; Evison, Martin Paul; Guimarães, Marco Aurelio; Francisco, Rafaella Arrabaca; Astolphi, Rafael Dias; Iwamura, Edna Sadayo Miazato

    2017-01-01

    Taphonomic processes affecting bone post mortem are important in forensic, archaeological and palaeontological investigations. In this study, the application of tissue microarray (TMA) analysis to a sample of femoral bone specimens from 20 exhumed individuals of known period of burial and age at death is described. TMA allows multiplexing of subsamples, permitting standardized comparative analysis of adjacent sections in 3-D and of representative cross-sections of a large number of specimens. Standard hematoxylin and eosin, periodic acid-Schiff and silver methenamine, and picrosirius red staining, and CD31 and CD34 immunohistochemistry were applied to TMA sections. Osteocyte and osteocyte lacuna counts, percent bone matrix loss, and fungal spheroid element counts could be measured and collagen fibre bundles observed in all specimens. Decalcification with 7% nitric acid proceeded more rapidly than with 0.5 M EDTA and may offer better preservation of histological and cellular structure. No endothelial cells could be detected using CD31 and CD34 immunohistochemistry. Correlation between osteocytes per lacuna and age at death may reflect reported age-related responses to microdamage. Methodological limitations and caveats, and results of the TMA analysis of post mortem diagenesis in bone are discussed, and implications for DNA survival and recovery considered. PMID:28051148

  5. Reliability analysis applied to structural tests

    NASA Technical Reports Server (NTRS)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.

  6. On differentiation in applied behavior analysis

    PubMed Central

    Fawcett, Stephen B.

    1985-01-01

    Distinct types of activity in the field of applied behavior analysis are noted and discussed. Four metaphorical types of activity are considered: prospecting, farming, building, and guiding. Prospecting consists of time-limited exploration of a variety of behaviors, populations, or settings. Farming consists of producing new behaviors in the same setting using independent variables provided by the researchers or normally available in the setting. Building consists of combining procedural elements to create new programs or systems or to rehabilitate aspects of existing programs. Guiding involves pointing out connections between the principles of human behavior and the problems, populations, settings, and procedures with which researchers are (or could be) working. Advantages of each sphere are noted, and benefits of this division of labor to the field as a whole are discussed. PMID:22478631

  7. Applying ecological risk principles to watershed assessment and management.

    PubMed

    Serveiss, Victor B

    2002-02-01

    Considerable progress in addressing point source (end of pipe) pollution problems has been made, but it is now recognized that further substantial environmental improvements depend on controlling nonpoint source pollution. A watershed approach is being used more frequently to address these problems because traditional regulatory approaches do not focus on nonpoint sources. The watershed approach is organized around the guiding principles of partnerships, geographic focus, and management based on sound science and data. This helps to focus efforts on the highest priority problems within hydrologically-defined geographic areas. Ecological risk assessment is a process to collect, organize, analyze, and present scientific information to improve decision making. The U.S. Environmental Protection Agency (EPA) sponsored three watershed assessments and found that integrating the watershed approach with ecological risk assessment increases the use of environmental monitoring and assessment data in decision making. This paper describes the basics of the watershed approach, the ecological risk assessment process, and how these two frameworks can be integrated. The three major principles of watershed ecological risk assessment found to be most useful for increasing the use of science in decision making are (1) using assessment endpoints and conceptual models, (2) holding regular interactions between scientists and managers, and (3) developing a focus for multiple stressor analysis. Examples are provided illustrating how these principles were implemented in these assessments.

  8. Reachability Analysis Applied to Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Holzinger, M.; Scheeres, D.

    Several existing and emerging applications of Space Situational Awareness (SSA) relate directly to spacecraft Rendezvous, Proximity Operations, and Docking (RPOD) and Formation / Cluster Flight (FCF). When multiple Resident Space Objects (RSOs) are in the vicinity of one another with appreciable periods between observations, correlating new RSO tracks to previously known objects becomes a non-trivial problem. A particularly difficult sub-problem is seen when long breaks in observations are coupled with continuous, low-thrust maneuvers. Reachability theory, directly related to optimal control theory, can compute contiguous reachability sets for known or estimated control authority and can support such RSO search and correlation efforts in both ground and on-board settings. Reachability analysis can also directly estimate the minimum control authority of a given RSO. For RPOD and FCF applications, emerging mission concepts such as fractionation drastically increase the system complexity of on-board autonomous fault management systems. Reachability theory, as applied to SSA in RPOD and FCF applications, can involve correlation of nearby RSO observations, control authority estimation, and sensor track re-acquisition. Additional uses of reachability analysis are formation reconfiguration, worst-case passive safety, and propulsion failure modes such as a "stuck" thruster. Existing reachability theory is applied to RPOD and FCF regimes. An optimal control policy is developed to maximize the reachability set, and optimal control law discontinuities (switching) are examined. The Clohessy-Wiltshire linearized equations of motion are normalized to accentuate relative control authority for spacecraft propulsion systems at both Low Earth Orbit (LEO) and Geostationary Earth Orbit (GEO). Several examples with traditional and low-thrust propulsion systems in LEO and GEO are explored to illustrate the effects of relative control authority on the time-varying reachability set surface. Both

  9. Preference Functions for Spatial Risk Analysis.

    PubMed

    Keller, L Robin; Simon, Jay

    2017-09-07

    When outcomes are defined over a geographic region, measures of spatial risk regarding these outcomes can be more complex than traditional measures of risk. One of the main challenges is the need for a cardinal preference function that incorporates the spatial nature of the outcomes. We explore preference conditions that will yield the existence of spatial measurable value and utility functions, and discuss their application to spatial risk analysis. We also present a simple example on household freshwater usage across regions to demonstrate how such functions can be assessed and applied. © 2017 Society for Risk Analysis.

  10. Risk Analysis Virtual ENvironment

    SciTech Connect

    2014-02-10

    RAVEN has three major functionalities:
    1. Provides a Graphical User Interface for the pre- and post-processing of the RELAP-7 input and output.
    2. Provides the capability to model nuclear power plant control logic for the RELAP-7 code and dynamic control of the accident scenario evolution. This capability is based on a software structure that realizes a direct connection between the RELAP-7 solver engine (MOOSE) and a Python environment where the variables describing the plant status are accessible in a scripting environment. RAVEN supports the generation of probabilistic scenario control by supplying a wide range of probability and cumulative distribution functions and their inverse functions.
    3. Provides a general environment to perform probabilistic risk analysis for RELAP-7, RELAP-5 and any generic MOOSE-based application. The probabilistic analysis is performed by sampling the input space of the coupled code parameters and is enhanced by modern artificial intelligence algorithms that accelerate the identification of the areas of major risk (in the input parameter space). This environment also provides a graphical visualization capability to analyze the outcomes. Among other approaches, the classical Monte Carlo and Latin hypercube sampling algorithms are available. To accelerate the convergence of the sampling methodologies, support vector machines, Bayesian regression, and stochastic polynomial chaos collocation are implemented. The same methodologies described here could be used to solve optimization and uncertainty propagation problems using the RAVEN framework.
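
    A generic sketch of the input-space sampling RAVEN automates, comparing plain Monte Carlo with Latin hypercube sampling on a toy response function (standard scipy/numpy, not RAVEN's own API; the response function and inputs are invented):

        import numpy as np
        from scipy.stats import qmc

        rng = np.random.default_rng(7)

        def toy_plant_response(x):
            """Stand-in for a coupled-code run: margin from two uncertain inputs."""
            power_peak, cooldown = x[:, 0], x[:, 1]
            return 1.0 - 0.6 * power_peak - 0.5 * cooldown**2

        N = 1000
        mc = rng.random((N, 2))                           # plain Monte Carlo on [0,1)^2
        lhs = qmc.LatinHypercube(d=2, seed=7).random(N)   # stratified LHS design

        for name, sample in [("Monte Carlo", mc), ("Latin hypercube", lhs)]:
            margin = toy_plant_response(sample)
            print(f"{name:16s} P(margin < 0) = {(margin < 0).mean():.3f}")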

  11. Cluster analysis applied to multiparameter geophysical dataset

    NASA Astrophysics Data System (ADS)

    Di Giuseppe, M. G.; Troiano, A.; Troise, C.; De Natale, G.

    2012-04-01

    Multi-parameter acquisition is common geophysical field practice nowadays. Seismic velocity and attenuation, gravity and electromagnetic datasets are regularly acquired in a given area to obtain a complete characterization of some feature of the subsoil under investigation. Such a richness of information is often underestimated, although an integration of the analyses could provide a notable improvement in the imaging of the investigated structures, mostly because the handling of distinct parameters and their joint inversion still present several severe problems. Post-inversion statistical techniques represent a promising approach to these questions, providing a quick, simple and elegant way to obtain this advantageous but complex integration. We present an approach based on the partition of the analyzed multi-parameter dataset into a number of different classes, identified as localized regions of high correlation. These classes, or 'clusters', are structured in such a way that the observations pertaining to a certain group are more similar to each other than to the observations belonging to a different one, according to an optimal logical criterion. Regions of the subsoil sharing the same physical characteristics are thus identified, without any a-priori or empirical relationship linking the distinct measured parameters. The retrieved imaging is highly reliable in a statistical sense, specifically because of this lack of external hypotheses, which are instead indispensable in a full joint inversion, where they act as hard constraints on the inversion process and are not seldom of only relative consistency. We apply our procedure to a number of experimental datasets related to several structures at very different scales present in the Campanian district (southern Italy). These structures range from the shallow evidence of the active fault zone that originated the M 7.9 Irpinia earthquake to the main features characterizing the Campi Flegrei Caldera and Mt
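
    A minimal version of the post-inversion clustering described, using k-means on synthetic co-located parameter values; the real study's algorithm choice and class count follow its own optimality criterion, so this is only an illustrative stand-in:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)

        # Synthetic co-located geophysical parameters on a common model grid:
        # columns = [seismic velocity km/s, attenuation 1/Q, log10 resistivity].
        zone_a = rng.normal([3.5, 0.010, 1.2], [0.2, 0.002, 0.2], size=(300, 3))
        zone_b = rng.normal([2.1, 0.030, 0.4], [0.2, 0.004, 0.2], size=(300, 3))
        X = np.vstack([zone_a, zone_b])

        # Standardize so no single parameter dominates the distance metric,
        # then partition the cells into classes of high internal similarity.
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
            StandardScaler().fit_transform(X)
        )
        for k in range(2):
            print(f"cluster {k}: {np.sum(labels == k)} cells, "
                  f"mean velocity = {X[labels == k, 0].mean():.2f} km/s")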

  12. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  13. Analysis of the interaction between experimental and applied behavior analysis.

    PubMed

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J

    2014-01-01

    To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis. © Society for the Experimental Analysis of Behavior.
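
    The coauthorship analysis can be illustrated with a toy graph (author names and papers are invented; the study's actual data and metrics are not reproduced):

    ```python
    # Build a coauthor graph and flag "dual authors" publishing in both venues.
    import networkx as nx
    from itertools import combinations

    papers = [
        ({"Smith", "Jones"}, "JABA"),
        ({"Jones", "Lee", "Ortiz"}, "JEAB"),
        ({"Lee", "Smith"}, "JABA"),
    ]

    G = nx.Graph()
    venues = {}
    for authors, venue in papers:
        for a in authors:
            venues.setdefault(a, set()).add(venue)
        for a, b in combinations(sorted(authors), 2):
            G.add_edge(a, b)          # coauthorship edge

    dual_authors = {a for a, v in venues.items() if {"JABA", "JEAB"} <= v}
    print("dual authors:", dual_authors)
    print("degree centrality:", nx.degree_centrality(G))
    ```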

  14. Finite element analysis applied to cornea reshaping.

    PubMed

    Cabrera Fernández, Delia; Niazy, A M; Kurtz, R M; Djotyan, G P; Juhasz, T

    2005-01-01

    A 2-D finite element model of the cornea is developed to simulate corneal reshaping and the resulting deformation induced by refractive surgery. In the numerical simulations, linear and nonlinear elastic models are applied when stiffness inhomogeneities varying with depth are considered. Multiple simulations are created that employ different geometric configurations for the removal of the corneal tissue. Side-by-side comparisons of the different constitutive laws are also performed. To facilitate the comparison, the material property constants are identified from the same experimental data, which are obtained from mechanical tests on corneal strips and membrane inflation experiments. We then validate the resulting models by comparing computed refractive power changes with clinical results. Tissue deformations created by simulated corneal tissue removal using finite elements are consistent with clinically observed postsurgical results. The model developed provides a much more predictable refractive outcome when the stiffness inhomogeneities of the cornea and nonlinearities of the deformations are included in the simulations. Finite element analysis is a useful tool for modeling surgical effects on the cornea and developing a better understanding of the biomechanics of the cornea. The creation of patient-specific simulations would allow surgical outcomes to be predicted based on individualized finite element models.
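
    A drastically simplified sketch of the finite element idea (a generic 1-D bar with depth-varying stiffness, not the authors' 2-D corneal model; all material and load values are invented):

    ```python
    # Assemble linear two-node elements with a Young's modulus that grows
    # with depth, fix one end, load the other, and solve for displacements.
    import numpy as np

    n_el = 20
    L = 0.5e-3                          # total thickness, m (illustrative)
    h = L / n_el
    x_mid = (np.arange(n_el) + 0.5) * h
    E = 0.3e6 * (1.0 + 2.0 * x_mid / L)   # hypothetical depth-varying stiffness, Pa

    K = np.zeros((n_el + 1, n_el + 1))
    for e in range(n_el):
        k = E[e] / h * np.array([[1, -1], [-1, 1]])   # element stiffness (unit area)
        K[e:e + 2, e:e + 2] += k

    F = np.zeros(n_el + 1)
    F[-1] = 10.0                        # applied load, N (illustrative)
    u = np.zeros(n_el + 1)              # node 0 fixed: u[0] = 0
    u[1:] = np.linalg.solve(K[1:, 1:], F[1:])
    ```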

  15. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  16. Image analysis applied to luminescence microscopy

    NASA Astrophysics Data System (ADS)

    Maire, Eric; Lelievre-Berna, Eddy; Fafeur, Veronique; Vandenbunder, Bernard

    1998-04-01

    We have developed a novel approach to study luminescent light emission during migration of living cells by low-light imaging techniques. The equipment consists of an anti-vibration table with a hole for a direct output under the frame of an inverted microscope. The image is directly captured by an ultra-low-light-level photon-counting camera equipped with an image intensifier coupled by an optical fiber to a CCD sensor. This installation is dedicated to measuring, in a dynamic manner, the effect of SF/HGF (Scatter Factor/Hepatocyte Growth Factor) both on activation of gene promoter elements and on cell motility. Epithelial cells were stably transfected with promoter elements containing Ets transcription factor-binding sites driving a luciferase reporter gene. Luminescent light emitted by individual cells was measured by image analysis. Images of luminescent spots were acquired with a high-aperture objective and time exposures of 10-30 min in photon-counting mode. The sensitivity of the camera was adjusted to a high value, which required the use of a segmentation algorithm dedicated to eliminating the background noise. Hence, image segmentation and treatments by mathematical morphology were particularly indicated in these experimental conditions. In order to estimate the orientation of cells during their migration, we used a dedicated skeleton algorithm applied to the oblong spots of variable intensities emitted by the cells. Kinetic changes of luminescent sources, distance and speed of migration were recorded and then correlated with cellular morphological changes for each spot. Our results highlight the usefulness of mathematical morphology to quantify kinetic changes in luminescence microscopy.
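
    The processing chain described above (thresholding, morphological cleaning, skeletonization) can be sketched with scikit-image on a synthetic frame (a hedged illustration, not the authors' code):

    ```python
    # Threshold a noisy photon-counting frame, clean it with morphological
    # opening, skeletonize the spots, and report per-spot orientation.
    import numpy as np
    from skimage.filters import threshold_otsu
    from skimage.morphology import opening, disk, skeletonize
    from skimage.measure import label, regionprops

    rng = np.random.default_rng(2)
    frame = rng.poisson(0.5, (256, 256)).astype(float)   # background noise
    frame[100:140, 80:170] += rng.poisson(6, (40, 90))   # one luminescent cell

    mask = frame > threshold_otsu(frame)
    mask = opening(mask, disk(2))        # remove isolated noise pixels
    skeleton = skeletonize(mask)         # rough elongation/orientation cue

    for region in regionprops(label(mask)):
        print("centroid:", region.centroid,
              "orientation (rad):", region.orientation)
    ```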

  17. Risk analysis and risk management in an uncertain world.

    PubMed

    Kunreuther, Howard

    2002-08-01

    The tragic attacks of September 11 and the bioterrorist threats with respect to anthrax that followed have raised a set of issues regarding how we deal with events where there is considerable ambiguity and uncertainty about the likelihood of their occurrence and their potential consequences. This paper discusses how one can link the tools of risk assessment and our knowledge of risk perception to develop risk management options for dealing with extreme events. In particular, it suggests ways that the members of the Society for Risk Analysis can apply their expertise and talent to the risks associated with terrorism and discusses the changing roles of the public and private sectors in dealing with extreme events.

  18. Multidimensional Risk Analysis: MRISK

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond; Brown, Douglas; O'Shea, Sarah Beth; Reith, William; Rabulan, Jennifer; Melrose, Graeme

    2015-01-01

    Multidimensional Risk (MRISK) calculates the combined multidimensional score using the Mahalanobis distance. MRISK accounts for covariance between consequence dimensions, which de-conflicts the interdependencies of the consequence dimensions and provides a clearer depiction of risks. Additionally, in the event the dimensions are not correlated, the Mahalanobis distance reduces to a Euclidean distance normalized by the variance and, therefore, represents the most flexible and optimal method to combine dimensions. MRISK is currently being used in NASA's Environmentally Responsible Aviation (ERA) project to assess risk and prioritize scarce resources.
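
    A minimal sketch of the scoring idea (the covariance matrix and consequence vector below are assumptions for illustration, not NASA's values):

    ```python
    # Combine correlated consequence dimensions (e.g. cost, schedule,
    # performance) into one score via the Mahalanobis distance from a
    # zero-consequence baseline.
    import numpy as np

    cov = np.array([[1.0, 0.6, 0.2],
                    [0.6, 1.0, 0.4],
                    [0.2, 0.4, 1.0]])     # assumed covariance between dimensions
    cov_inv = np.linalg.inv(cov)

    def mrisk_score(consequence):
        """Mahalanobis distance of a consequence vector from the origin."""
        c = np.asarray(consequence, dtype=float)
        return float(np.sqrt(c @ cov_inv @ c))

    print(mrisk_score([0.8, 0.5, 0.3]))
    # With a diagonal covariance this reduces to a variance-normalized
    # Euclidean distance, as the abstract notes for uncorrelated dimensions.
    ```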

  19. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  20. Risk: a multidisciplinary concept analysis.

    PubMed

    McNeill, Charleen

    2014-01-01

    To analyze the concept of risk utilizing Walker and Avant's method of analysis to determine a conceptual definition applicable within nursing and nursing research. The mental constructs and consequences of risk have a proactive connotation compared with the negative behaviors often identified as illustrations of risk. A new conceptual definition of risk provides insight into an understanding of risk regardless of discipline. Its application to the metaparadigm of nursing should be the impetus for action and education. Formalizing the mental constructs of the concept of risk in a clear manner facilitates the inclusion of its latent constructs in nursing research. © 2013 Wiley Periodicals, Inc.

  1. Compendium on Risk Analysis Techniques

    DTIC Science & Technology

    The evolution of risk analysis in the materiel acquisition process is traced from the Secretary Packard memorandum to current AMC guidance. Risk analysis is defined, and many of the existing techniques are described in light of this definition and their specific role in program management and

  2. Introduction: Conversation Analysis in Applied Linguistics

    ERIC Educational Resources Information Center

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  3. Applying Mechanics to Swimming Performance Analysis.

    ERIC Educational Resources Information Center

    Barthels, Katharine

    1989-01-01

    Swimming teachers and coaches can improve their feedback to swimmers, when correcting or refining swim movements, by applying some basic biomechanical concepts relevant to swimming. This article focuses on the biomechanical considerations used in analyzing swimming performance. Techniques for spotting and correcting problems that impede…

  4. Risk/benefit analysis

    SciTech Connect

    Crouch, E.A.C.; Wilson, R.

    1982-01-01

    The Reagan administration is intent on rolling back regulations it considers unwise to give new life to American industry, but regulations were instituted to protect individuals against long-term hazards. The authors believe these hazards must be assessed before a regulation is modified, suspended, or implemented. They point out the problems inherent in defining, perceiving, and estimating risk. Throughout, they combine theoretical discussions with actual case studies covering the risk associated with nuclear power plants, saccharin use, mass chest radiography, and others. They believe that risk assessment should be distinct from decision making, with the risk assessor supplying clear and objective information about hazards and the probability of damage as well as pointing out the uncertainties to policy makers. 149 references, 29 figures, 8 tables.

  5. Risk analysis and management

    NASA Technical Reports Server (NTRS)

    Smith, H. E.

    1990-01-01

    Present software development accomplishments are indicative of the emerging interest in, and increasing efforts to provide, risk assessment backbone tools in the manned spacecraft engineering community. There are indications that similar efforts are underway in the chemical process industry and are probably being planned for other high-risk ground-based environments. It appears that complex flight systems intended for extended manned planetary exploration will drive this technology.

  6. Reachability Analysis Applied to Space Situational Awareness

    DTIC Science & Technology

    2009-09-01

    …applying them to the nonlinear relative orbit equations of motion, which are appropriate both for general SSA and spacecraft proximity operations… This assumption does not restrict the scope of these results in the context of SSA, as in orbital scenarios control and

  7. Object Oriented Risk Analysis Workshop

    NASA Astrophysics Data System (ADS)

    Pons, M. Güell I.; Jaboyedoff, M.

    2009-04-01

    In the framework of the RISET Project (Interfaculty Network of Support to Education and Technology), an educational tool for introducing risk analysis has been developed. This workshop carries a group of students, in a role-play game, through a step-by-step process of risk identification and quantification. The aim is to assess risk in a characteristic alpine village with regard to natural hazards (rockfall, snow avalanche, flooding…), and the assessment is oriented to the affected objects, such as buildings and infrastructure. The workshop contains the following steps: 1. Planning of the study and definition of stakeholders; 2. Hazard identification; 3. Risk analysis; 4. Risk assessment; 5. Proposition of mitigation measures; 6. Risk management and cost-benefit analysis. During the process, information related to past events and useful concepts is provided in order to prompt discussion and decision making. The risk matrix and other graphical tools give a visual representation of the risk level and help to prioritize countermeasures. At the end of the workshop, the groups can compare their results and print out a summary report. This approach provides a rapid and comprehensible risk evaluation. The workshop is accessible from the internet and will be used for educational purposes at bachelor and master level, as well as by external persons dealing with risk analysis.
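
    A toy version of the workshop's risk matrix might be coded as follows (the class boundaries and scale are invented for illustration):

    ```python
    # Qualitative likelihood and severity classes combined into a risk level.
    LEVELS = ["low", "moderate", "high", "extreme"]

    def risk_level(likelihood, severity):
        """likelihood, severity: integers 1 (lowest) .. 4 (highest)."""
        score = likelihood * severity          # 1 .. 16
        if score <= 2:
            return LEVELS[0]
        if score <= 6:
            return LEVELS[1]
        if score <= 9:
            return LEVELS[2]
        return LEVELS[3]

    # e.g. an unlikely but very severe rockfall scenario
    print(risk_level(likelihood=2, severity=4))   # -> "high"
    ```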

  8. Analysis of pressure applied during microlaryngoscopy.

    PubMed

    Fang, Rui; Chen, Hao; Sun, Jingwu

    2012-05-01

    Direct laryngoscopy is the most predominantly used technique in laryngology, with the fulcrum-based laryngoscope serving as the most popular instrument. The purpose of this study was to accurately calculate and measure the pressure acting on the laryngopharynx and the tongue base during microlaryngoscopy. The relationship between postoperative throat pain and the time and pressure applied during microlaryngoscopy was also investigated. Fifty patients undergoing microlaryngeal surgery were included in this prospective study. Parameters that may help predict difficult laryngeal exposure were measured in the patients before microlaryngoscopy. Using static equilibrium and the law of the lever, the pressure acting on the laryngopharynx and the tongue base was calculated, and related parameters were then tested for their influence on pressure. The time and pressure applied during microlaryngoscopy of each patient were compared with the postoperative throat pain grade. The mean pressure was 292 ± 109 mmHg and was significantly influenced by BMI, neck circumference and full mouth opening, whereas no gender-based differences of any kind were found. The pressure applied during microlaryngoscopy was extremely high in patients with difficult laryngeal exposure (376 ± 62 mmHg), a possible cause of throat pain or other complications following surgery. However, it was found that the duration of suspension laryngoscopy, not the pressure, had the most significant correlation with postoperative throat pain.
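
    The lever-law calculation can be illustrated with a toy moment balance (all geometry and force values below are hypothetical, chosen only to land near the reported pressure scale; the paper's measured values are not reproduced):

    ```python
    # Fulcrum-based laryngoscope modelled as a lever in static equilibrium.
    suspension_force = 10.0    # N, force at the suspension arm (assumed)
    d_suspension = 0.08        # m, lever arm fulcrum -> suspension point (assumed)
    d_tongue = 0.05            # m, lever arm fulcrum -> tongue-base contact (assumed)

    # Moment balance about the fulcrum: F_tongue * d_tongue = F_susp * d_susp
    tongue_force = suspension_force * d_suspension / d_tongue     # 16 N

    contact_area = 4.0e-4      # m^2, assumed blade-tissue contact area
    pressure_mmHg = tongue_force / contact_area / 133.322         # Pa -> mmHg
    print(f"contact pressure ~ {pressure_mmHg:.0f} mmHg")         # ~300 mmHg
    ```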

  9. Applied surface analysis in magnetic storage technology

    NASA Astrophysics Data System (ADS)

    Windeln, Johannes; Bram, Christian; Eckes, Heinz-Ludwig; Hammel, Dirk; Huth, Johanna; Marien, Jan; Röhl, Holger; Schug, Christoph; Wahl, Michael; Wienss, Andreas

    2001-07-01

    This paper gives a synopsis of today's challenges and requirements for a surface analysis and materials science laboratory, with a special focus on magnetic recording technology. The critical magnetic recording components, i.e. the protective carbon overcoat (COC), the disk layer structure, and the read/write head including the giant-magnetoresistive (GMR) sensor, are described, and options for their characterization with specific surface and structure analysis techniques are given. For COC investigations, applications of Raman spectroscopy to the structural analysis and determination of thickness, hydrogen and nitrogen content are discussed. Hardness measurements by atomic force microscopy (AFM) scratching techniques are presented. Surface adsorption phenomena on disk substrates or finished disks are characterized by contact angle analysis or by so-called piezo-electric mass adsorption systems (PEMAS), also known as quartz crystal microbalance (QCM). A quickly growing field of applications is listed for various X-ray analysis techniques, such as disk magnetic layer texture analysis via X-ray diffraction, compositional characterization via X-ray fluorescence, and compositional analysis with high lateral resolution via electron microprobe analysis. X-ray reflectometry (XRR) has become a standard method for the absolute measurement of individual layer thicknesses contained in multi-layer stacks and is thus the successor of ellipsometry for this application. Due to the ongoing reduction of critical feature sizes, the analytical challenges in terms of lateral resolution, sensitivity limits and dedicated nano-preparation have been growing consistently and can only be met by state-of-the-art Auger electron spectrometers (AES), transmission electron microscopy (TEM) analysis, time-of-flight secondary ion mass spectrometry (ToF-SIMS) characterization, and focused ion beam (FIB) sectioning and TEM lamella preparation via FIB. The depth profiling of GMR sensor full stacks was significantly

  10. Applying Complexity Theory to Risk in Child Protection Practice

    ERIC Educational Resources Information Center

    Stevens, Irene; Hassett, Peter

    2007-01-01

    This article looks at the application of complexity theory to risk assessment in child protection practice, and how it may help to give a better understanding of risk in relation to protecting vulnerable children. Within the last 20 years increasing use has been made of the term complexity within the natural sciences. In recent times, some of the…

  12. Meta-analysis in applied ecology.

    PubMed

    Stewart, Gavin

    2010-02-23

    This overview examines research synthesis in applied ecology and conservation. Vote counting and pooling unweighted averages are widespread despite the superiority of syntheses based on weighted combination of effects. Such analyses allow exploration of methodological uncertainty in addition to consistency of effects across species, space and time, but exploring heterogeneity remains controversial. Meta-analyses are required to generalize in ecology, and to inform evidence-based decision-making, but the more sophisticated statistical techniques and registers of research used in other disciplines must be employed in ecology to fully realize their benefits.
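
    The weighted synthesis advocated here can be sketched in a few lines (a fixed-effect, inverse-variance weighted mean; the effect sizes and variances are invented):

    ```python
    # Inverse-variance weighting: precise studies contribute more to the
    # pooled effect, unlike vote counting, which discards both magnitude
    # and precision.
    import numpy as np

    effects = np.array([0.30, 0.45, 0.12, 0.50])    # per-study effect sizes
    variances = np.array([0.02, 0.05, 0.01, 0.08])  # per-study sampling variances

    w = 1.0 / variances
    pooled = np.sum(w * effects) / np.sum(w)
    se_pooled = np.sqrt(1.0 / np.sum(w))
    print(f"pooled effect = {pooled:.3f} "
          f"+/- {1.96 * se_pooled:.3f} (95% CI half-width)")
    ```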

  13. The Components of Microbiological Risk Analysis.

    PubMed

    Liuzzo, Gaetano; Bentley, Stefano; Giacometti, Federica; Serraino, Andrea

    2015-02-03

    The paper describes the process of risk analysis in a food safety perspective. The steps of risk analysis defined as a process consisting of three interconnected components (risk assessment, risk management, and risk communication) are analysed. The different components of the risk assessment, risk management and risk communication are further described.

  14. The Components of Microbiological Risk Analysis

    PubMed Central

    Bentley, Stefano; Giacometti, Federica; Serraino, Andrea

    2015-01-01

    The paper describes the process of risk analysis in a food safety perspective. The steps of risk analysis defined as a process consisting of three interconnected components (risk assessment, risk management, and risk communication) are analysed. The different components of the risk assessment, risk management and risk communication are further described. PMID:27800384

  15. Natural hazard management high education: laboratory of hydrologic and hydraulic risk management and applied geomorphology

    NASA Astrophysics Data System (ADS)

    Giosa, L.; Margiotta, M. R.; Sdao, F.; Sole, A.; Albano, R.; Cappa, G.; Giammatteo, C.; Pagliuca, R.; Piccolo, G.; Statuto, D.

    2009-04-01

    The Environmental Engineering Faculty of the University of Basilicata offers a higher-level course for students in the field of natural hazards. The curriculum provides expertise in the prediction, prevention and management of earthquake risk, hydrologic-hydraulic risk, and geomorphological risk. These skills will contribute to the training of specialists who, as well as having a thorough knowledge of the genesis and phenomenology of natural risks, know how to interpret, evaluate and monitor the dynamics of the environment and territory. In addition to basic training in mathematics and physics, the course of study provides specific lessons on seismic and structural dynamics, environmental and computational hydraulics, and hydrology and applied hydrogeology. In particular, the course organizes two connected examination subjects: Laboratory of hydrologic and hydraulic risk management, and Applied geomorphology. These courses foresee the development and resolution of natural hazard problems through the study of a real natural disaster. In the last year, the course project concerned the collapse, on 19 July 1985, of two fluorspar tailings (decantation) basins at mines in the Stava Valley, northern Italy. During the course, data and event information were collected, a guided tour of the disaster site was organized, and finally mathematical models were applied to simulate the disaster and the results were analysed. The student work was presented in a public workshop.

  16. Applying the welfare model to at-own-risk discharges.

    PubMed

    Krishna, Lalit Kumar Radha; Menon, Sumytra; Kanesvaran, Ravindran

    2017-08-01

    "At-own-risk discharges" or "self-discharges" evidences an irretrievable breakdown in the patient-clinician relationship when patients leave care facilities before completion of medical treatment and against medical advice. Dissolution of the therapeutic relationship terminates the physician's duty of care and professional liability with respect to care of the patient. Acquiescence of an at-own-risk discharge by the clinician is seen as respecting patient autonomy. The validity of such requests pivot on the assumptions that the patient is fully informed and competent to invoke an at-own-risk discharge and that care up to the point of the at-own-risk discharge meets prevailing clinical standards. Palliative care's use of a multidisciplinary team approach challenges both these assumptions. First by establishing multiple independent therapeutic relations between professionals in the multidisciplinary team and the patient who persists despite an at-own-risk discharge. These enduring therapeutic relationships negate the suggestion that no duty of care is owed the patient. Second, the continued employ of collusion, familial determinations, and the circumnavigation of direct patient involvement in family-centric societies compromises the patient's decision-making capacity and raises questions as to the patient's decision-making capacity and their ability to assume responsibility for the repercussions of invoking an at-own-risk discharge. With the validity of at-own-risk discharge request in question and the welfare and patient interest at stake, an alternative approach to assessing at-own-risk discharge requests are called for. The welfare model circumnavigates these concerns and preserves the patient's welfare through the employ of a multidisciplinary team guided holistic appraisal of the patient's specific situation that is informed by clinical and institutional standards and evidenced-based practice. The welfare model provides a robust decision-making framework for

  17. Applied Spectrophotometry: Analysis of a Biochemical Mixture

    ERIC Educational Resources Information Center

    Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the…
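
    A minimal worked example of the law (the molar absorptivity below is an assumed value used only for illustration):

    ```python
    def concentration(absorbance, epsilon, path_cm=1.0):
        """Beer-Lambert-Bouguer law: A = epsilon * l * c  =>  c = A / (epsilon * l).

        absorbance: dimensionless; epsilon: L mol^-1 cm^-1; path_cm: cm.
        Returns molar concentration (mol/L).
        """
        return absorbance / (epsilon * path_cm)

    # Example with an assumed molar absorptivity for a protein at 280 nm.
    print(concentration(absorbance=0.75, epsilon=43824))   # ~1.7e-5 mol/L
    ```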

  18. Science, skepticism, and applied behavior analysis.

    PubMed

    Normand, Matthew P

    2008-01-01

    Pseudoscientific claims concerning medical and psychological treatments of all varieties are commonplace. As behavior analysts, a sound skeptical approach to our science and practice is essential. The present paper offers an overview of science and skepticism and discusses the relationship of skepticism to behavior analysis, with an emphasis on the types of issues concerning behavior analysts in practice.

  19. Science, Skepticism, and Applied Behavior Analysis

    PubMed Central

    Normand, Matthew P

    2008-01-01

    Pseudoscientific claims concerning medical and psychological treatments of all varieties are commonplace. As behavior analysts, a sound skeptical approach to our science and practice is essential. The present paper offers an overview of science and skepticism and discusses the relationship of skepticism to behavior analysis, with an emphasis on the types of issues concerning behavior analysts in practice. PMID:22477687

  1. Enrichment analysis applied to disease prognosis

    PubMed Central

    2013-01-01

    Enrichment analysis is well established in the field of transcriptomics, where it is used to identify relevant biological features that characterize a set of genes obtained in an experiment. This article proposes the application of enrichment analysis as a first step in a disease prognosis methodology, in particular for diseases with a strong genetic component. The objective of this analysis is to identify clinical and biological features that characterize groups of patients with a common disease and that can be used to distinguish between groups of patients associated with disease-related events. Data mining methodologies can then be used to exploit those features and assist medical doctors in the evaluation of patients with respect to their predisposition for a specific event. In this work the disease hypertrophic cardiomyopathy (HCM) is used as a case study, as a first test to assess the feasibility of applying enrichment analysis to disease prognosis. To perform this assessment, two groups of patients have been considered: patients that have suffered a sudden cardiac death episode and patients that have not. The results presented were obtained with genetic data and the Gene Ontology, in two enrichment analyses: an enrichment profiling aimed at characterizing a group of patients (e.g. those that suffered a disease-related event) based on their mutations, and a differential enrichment aimed at identifying differentiating features between a sub-group of patients and all the patients with the disease. These analyses correspond to an adaptation of the standard enrichment analysis, since multiple sets of genes are being considered, one for each patient. The preliminary results are promising, as the sets of terms obtained reflect the current knowledge about the gene functions commonly altered in HCM patients, thus allowing their characterization. Nevertheless, some factors need to be taken into consideration before the full potential of the enrichment
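
    The statistical core of such an enrichment test can be sketched as a hypergeometric over-representation p-value (all counts below are invented; the authors' actual method may differ):

    ```python
    # Given N background genes, K annotated with a GO term, and a patient
    # gene set of size n with k annotated, the over-representation p-value
    # is the upper tail of the hypergeometric distribution.
    from scipy.stats import hypergeom

    N, K = 20000, 150     # background genes; genes carrying the term
    n, k = 40, 6          # patient's mutated genes; of which annotated

    p_value = hypergeom.sf(k - 1, N, K, n)   # P(X >= k)
    print(f"enrichment p = {p_value:.3e}")
    ```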

  2. Thermal analysis applied to irradiated propolis

    NASA Astrophysics Data System (ADS)

    Matsuda, Andrea Harumi; Machado, Luci Brocardo; del Mastro, Nélida Lucia

    2002-03-01

    Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behavior. Thermal analysis is a chemical analysis that gives information about changes on heating, which is of great importance for technological applications. Ground propolis samples were 60Co gamma irradiated with 0 and 10 kGy. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600°C. Similarly, through differential scanning calorimetry, the melting points of irradiated and unirradiated samples were found to coincide. The results suggest that the irradiation process does not interfere with the thermal properties of propolis when irradiated up to 10 kGy.

  3. Biomechanics and motion analysis applied to sports.

    PubMed

    Zheng, N; Barrentine, S W

    2000-05-01

    The development of motion analysis and the application of biomechanical analysis techniques to sports have paralleled the exponential growth of computational and videographic technology. Technological developments have enabled investigations of the human body, and of its actions during sport, that were believed to be unobtainable a few years ago. Technological advancements have brought biomechanical applications into a wide range of fields, from orthopedics to entertainment. An area that has made tremendous gains using biomechanics is sports science. Coaches, therapists, and physicians are using biomechanics to improve performance and rehabilitation and to prevent sports-related injuries. Functional analyses of athletic movements that were impossible a few years ago are available and used today. With new advancements, the possibilities for investigating the way a human interacts and reacts to environmental conditions are ever expanding.

  4. APPLYING NEW METHODS TO RESEARCH REACTOR ANALYSIS.

    SciTech Connect

    Diamond, D. J.; Cheng, L.; Hanson, A.; Xu, J.; Carew, J. F.

    2004-02-05

    Detailed reactor physics and safety analyses are being performed for the 20 MW D{sub 2}O-moderated research reactor at the National Institute of Standards and Technology (NIST). The analyses employ state-of-the-art calculational methods and will contribute to an update to the Final Safety Analysis Report (FSAR). Three-dimensional MCNP Monte Carlo neutron and photon transport calculations are performed to determine power and reactivity parameters, including feedback coefficients and control element worths. The core depletion and determination of the fuel compositions are performed with MONTEBURNS to model the reactor at the beginning, middle, and end-of-cycle. The time-dependent analysis of the primary loop is determined with a RELAP5 transient analysis model that includes the pump, heat exchanger, fuel element geometry, and flow channels. A statistical analysis used to assure protection from critical heat flux (CHF) is performed using a Monte Carlo simulation of the uncertainties contributing to the CHF calculation. The power distributions used to determine the local fuel conditions and margin to CHF are determined with MCNP. Evaluations have been performed for the following accidents: (1) the control rod withdrawal startup accident, (2) the maximum reactivity insertion accident, (3) loss-of-flow resulting from loss of electrical power, (4) loss-of-flow resulting from a primary pump seizure, (5) loss-of-flow resulting from inadvertent throttling of a flow control valve, (6) loss-of-flow resulting from failure of both shutdown cooling pumps and (7) misloading of a fuel element. These analyses are significantly more rigorous than those performed previously. They have provided insights into reactor behavior and additional assurance that previous analyses were conservative and the reactor was being operated safely.

  5. Applying thiouracil tagging to mouse transcriptome analysis.

    PubMed

    Gay, Leslie; Karfilis, Kate V; Miller, Michael R; Doe, Chris Q; Stankunas, Kryn

    2014-02-01

    Transcriptional profiling is a powerful approach for studying mouse development, physiology and disease models. Here we describe a protocol for mouse thiouracil tagging (TU tagging), a transcriptome analysis technology that includes in vivo covalent labeling, purification and analysis of cell type-specific RNA. TU tagging enables the isolation of RNA from a given cell population of a complex tissue, avoiding transcriptional changes induced by cell isolation trauma, as well as the identification of actively transcribed RNAs and not preexisting transcripts. Therefore, in contrast to other cell-specific transcriptional profiling methods based on the purification of tagged ribosomes or nuclei, TU tagging provides a direct examination of transcriptional regulation. We describe how to (i) deliver 4-thiouracil to transgenic mice to thio-label cell lineage-specific transcripts, (ii) purify TU-tagged RNA and prepare libraries for Illumina sequencing and (iii) follow a straightforward bioinformatics workflow to identify cell type-enriched or differentially expressed genes. Tissue containing TU-tagged RNA can be obtained in 1 d, RNA-seq libraries can be generated within 2 d and, after sequencing, an initial bioinformatics analysis can be completed in 1 additional day.

  6. Subpixels analysis model applied to floodplain monitoring

    NASA Astrophysics Data System (ADS)

    Giraldo Osorio, J. D.; García Galiano, S. G.

    2009-04-01

    The traditional techniques to gauge hydrological events often fail with extreme events. A particular case is the spatial detection of floods. In this work, remote sensing techniques and Geographic Information Systems (GIS) have been merged to develop a key tool for the monitoring of floods. The low density of gauge station networks in developing countries makes remote sensing the most suitable and economical way to delimit the flooded area and compute the cost of damages. Common classification techniques for satellite images use "hard methods", in the sense that a pixel is assigned to a unique land cover class. At coarse resolution the pixels are inevitably mixed, so "soft methods" can be used in order to assign several land cover classes according to the surface fraction covered by each one. The main objective of this work is the dynamic monitoring of floods over large areas, based on satellite images (with moderate spatial resolution but high temporal resolution) and a Digital Elevation Model (DEM). Classified maps with finer spatial resolution can be built through the Subpixel Analysis methodology developed here. The procedure rests on both the Linear Mixture Model (LMM) and the Spatial Coherence Analysis (SCA) hypothesis. The LMM builds the land cover fraction maps through an optimization procedure which uses Lagrange multipliers, while the SCA defines the most likely place for the land cover fractions within the coarse pixel using linear programming. A subsequent procedure improves the identification of the flooded area using both the drainage direction and flow accumulation raster maps derived from the DEM of the study zone. The Subpixel Analysis technique was validated using historical flood data obtained from satellite images. The procedure improves the spatial resolution of maps classified from satellite images with coarse resolution, whereas "hard methods" keep the spatial resolution of the input coarse satellite image.
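
    The LMM step can be sketched as a constrained unmixing problem (a generic fully-constrained least-squares trick, not necessarily the authors' Lagrange-multiplier formulation; the endmember signatures are invented):

    ```python
    # Estimate per-pixel land-cover fractions f >= 0 with sum(f) = 1 such
    # that pixel ~ E @ f, where the columns of E are endmember signatures.
    # The sum-to-one constraint is enforced by a heavily weighted extra row.
    import numpy as np
    from scipy.optimize import nnls

    E = np.array([[0.10, 0.45, 0.30],      # band 1: water / soil / vegetation
                  [0.08, 0.50, 0.25],      # band 2 (reflectances invented)
                  [0.05, 0.40, 0.60]])     # band 3
    pixel = 0.6 * E[:, 0] + 0.4 * E[:, 2]  # synthetic mixed pixel: 60% water

    delta = 1e3                            # weight of the sum-to-one equation
    A = np.vstack([E, delta * np.ones(3)])
    b = np.append(pixel, delta)
    fractions, _ = nnls(A, b)              # non-negative least squares
    print("estimated fractions:", fractions.round(3))   # ~ [0.6, 0.0, 0.4]
    ```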

  7. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  8. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that being a visco-elastic substrate, skin effectively reduces the resolution of measurement of dental detail. Conclusions indicate caution in individualization statements.

  10. Applying the lessons of high risk industries to health care

    PubMed Central

    Hudson, P

    2003-01-01

    High risk industries such as commercial aviation and the oil and gas industry have achieved exemplary safety performance. This paper reviews how they have managed to do that. The primary reasons are the positive attitudes towards safety and the operation of effective formal safety management systems. The safety culture provides an important explanation of why such organisations perform well. An evolutionary model of safety culture is provided in which there is a range of cultures from the pathological through the reactive to the calculative. Later, the proactive culture can evolve towards the generative organisation, an alternative description of the high reliability organisation. The current status of health care is reviewed, arguing that it has a much higher level of accidents and has a reactive culture, lagging behind both high risk industries studied in both attitude and systematic management of patient risks. PMID:14645741

  11. Applying the lessons of high risk industries to health care.

    PubMed

    Hudson, P

    2003-12-01

    High risk industries such as commercial aviation and the oil and gas industry have achieved exemplary safety performance. This paper reviews how they have managed to do that. The primary reasons are the positive attitudes towards safety and the operation of effective formal safety management systems. The safety culture provides an important explanation of why such organisations perform well. An evolutionary model of safety culture is provided in which there is a range of cultures from the pathological through the reactive to the calculative. Later, the proactive culture can evolve towards the generative organisation, an alternative description of the high reliability organisation. The current status of health care is reviewed, arguing that it has a much higher level of accidents and has a reactive culture, lagging behind both high risk industries studied in both attitude and systematic management of patient risks.

  12. Multivariate analysis applied to tomato hybrid production.

    PubMed

    Balasch, S; Nuez, F; Palomares, G; Cuartero, J

    1984-11-01

    Twenty characters were measured on 60 tomato varieties cultivated in the open air and in a polyethylene plastic house. Data were analyzed by means of principal components, factorial discriminant methods, Mahalanobis D(2) distances and principal coordinate techniques. The factorial discriminant and Mahalanobis D(2) distance methods, both of which require collecting data plant by plant, lead to conclusions similar to those of the principal components method, which only requires taking data by plots. The characters that make up the principal components are the same in both environments studied, although the relative importance of each one of them varies within the principal components. By combining the information supplied by multivariate analysis with the inheritance mode of the characters, crosses among cultivars can be designed that will produce heterotic hybrids showing characters within previously established limits.
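
    The principal components step can be sketched with scikit-learn on stand-in data (60 varieties × 20 characters of random numbers; the real measurements are not reproduced):

    ```python
    # Standardize the characters, project onto the leading components, and
    # inspect how much variance each axis explains.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(3)
    X = rng.normal(size=(60, 20))          # 60 varieties x 20 characters

    Z = StandardScaler().fit_transform(X)
    pca = PCA(n_components=3).fit(Z)
    scores = pca.transform(Z)              # per-variety coordinates
    print("explained variance ratios:", pca.explained_variance_ratio_.round(3))
    # The loadings (pca.components_) show which characters dominate each axis.
    ```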

  13. Toward applied behavior analysis of life aloft

    NASA Technical Reports Server (NTRS)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived database founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  14. Toward applied behavior analysis of life aloft.

    PubMed

    Brady, J V

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived database founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  16. Multiple factors explain injury risk in adolescent elite athletes: applying a biopsychosocial perspective.

    PubMed

    von Rosen, Philip; Frohm, Anna; Kottorp, Anders; Fridén, Cecilia; Heijne, Annette

    2017-02-16

    Many risk factors for injury are presented in the literature, but few of them are consistent, and the majority are associated with adult rather than adolescent elite athletes. The aim was to identify risk factors for injury in adolescent elite athletes by applying a biopsychosocial approach. A total of 496 adolescent elite athletes (age range 15-19), participating in 16 different sports, were monitored repeatedly over 52 weeks using a validated questionnaire about injuries, training exposure, sleep, stress, nutrition and competence-based self-esteem. Univariate and multiple Cox regression analyses were used to calculate hazard ratios (HR) for risk factors for the first reported injury. The main finding was that increasing training volume and training intensity while at the same time decreasing sleep volume resulted in a higher risk for injury compared to no change in these variables (HR 2.25, 95% CI 1.46-3.45, p<0.01), which was the strongest risk factor identified. In addition, an increase of one point in competence-based self-esteem increased the hazard for injury by 1.02 (HR, 95% CI 1.00-1.04, p=0.01). Based on the multiple Cox regression analysis, an athlete with the identified risk factors (Risk Index, competence-based self-esteem) and an average competence-based self-esteem score had more than a threefold increased risk for injury (HR 3.35) compared to an athlete with low competence-based self-esteem and no change in sleep or training volume. Our findings confirm injury occurrence as a result of multiple risk factors interacting in complex ways. This article is protected by copyright. All rights reserved.
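
    A hedged sketch of the modelling approach, using the lifelines package (the column names and tiny dataset are invented; the study's data are not reproduced):

    ```python
    # Multiple Cox regression for time to first reported injury.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "weeks_to_injury": [10, 52, 24, 8, 40, 52, 16, 30],
        "injured":         [1, 0, 1, 1, 1, 0, 1, 0],   # 1 = first injury reported
        "risk_index":      [1, 0, 1, 1, 0, 0, 1, 0],   # volume/intensity up, sleep down
        "self_esteem":     [62, 48, 70, 66, 55, 50, 68, 47],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="weeks_to_injury", event_col="injured")
    cph.print_summary()   # hazard ratios with 95% confidence intervals
    ```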

  17. Experiences of Uav Surveys Applied to Environmental Risk Management

    NASA Astrophysics Data System (ADS)

    Caprioli, M.; Trizzino, R.; Mazzone, F.; Scarano, M.

    2016-06-01

    In this paper the results of surveys carried out in an area of the Apulian territory affected by serious environmental hazards are presented. Unmanned Aerial Vehicles (UAVs) are emerging as a key engineering tool for future environmental survey tasks. UAVs are increasingly seen as an attractive low-cost alternative or supplement to aerial and terrestrial photogrammetry due to their low cost, flexibility, availability and readiness for duty. In addition, UAVs can be operated in hazardous or temporarily inaccessible locations, which makes them very suitable for the assessment and management of environmental risk conditions. In order to verify the reliability of these technologies, a UAV survey and a LIDAR survey were carried out along about 1 km of coast in the Salento peninsula, near the towns of San Foca, Torre dell'Orso and Sant'Andrea (Lecce, Southern Italy). This area is affected by serious environmental risks due to the presence of dangerous rocky cliffs, named 'falesie'. The UAV platform was equipped with a photogrammetric measurement system that allowed us to obtain a mobile mapping of the fractured fronts of the dangerous rocky cliffs. The UAV image data were processed using dedicated software (Agisoft PhotoScan). The point clouds obtained from both the UAV and LIDAR surveys were processed using CloudCompare software, with the aim of testing the UAV results against the LIDAR ones. The total error obtained was of centimetre order, which is a very satisfactory result. The environmental information was arranged in an ArcGIS platform in order to assess the risk levels. The possibility of repeating the survey at time intervals, more or less close together depending on the measured levels of risk, and of comparing the outputs allows the evolution of the dangerous phenomena to be followed. In conclusion, for inaccessible locations of dangerous rocky bodies, the UAV survey coupled with GIS methodology proved to be a key engineering tool for the management of environmental

  18. Applying artificial neural networks to predict communication risks in the emergency department.

    PubMed

    Bagnasco, Annamaria; Siri, Anna; Aleo, Giuseppe; Rocco, Gennaro; Sasso, Loredana

    2015-10-01

    To describe the utility of artificial neural networks in predicting communication risks. In health care, effective communication reduces the risk of error. Therefore, it is important to identify the predictive factors of effective communication. Non-technical skills are needed to achieve effective communication. This study explores how artificial neural networks can be applied to predict the risk of communication failures in emergency departments. A multicentre observational study. Data were collected between March and May 2011 by observing the communication interactions of 840 nurses with their patients during their routine activities in emergency departments. The tools used for our observation were a questionnaire to collect personal and descriptive data, level of training and experience, and Guilbert's observation grid, applying the Situation-Background-Assessment-Recommendation technique to communication in emergency departments. A total of 840 observations were made of the nurses working in the emergency departments. Based on Guilbert's observation grid, the output variables likely to influence the risk of communication failure were 'terminology', 'listening', 'attention' and 'clarity', whereas nurses' personal characteristics were used as input variables in the artificial neural network model. A model based on the multilayer perceptron topology was developed and trained. The receiver operating characteristic analysis confirmed that the artificial neural network model correctly predicted more than 80% of the communication failures. The application of the artificial neural network model could offer a valid tool to forecast and prevent harmful communication errors in the emergency department. © 2015 John Wiley & Sons Ltd.
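
    An illustrative pipeline in the same spirit (synthetic features and labels; not the authors' model or data): train a multilayer perceptron and evaluate it with the area under the ROC curve.

    ```python
    # MLP classifier on stand-in nurse characteristics, scored by ROC-AUC.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(4)
    X = rng.normal(size=(840, 5))    # e.g. age, experience, training level, ...
    y = (X[:, 1] + 0.5 * X[:, 2] + rng.normal(size=840) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    clf.fit(X_tr, y_tr)
    print("ROC AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
    ```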

  19. Digital photoelastic analysis applied to implant dentistry

    NASA Astrophysics Data System (ADS)

    Ramesh, K.; Hariprasad, M. P.; Bhuvanewari, S.

    2016-12-01

    Development of improved designs of implant systems in dentistry have necessitated the study of stress fields in the implant regions of the mandible/maxilla for better understanding of the biomechanics involved. Photoelasticity has been used for various studies related to dental implants in view of whole field visualization of maximum shear stress in the form of isochromatic contours. The potential of digital photoelasticity has not been fully exploited in the field of implant dentistry. In this paper, the fringe field in the vicinity of the connected implants (All-On-Four® concept) is analyzed using recent advances in digital photoelasticity. Initially, a novel 3-D photoelastic model making procedure, to closely mimic all the anatomical features of the human mandible is proposed. By choosing appropriate orientation of the model with respect to the light path, the essential region of interest were sought to be analysed while keeping the model under live loading conditions. Need for a sophisticated software module to carefully identify the model domain has been brought out. For data extraction, five-step method is used and isochromatics are evaluated by twelve fringe photoelasticity. In addition to the isochromatic fringe field, whole field isoclinic data is also obtained for the first time in implant dentistry, which could throw important information in improving the structural stability of the implant systems. Analysis is carried out for the implant in the molar as well as the incisor region. In addition, the interaction effects of loaded molar implant on the incisor area are also studied.

  20. Rotary spectra analysis applied to static stabilometry.

    PubMed

    Chiaramello, E; Knaflitz, M; Agostini, V

    2011-01-01

    Static stabilometry is a technique aimed at quantifying postural sway during quiet standing in the upright position. Many different models and many different techniques to analyze the trajectories of the Centre of Pressure (CoP) have been proposed. Most of the parameters calculated according to these different approaches are affected by a relevant intra- and inter-subject variability or do not have a clear physiological interpretation. In this study we hypothesize that CoP trajectories have rotational characteristics; therefore we decompose them into clockwise and counter-clockwise components, using rotary spectral analysis. Rotary spectra obtained by studying a population of healthy subjects are described through the group average of spectral parameters, i.e., 95% spectral bandwidth, mean frequency, median frequency, and skewness. Results are reported for the clockwise and the counter-clockwise components and refer to the upright position maintained with eyes open or closed. This study demonstrates that the approach is feasible and that some of the spectral parameters are statistically different between the open- and closed-eyes conditions. More research is needed to demonstrate the clinical applicability of this approach, but the results obtained so far are promising.
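
    The decomposition itself can be sketched as follows: the planar CoP trajectory is treated as a complex signal z = x + iy, whose negative-frequency FFT power gives the clockwise component and positive-frequency power the counter-clockwise component. The trajectory, sampling rate, and spectral parameter below are synthetic stand-ins, not the study's data.

        import numpy as np

        rng = np.random.default_rng(0)
        fs = 100.0                                   # sampling rate (Hz)
        t = np.arange(0, 30, 1 / fs)
        # Synthetic CoP trajectory with a dominant counter-clockwise rotation.
        x = 0.3 * np.cos(2 * np.pi * 0.4 * t) + 0.05 * rng.normal(size=t.size)
        y = 0.3 * np.sin(2 * np.pi * 0.4 * t) + 0.05 * rng.normal(size=t.size)

        z = (x - x.mean()) + 1j * (y - y.mean())     # complex trajectory
        Z = np.fft.fft(z) / z.size
        f = np.fft.fftfreq(z.size, d=1 / fs)
        power = np.abs(Z) ** 2

        ccw, f_ccw = power[f > 0], f[f > 0]          # counter-clockwise part
        cw, f_cw = power[f < 0], -f[f < 0]           # clockwise part

        # Example spectral parameter from the abstract: mean frequency.
        print("CCW mean frequency:", np.sum(f_ccw * ccw) / ccw.sum())
        print("CW  mean frequency:", np.sum(f_cw * cw) / cw.sum())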

  1. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    NASA Technical Reports Server (NTRS)

    Motley, Albert E., III

    2000-01-01

    One of the key elements of successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  2. Risk management in mental health: applying lessons from commercial aviation.

    PubMed

    Hatcher, Simon

    2010-02-01

    Risk management in mental health focuses on risks in patients and fails to predict rare but catastrophic events such as suicide. Commercial aviation has a similar task in preventing rare but catastrophic accidents. This article describes the systems in place in commercial aviation that allow that industry to prevent disasters and contrasts this with the situation in mental health. In mental health we should learn from commercial aviation by having: national policies to promote patient safety; a national body responsible for implementing this policy which maintains a database of safety occurrences, sets targets and investigates adverse outcomes; legislation in place which encourages clinicians to report safety occurrences; and a common method and language for investigating safety occurrences.

  3. Cancer Risk Assessment: Should New Science be Applied? Workgroup summary

    SciTech Connect

    Richard J. Bull; Antone L. Brooks

    2002-12-15

    OAK-B135. This report summarizes a symposium on the implications of certain phenomena observed in radiation biology for cancer risk assessment in general. In July of 2002 a workshop was convened that explored some of the intercellular phenomena that appear to condition responses to carcinogen exposure. Effects that result from communication between cells that appear either to increase the sphere of damage or to modify the sensitivity of cells to further damage were of particular interest. Much of the discussion focused on the effects of ionizing radiation that were transmitted from cells directly hit to cells not receiving direct exposure to radiation (bystander cells). In cell culture, increased rates of mutation, chromosomal aberration, apoptosis, genomic instability, and decreased clonogenic survival have all been observed in cells that have experienced no direct radiation. In addition, there is evidence that low doses of radiation or certain chemicals give rise to adaptive responses in which the treated cells develop resistance to the effects of high doses given in subsequent exposures. Data were presented at the workshop indicating that low dose exposure of animals to radiation and some chemicals frequently reduces the spontaneous rate of mutation in vitro and tumor responses in vivo. Finally, it was concluded that considerable improvement in understanding of how genetic variation may modify the impact of these phenomena is necessary before the risk implications can be fully appreciated. The workshop participants discussed the substantive challenge that these data present with respect to the simple linear methodologies that are currently used in cancer risk assessment and attempted to identify broad strategies by which these phenomena may start to be used to refine cancer risk assessment methods in the future.

  4. Applying Personal Genetic Data to Injury Risk Assessment in Athletes

    PubMed Central

    Goodlin, Gabrielle T.; Roos, Andrew K.; Roos, Thomas R.; Hawkins, Claire; Beache, Sydney; Baur, Stephen; Kim, Stuart K.

    2015-01-01

    Recent studies have identified genetic markers associated with risk for certain sports-related injuries and performance-related conditions, with the hope that these markers could be used by individual athletes to personalize their training and diet regimens. We found that we could greatly expand the knowledge base of sports genetic information by using published data originally found in health and disease studies. For example, the results from large genome-wide association studies for low bone mineral density in elderly women can be re-purposed for low bone mineral density in young endurance athletes. In total, we found 124 single-nucleotide polymorphisms associated with anterior cruciate ligament tear, Achilles tendon injury, low bone mineral density and stress fracture, osteoarthritis, vitamin/mineral deficiencies, and sickle cell trait. Of these single-nucleotide polymorphisms, 91% have not previously been used in sports genetics. We conducted a pilot program on fourteen triathletes using this expanded knowledge base of genetic variants associated with sports injury. These athletes were genotyped and educated about how their individual genetic make-up affected their personal risk profile during an hour-long personal consultation. Overall, participants viewed the program favorably, found it informative, and most acted upon their genetic results. This pilot program shows that recent genetic research provides valuable information to help reduce sports injuries and to optimize nutrition. There are many genetic studies for health and disease that can be mined to provide useful information to athletes about their individual risk for relevant injuries. PMID:25919592

  5. Treatment Integrity in Applied Behavior Analysis with Children.

    ERIC Educational Resources Information Center

    Gresham, Frank M.; And Others

    1993-01-01

    A review of 158 applied behavior analysis studies with children as subjects, published in the "Journal of Applied Behavior Analysis" between 1980 and 1990, found that (1) 16% measured the accuracy of independent variable implementation, and (2) two-thirds did not operationally define components of the independent variable. Specific recommendations…

  6. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  7. Risk and value analysis of SETI.

    PubMed

    Billingham, J

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  9. Robust regression applied to fractal/multifractal analysis.

    NASA Astrophysics Data System (ADS)

    Portilla, F.; Valencia, J. L.; Tarquis, A. M.; Saa-Requejo, A.

    2012-04-01

    Fractals and multifractals are concepts that have grown increasingly popular in soil analysis in recent years, along with the development of fractal models. One of the common steps is to calculate the slope of a linear fit, commonly using the least squares method. This should not be a problem; however, in many situations involving experimental data the researcher has to select the range of scales at which to work, neglecting the remaining points, in order to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods. With this method we do not have to assume that an outlier is simply an extreme observation drawn from the tail of a normal distribution, which would otherwise compromise the validity of the regression results. In this work we have evaluated the capacity of robust regression to select the points in the experimental data used, trying to avoid subjective choices. Based on this analysis we have developed a new work methodology that implies two basic steps: • Evaluation of the improvement of the linear fit when consecutive points are eliminated, based on the R p-value. In this way we consider the implications of reducing the number of points. • Evaluation of the significance of the slope difference between the fit with the two extreme points and the fit with the available points. We compare the results of applying this methodology and the commonly used least squares one. The data selected for these comparisons come from experimental soil roughness transects and from simulations based on the midpoint displacement method with added trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology. Acknowledgements: Funding provided by CEIGRAM (Research Centre for the Management of Agricultural and Environmental Risks) and by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no
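
    The contrast between least squares and a robust estimator on a scaling plot can be sketched as follows. Theil-Sen is used here as a representative robust method (not necessarily the authors' choice), and the scales, exponent, and outlier are synthetic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        log_scale = np.log10(np.logspace(0, 3, 12))          # 12 scales, 1 to 1000
        true_slope = -1.6                                    # notional scaling exponent
        log_measure = true_slope * log_scale + 0.02 * rng.normal(size=12)
        log_measure[-1] += 0.5                               # outlier at the largest scale

        # Ordinary least squares slope vs. robust Theil-Sen slope.
        ols_slope = np.polyfit(log_scale, log_measure, 1)[0]
        ts_slope, ts_intercept, lo, hi = stats.theilslopes(log_measure, log_scale)

        print(f"least-squares slope: {ols_slope:.3f}")
        print(f"Theil-Sen slope:     {ts_slope:.3f} (95% CI {lo:.3f} to {hi:.3f})")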

  10. Initial Decision and Risk Analysis

    SciTech Connect

    Engel, David W.

    2012-02-29

    Decision and Risk Analysis capabilities will be developed for industry consideration and possible adoption within Year 1. These tools will provide a methodology for merging qualitative ranking of technology maturity and acknowledged risk contributors with quantitative metrics that drive investment decision processes. Methods and tools will be initially introduced as applications to the A650.1 case study, but modular spreadsheets and analysis routines will be offered to industry collaborators as soon as possible to stimulate user feedback and co-development opportunities.

  11. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  13. A Course of Instruction in Risk Analysis.

    DTIC Science & Technology

    Contents: Risk analysis course schedule; Problems and perspectives - an introduction to a course of instruction in risk analysis; Analytical techniques; Overview of the process of risk analysis; Network analysis; RISCA: USALMC's network analyzer program; Case studies in risk analysis; Armored vehicle launched bridge (AVLB); Micom air defense missile warhead/fuze subsystem performance; Helicopter performance risk analysis; High performance fuze

  14. The dissection of risk: a conceptual analysis.

    PubMed

    O'Byrne, Patrick

    2008-03-01

    Recently, patient safety has gained popularity in the nursing literature. While this topic is used extensively and has been analyzed thoroughly, some of the concepts upon which it relies, such as risk, have remained undertheorized. In fact, despite its considerable use, the term 'risk' has largely been assumed to be inherently neutral: its definition and discovery are seen as objective and impartial, and risk avoidance as natural and logical. Such an oversight in evaluation requires that the concept of risk be thoroughly analyzed as it relates to nursing practices, particularly those surrounding bio-political nursing care, such as public health, as well as other more trendy nursing topics, such as patient safety. Thus, this paper applies the Evolutionary Model of concept analysis to explore 'risk' and to expose it as one mechanism of maintaining prescribed/proscribed social practices. An analysis of risk thereby expands the definitions and roles of the discipline and profession of nursing from being dedicated solely to patient care to also including its function as a governmental body that unwittingly maintains hegemonic infrastructures.

  15. Negative reinforcement in applied behavior analysis: an emerging technology.

    PubMed

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas suggests the emergence of an applied technology on negative reinforcement.

  16. Animal Research in the "Journal of Applied Behavior Analysis"

    ERIC Educational Resources Information Center

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  17. Negative Reinforcement in Applied Behavior Analysis: An Emerging Technology.

    ERIC Educational Resources Information Center

    Iwata, Brian A.

    1987-01-01

    The article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. Current research suggests the emergence of an applied technology on negative reinforcement.…

  18. B. F. Skinner's contributions to applied behavior analysis

    PubMed Central

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew from his science for application, his descriptions of possible applications, and his own applications to nonhuman and human behavior. Second, we found that he explicitly or implicitly addressed all seven dimensions of applied behavior analysis. These contributions and the dimensions notwithstanding, he neither incorporated the field's scientific (e.g., analytic) and social dimensions (e.g., applied) into any program of published research such that he was its originator, nor did he systematically integrate, advance, and promote the dimensions so as to have been its founder. As the founder of behavior analysis, however, he was the father of applied behavior analysis. PMID:22478444

  19. Putting problem formulation at the forefront of GMO risk analysis.

    PubMed

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups.

  20. Building a Better Model: A Comprehensive Breast Cancer Risk Model Incorporating Breast Density to Stratify Risk and Apply Resources

    DTIC Science & Technology

    2014-10-01

    Related publication: Comparison of Breast Density Measurements with a Mammographic Volumetric and Area Algorithm and Magnetic Resonance Imaging (O. Alonzo-Proulx, J.G. …). Building a Better Breast Cancer Risk Model Incorporating Breast Density To Stratify Risk and Apply Resources. Grant number W81XWH-11-1-0545; approved for public release, distribution unlimited. Abstract (purpose): development and validation of a personalized breast cancer risk model incorporating breast density to stratify risk and apply resources.

  1. Intelligent adversary risk analysis: a bioterrorism risk management model.

    PubMed

    Parnell, Gregory S; Smith, Christopher M; Moxley, Frederick I

    2010-01-01

    The tragic events of 9/11 and the concerns about the potential for a terrorist or hostile state attack with weapons of mass destruction have led to an increased emphasis on risk analysis for homeland security. Uncertain hazards (natural and engineering) have been successfully analyzed using probabilistic risk analysis (PRA). Unlike uncertain hazards, terrorists and hostile states are intelligent adversaries who can observe our vulnerabilities and dynamically adapt their plans and actions to achieve their objectives. This article compares uncertain hazard risk analysis with intelligent adversary risk analysis, describes the intelligent adversary risk analysis challenges, and presents a probabilistic defender-attacker-defender model to evaluate the baseline risk and the potential risk reduction provided by defender investments. The model includes defender decisions prior to an attack; attacker decisions during the attack; defender actions after an attack; and the uncertainties of attack implementation, detection, and consequences. The risk management model is demonstrated with an illustrative bioterrorism problem with notional data.
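
    A toy version of the defender-attacker-defender logic might look like the sketch below. The defense options, success probabilities, and consequences are notional numbers for illustration, not the article's model or data.

        # Defender picks an investment; the intelligent adversary observes it
        # and chooses the attack with the highest expected consequence; the
        # defender then picks the investment that minimizes that best response.
        P_SUCCESS = {                      # P(attack succeeds | defense, attack)
            ("baseline",  "agent_A"): 0.60, ("baseline",  "agent_B"): 0.40,
            ("detectors", "agent_A"): 0.25, ("detectors", "agent_B"): 0.35,
            ("stockpile", "agent_A"): 0.50, ("stockpile", "agent_B"): 0.15,
        }
        CONSEQUENCE = {"agent_A": 100.0, "agent_B": 60.0}   # notional loss units
        DEFENSES = ("baseline", "detectors", "stockpile")

        def attacker_best_response(defense):
            return max(CONSEQUENCE,
                       key=lambda a: P_SUCCESS[defense, a] * CONSEQUENCE[a])

        def conditional_risk(defense):
            attack = attacker_best_response(defense)
            return P_SUCCESS[defense, attack] * CONSEQUENCE[attack]

        for d in DEFENSES:
            print(f"{d:>9}: attacker plays {attacker_best_response(d)}, "
                  f"expected loss {conditional_risk(d):.1f}")
        print("defender invests in:", min(DEFENSES, key=conditional_risk))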

  2. Applying Systems Analysis to Program Failure in Organizations.

    ERIC Educational Resources Information Center

    Holt, Margaret E.; And Others

    1986-01-01

    Certain systems analysis techniques can be applied to examinations of program failure in continuing education to locate weaknesses in planning and implementing stages. Questions to guide an analysis and various procedures are recommended. Twelve issues that contribute to failures or discontinuations are identified. (Author/MLW)

  3. The Significance of Regional Analysis in Applied Geography.

    ERIC Educational Resources Information Center

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  4. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2003-01-01

    TD64, the Applied Fluid Dynamics Analysis Group, is one of several groups with high-fidelity fluids design and analysis expertise in the Space Transportation Directorate at Marshall Space Flight Center (MSFC). TD64 assists personnel working on other programs. The group participates in projects in the following areas: turbomachinery activities, nozzle activities, combustion devices, and the Columbia accident investigation.

  5. Evaluating the Risks of Clinical Research: Direct Comparative Analysis

    PubMed Central

    Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David

    2014-01-01

    Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed conceptual and normative analysis and an illustrative example. Results: Different risks are composed of the same basic elements: type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about

  6. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  8. Nano risk analysis: advancing the science for nanomaterials risk management.

    PubMed

    Shatkin, Jo Anne; Abbott, Linda Carolyn; Bradley, Ann E; Canady, Richard Alan; Guidotti, Tee; Kulinowski, Kristen M; Löfstedt, Ragnar E; Louis, Garrick; MacDonell, Margaret; Macdonell, Margaret; Maynard, Andrew D; Paoli, Greg; Sheremeta, Lorraine; Walker, Nigel; White, Ronald; Williams, Richard

    2010-11-01

    Scientists, activists, industry, and governments have raised concerns about health and environmental risks of nanoscale materials. The Society for Risk Analysis convened experts in September 2008 in Washington, DC to deliberate on issues relating to the unique attributes of nanoscale materials that raise novel concerns about health risks. This article reports on the overall themes and findings of the workshop, uncovering the underlying issues for each of these topics that become recurring themes. The attributes of nanoscale particles and other nanomaterials that present novel issues for risk analysis are evaluated in a risk analysis framework, identifying challenges and opportunities for risk analysts and others seeking to assess and manage the risks from emerging nanoscale materials and nanotechnologies. Workshop deliberations and recommendations for advancing the risk analysis and management of nanotechnologies are presented.

  9. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  10. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.

  11. Risk factors analysis of consecutive exotropia

    PubMed Central

    Gong, Qianwen; Wei, Hong; Zhou, Xu; Li, Ziyuan; Liu, Longqian

    2016-01-01

    To evaluate clinical factors associated with the onset of consecutive exotropia (XT) following esotropia surgery. Using a retrospective nested case-control design, we reviewed the medical records of 193 patients who had undergone initial esotropia surgery between 2008 and 2015 and had follow-up longer than 6 months. The probable risk factors were evaluated between groups 1 (consecutive XT) and 2 (non-consecutive exotropia). The Pearson chi-square test and Mann–Whitney U test were used for univariate analysis, and a conditional logistic regression model was applied to explore the potential risk factors of consecutive XT. Consecutive exotropia occurred in 23 (11.9%) of 193 patients. Patients who had undergone large bilateral medial rectus recession (BMR) (P = 0.017) had a high risk of developing consecutive XT. Oblique dysfunction (P = 0.001) and adduction limitation (P < 0.001) were associated with a high risk of consecutive XT, which was confirmed in the conditional logistic regression analysis. In addition, a large amount of BMR (6 mm or more) was associated with a higher incidence of adduction limitation (P = 0.045). The surgical methods and preoperative factors did not appear to influence the risk of developing consecutive XT (P > 0.05). The amount of surgery could be optimized to reduce the risk of consecutive XT. The presence of oblique overaction and postoperative adduction limitation may be associated with a high risk of consecutive XT, which may require close supervision and/or even earlier operative intervention. PMID:27977611
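
    The univariate tests named in the abstract can be sketched with SciPy as follows. The contingency table is hypothetical (chosen only to match the reported totals of 23 cases among 193 patients), and the recession amounts are invented for illustration.

        import numpy as np
        from scipy import stats

        # Chi-square test for a categorical factor (oblique dysfunction).
        #                 no consec. XT   consec. XT
        table = np.array([[15,             8],     # oblique dysfunction present
                          [155,            15]])   # oblique dysfunction absent
        chi2, p_chi, dof, _ = stats.chi2_contingency(table)
        print(f"chi-square: chi2 = {chi2:.2f}, p = {p_chi:.4f}")

        # Mann-Whitney U test for a continuous factor (BMR amount, mm).
        bmr_consec = np.array([6.0, 6.5, 5.5, 6.0, 6.5, 7.0])    # hypothetical
        bmr_other = np.array([4.5, 5.0, 4.0, 5.5, 5.0, 4.5, 5.0])
        u, p_u = stats.mannwhitneyu(bmr_consec, bmr_other)
        print(f"Mann-Whitney U = {u:.1f}, p = {p_u:.4f}")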

  12. Applying risk assessment models in non-surgical patients: effective risk stratification.

    PubMed

    Eldor, A

    1999-08-01

    Pulmonary embolism and deep vein thrombosis are serious complications of non-surgical patients, but scarcity of data documenting prophylaxis means antithrombotic therapy is rarely used. Prediction of risk is complicated by the variation in the medical conditions associated with venous thromboembolism (VTE), and lack of data defining risk in different groups. Accurate risk assessment is further confounded by inherited or acquired factors for VTE, additional risk due to medical interventions, and interactions between risk factors. Acquired and inherited risk factors may underlie thromboembolic complications in a range of conditions, including pregnancy, ischaemic stroke, myocardial infarction and cancer. Risk stratification may be feasible in non-surgical patients by considering individual risk factors and their cumulative effects. Current risk assessment models require expansion and modification to reflect emerging evidence in the non-surgical field. A large on-going study of prophylaxis with low-molecular-weight heparin in non-surgical patients will clarify our understanding of the components of risk, and assist in developing therapy recommendations.

  13. Environmentally sensitive patch index of desertification risk applied to the main habitats of Sicily

    NASA Astrophysics Data System (ADS)

    Duro, A.; Piccione, V.; Ragusa, M. A.; Rapicavoli, V.; Veneziano, V.

    2017-07-01

    The authors applied the MEDALUS (Mediterranean Desertification and Land Use) procedure to the Sicilian habitats that are most representative in terms of extent and socio-economic and environmental importance, in order to assess the risk of desertification. Using the ESPI (Environmentally Sensitive Patch Index), the authors estimate in this paper the current and future regional levels of desertification risk.

  14. Tsunamis: Global Exposure and Local Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.

    2014-12-01

    The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic (PTHA) methods are used. The resulting hazard levels for both methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed people live in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damages and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.

  15. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
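
    A minimal Python analogue of such a seismic risk computation is sketched below: the annual rate of exceeding a ground-motion level is obtained by integrating an attenuation relation over a magnitude recurrence distribution. The recurrence parameters and attenuation law here are notional stand-ins, not the program's published functions.

        import numpy as np
        from scipy.stats import norm

        def gr_pdf(m, b=1.0, m_min=5.0, m_max=8.0):
            """Truncated Gutenberg-Richter magnitude density."""
            beta = b * np.log(10.0)
            return (beta * np.exp(-beta * (m - m_min))
                    / (1.0 - np.exp(-beta * (m_max - m_min))))

        def median_pga(m, r_km):
            """Toy attenuation law: ln PGA(g) = -3.5 + 0.9 m - 1.2 ln(r + 10)."""
            return np.exp(-3.5 + 0.9 * m - 1.2 * np.log(r_km + 10.0))

        def exceedance_rate(a_g, r_km=30.0, nu=0.2, sigma_ln=0.6):
            """Annual rate of PGA > a_g for one source; nu = events/year."""
            m = np.linspace(5.0, 8.0, 400)
            # P(PGA > a | m): lognormal scatter about the attenuation median.
            p = norm.sf((np.log(a_g) - np.log(median_pga(m, r_km))) / sigma_ln)
            return nu * np.trapz(p * gr_pdf(m), m)

        print(f"rate of PGA > 0.1 g: {exceedance_rate(0.1):.4f} per year")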

  16. North energy system risk analysis features

    NASA Astrophysics Data System (ADS)

    Prokhorov, V. A.; Prokhorov, D. V.

    2015-12-01

    A risk indicator analysis for a decentralized energy system of the North was carried out. Based on an analysis of the damage caused by accidents in energy systems, a structure of risk indicators was selected, and a method for determining North energy system risk was proposed.

  17. Fuzzy Comprehensive Evaluation Method Applied in the Real Estate Investment Risks Research

    NASA Astrophysics Data System (ADS)

    Zhang, Minli; Yang, Wenpo

    Real estate investment is a high-risk, high-return economic activity, and the key to real estate analysis is identifying the types of investment risk and effectively preventing each of them. As the financial crisis swept the world, the real estate industry faced enormous risks, and how to evaluate real estate investment risks effectively and correctly has become a concern of many scholars [1]. In this paper, real estate investment risks are summarized and analyzed, comparative analysis methods are discussed, and a fuzzy comprehensive evaluation method is finally presented. The method is not only theoretically sound but also reliable in application, providing investors with an effective means of assessing risk factors and forecasts in real estate investment.
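
    A minimal sketch of the fuzzy comprehensive evaluation step itself: a weight vector over risk factors is combined with a membership matrix over evaluation grades. The factors, weights, and membership values below are hypothetical.

        import numpy as np

        # W: weights of three illustrative risk factors; each row of R gives a
        # factor's membership in the grades (low, medium, high risk).
        W = np.array([0.40, 0.35, 0.25])        # market, financial, policy risk
        R = np.array([[0.2, 0.5, 0.3],          # market risk memberships
                      [0.1, 0.4, 0.5],          # financial risk memberships
                      [0.6, 0.3, 0.1]])         # policy risk memberships

        B = W @ R                               # weighted-average operator M(., +)
        B = B / B.sum()                         # normalized evaluation vector
        grades = ["low", "medium", "high"]
        print(dict(zip(grades, np.round(B, 3))))
        print("overall grade:", grades[int(np.argmax(B))])  # maximum membership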

  18. Applied behavior analysis: New directions from the laboratory

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1983-01-01

    Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior, which occurs as a by-product of contingencies of reinforcement, is discussed. Possible difficulties in the treatment and management of induced behaviors are considered. Next, the correlation-based law of effect and the implications of relative reinforcement are explored in terms of applied examples. Relative rate of reinforcement is then extended to the literature dealing with concurrent operants. Concurrent operant models may describe human behavior of applied importance, and several techniques for the modification of problem behavior are suggested. As a final concern, the paper discusses several new paradigms. While the practical importance of these models is not clear at the moment, it may be that new practical advantages will soon arise. Thus, it is argued that basic research continues to be of theoretical and practical importance to applied behavior analysis. PMID:22478574

  19. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2002-01-01

    This viewgraph report presents an overview of activities and accomplishments of NASA's Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group. Expertise in this group focuses on high-fidelity fluids design and analysis with application to space shuttle propulsion and next generation launch technologies. Topics covered include: computational fluid dynamics research and goals, turbomachinery research and activities, nozzle research and activities, combustion devices, engine systems, MDA development and CFD process improvements.

  20. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Wang, Tee-See; Griffin, Lisa; Turner, James E. (Technical Monitor)

    2001-01-01

    This document is a presentation graphic which reviews the activities of the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center (i.e., Code TD64). The work of this group focuses on supporting the space transportation programs through Computational Fluid Dynamics tool development, driven by hardware design needs. The major applications for the design and analysis tools are: turbines, pumps, propulsion-to-airframe integration, and combustion devices.

  1. Animal research in the Journal of Applied Behavior Analysis.

    PubMed

    Edwards, Timothy L; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say-do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by a chimpanzee, and unsafe trailer entry by horses) in ways that benefited the animals and the people in charge of them, and 1 described the use of trained rats that performed a service to humans (land-mine detection). We suggest that each of these general research areas merits further attention and that the Journal of Applied Behavior Analysis is an appropriate outlet for some of these publications.

  2. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational...

  3. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational...

  4. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational...

  5. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  6. The spread of behavior analysis to the applied fields 1

    PubMed Central

    Fraley, Lawrence E.

    1981-01-01

    This paper reviews the status of applied behavioral science as it exists in the various behavioral fields and considers the role of the Association for Behavior Analysis in serving those fields. The confounding effects of the traditions of psychology are discussed. Relevant issues are exemplified in the fields of law, communications, psychology, and education, but broader generalization is implied. PMID:22478537

  8. Three Years of Intensive Applied Behavior Analysis: A Case Study

    ERIC Educational Resources Information Center

    Healy, Olive; O'Connor, Jennifer; Leader, Geraldine; Kenny, Neil

    2008-01-01

    A child aged 2 years and 10 months receiving an early intensive teaching program using the Comprehensive Application of Behavior Analysis to Schooling (CABAS[R]) was taught using evidence-based teaching strategies and curricula based on existing research in both the applied and basic sciences. Progress was measured using behavioral assessment tools…

  9. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2004-01-01

    This paper presents viewgraphs on NASA Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group Activities. The topics include: 1) Status of programs at MSFC; 2) Fluid Mechanics at MSFC; 3) Relevant Fluid Dynamics Activities at MSFC; and 4) Shuttle Return to Flight.

  10. Applied Behavior Analysis: Current Myths in Public Education

    ERIC Educational Resources Information Center

    Fielding, Cheryl; Lowdermilk, John; Lanier, Lauren L.; Fannin, Abigail G.; Schkade, Jennifer L.; Rose, Chad A.; Simpson, Cynthia G.

    2013-01-01

    The effective use of behavior management strategies and related policies continues to be a debated issue in public education. Despite overwhelming evidence espousing the benefits of the implementation of procedures derived from principles based on the science of applied behavior analysis (ABA), educators often indicate many common misconceptions…

  11. Positive Behavior Support and Applied Behavior Analysis: A Familial Alliance

    ERIC Educational Resources Information Center

    Dunlap, Glen; Carr, Edward G.; Horner, Robert H.; Zarcone, Jennifer R.; Schwartz, Ilene

    2008-01-01

    Positive behavior support (PBS) emerged in the mid-1980s as an approach for understanding and addressing problem behaviors. PBS was derived primarily from applied behavior analysis (ABA). Over time, however, PBS research and practice has incorporated evaluative methods, assessment and intervention procedures, and conceptual perspectives associated…

  13. B. F. Skinner's Contributions to Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew…

  14. Progressive-Ratio Schedules and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  15. Applying programmatic risk assessment to nuclear materials stabilization R and D planning

    SciTech Connect

    Brown-Van Hoozer, S.A.; Kenley, C.R.

    1997-10-01

    A systems engineering approach to programmatic risk assessment, derived from the aerospace industry, was applied to various stabilization technologies to assess their relative maturity and availability for use in stabilizing nuclear materials. The assessment provided valuable information for trading off available technologies and identified the at-risk technologies that will require close tracking by the Department of Energy (DOE) to mitigate programmatic risks. This paper presents the programmatic risk assessment methodology developed for the 1995 R and D Plan and updated for the 1996 R and D Plan. Results of the 1996 assessment also are presented (DOE/ID-10561, 1996).

  16. The possibilities of applying a risk-oriented approach to the NPP reliability and safety enhancement problem

    NASA Astrophysics Data System (ADS)

    Komarov, Yu. A.

    2014-10-01

    An analysis and some generalizations of approaches to risk assessments are presented. Interconnection between different interpretations of the "risk" notion is shown, and the possibility of applying the fuzzy set theory to risk assessments is demonstrated. A generalized formulation of the risk assessment notion is proposed in applying risk-oriented approaches to the problem of enhancing reliability and safety in nuclear power engineering. The solution of problems using the developed risk-oriented approaches aimed at achieving more reliable and safe operation of NPPs is described. The results of studies aimed at determining the need (advisability) to modernize/replace NPP elements and systems are presented together with the results obtained from elaborating the methodical principles of introducing the repair concept based on the equipment technical state. The possibility of reducing the scope of tests and altering the NPP systems maintenance strategy is substantiated using the risk-oriented approach. A probabilistic model for estimating the validity of boric acid concentration measurements is developed.

  17. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    SciTech Connect

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-19

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a high-risk item of equipment in the power plant. Corrosion damage can cause the HRSG, and thus the power plant, to stop operating; furthermore, it can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for the risk analysis of HRSG 1. Using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for the process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the semi-quantitative risk assessment of standard API 581 place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The risk evaluation was carried out with the aim of reducing risk by optimizing the risk assessment activities.

  18. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a high-risk item of equipment in the power plant. Corrosion damage can cause the HRSG, and thus the power plant, to stop operating; furthermore, it can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for the risk analysis of HRSG 1. Using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for the process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the semi-quantitative risk assessment of standard API 581 place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The risk evaluation was carried out with the aim of reducing risk by optimizing the risk assessment activities.
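
    The semi-quantitative ranking can be illustrated with a simple probability-consequence matrix. The banding rule below is a simplified stand-in for illustration only, not the official API 581 matrix; only the equipment categories (4C, 4C, 3C) come from the abstract.

        # Map a probability category (1-5) and consequence category (A-E)
        # to a qualitative risk band via a combined index (illustrative rule).
        def risk_band(prob_cat: int, cons_cat: str) -> str:
            score = prob_cat + "ABCDE".index(cons_cat)      # combined 1..9 index
            if score <= 3:
                return "low"
            if score <= 5:
                return "medium"
            if score <= 7:
                return "medium-high"
            return "high"

        EQUIPMENT = {
            "HP superheater": (4, "C"),
            "HP evaporator":  (4, "C"),
            "HP economizer":  (3, "C"),
        }
        for item, (p, c) in EQUIPMENT.items():
            print(f"{item}: {p}{c} -> {risk_band(p, c)}")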

  19. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  20. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. We define valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
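
    A minimal example of the propagation step such an analysis relies on, here a root-sum-square combination of standard uncertainties for a PV power measurement P = V * I; the voltage, current, and uncertainty values are hypothetical, not from the presentation.

        import numpy as np

        V, u_V = 35.0, 0.15          # module voltage (V) and its std. uncertainty
        I, u_I = 8.2, 0.04           # module current (A) and its std. uncertainty

        P = V * I
        # Sensitivity coefficients dP/dV = I and dP/dI = V, combined in quadrature.
        u_P = np.sqrt((I * u_V) ** 2 + (V * u_I) ** 2)
        print(f"P = {P:.1f} W +/- {2 * u_P:.1f} W (coverage factor k = 2)")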

  1. General Risk Analysis Methodological Implications to Explosives Risk Management Systems,

    DTIC Science & Technology

    An investigation sponsored by the National Science Foundation has produced as one of its results a survey and evaluation of risk analysis methodologies... This paper presents some implications of the survey for risk analysis and decision making for explosives hazards such as may ultimately be

  2. Recent reinforcement-schedule research and applied behavior analysis

    PubMed Central

    Lattal, Kennon A.; Neef, Nancy A.

    1996-01-01

    Reinforcement schedules are considered in relation to applied behavior analysis by examining several recent laboratory experiments with humans and other animals. The experiments are drawn from three areas of contemporary schedule research: behavioral history effects on schedule performance, the role of instructions in schedule performance of humans, and dynamic schedules of reinforcement. All of the experiments are discussed in relation to the role of behavioral history in current schedule performance. The paper concludes by extracting from the experiments some more general issues concerning reinforcement schedules in applied research and practice. PMID:16795888

  3. Validation in principal components analysis applied to EEG data.

    PubMed

    Costa, João Carlos G D; Da-Silva, Paulo José G; Almeida, Renan Moritz V R; Infantosi, Antonio Fernando C

    2014-01-01

    The well-known multivariate technique Principal Components Analysis (PCA) is usually applied to a sample, and so component scores are subjected to sampling variability. However, few studies address their stability, an important topic when the sample size is small. This work presents three validation procedures applied to PCA, based on confidence regions generated by a variant of a nonparametric bootstrap called the partial bootstrap: (i) the assessment of PC scores variability by the spread and overlapping of "confidence regions" plotted around these scores; (ii) the use of the confidence regions centroids as a validation set; and (iii) the definition of the number of nontrivial axes to be retained for analysis. The methods were applied to EEG data collected during a postural control protocol with twenty-four volunteers. Two axes were retained for analysis, with 91.6% of explained variance. Results showed that the area of the confidence regions provided useful insights on the variability of scores and suggested that some subjects were not distinguishable from others, which was not evident from the principal planes. In addition, potential outliers, initially suggested by an analysis of the first principal plane, could not be confirmed by the confidence regions.
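
    The partial-bootstrap idea, projecting replicate data sets onto the axes of the original PCA rather than re-rotating, can be sketched in a few lines. The code below uses synthetic stand-ins for the EEG features and a residual-resampling variant of the replication step, so it illustrates the general mechanism rather than the paper's exact procedure.

      import numpy as np

      rng = np.random.default_rng(42)
      n, p = 24, 6                          # 24 volunteers (as in the study) x 6 features (assumed)
      X = rng.normal(size=(n, p)) @ rng.normal(size=(p, p))
      Xc = X - X.mean(axis=0)

      U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
      axes = Vt[:2].T                       # first two PCs, kept fixed
      scores = Xc @ axes
      explained = (s[:2] ** 2).sum() / (s ** 2).sum()

      fitted = scores @ axes.T              # rank-2 reconstruction
      resid = Xc - fitted
      B = 1000
      boot = np.empty((B, n, 2))
      for b in range(B):
          Xb = fitted + resid[rng.integers(0, n, n)]   # resample residuals
          boot[b] = Xb @ axes               # project onto the fixed axes

      lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
      print(f"explained variance (2 PCs): {explained:.1%}")
      print("subject 0, PC1 95% interval:", np.round([lo[0, 0], hi[0, 0]], 2))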

  4. Starlink corn: a risk analysis.

    PubMed Central

    Bucchini, Luca; Goldman, Lynn R

    2002-01-01

    Modern biotechnology has dramatically increased our ability to alter the agronomic traits of plants. Among the novel traits that biotechnology has made available, an important group includes Bacillus thuringiensis-derived insect resistance. This technology has been applied to potatoes, cotton, and corn. Benefits of Bt crops, and biotechnology generally, can be realized only if risks are assessed and managed properly. The case of Starlink corn, a plant modified with a gene that encodes the Bt protein Cry9c, was a severe test of U.S. regulatory agencies. The U.S. Environmental Protection Agency had restricted its use to animal feed due to concern about the potential for allergenicity. However, Starlink corn was later found throughout the human food supply, resulting in food recalls by the Food and Drug Administration and significant disruption of the food supply. Here we examine the regulatory history of Starlink, the assessment framework employed by the U.S. government, assumptions and information gaps, and the key elements of government efforts to manage the product. We explore the impacts on regulations, science, and society and conclude that only significant advances in our understanding of food allergies and improvements in monitoring and enforcement will avoid similar events in the future. Specifically, we need to develop a stronger fundamental basis for predicting allergic sensitization and reactions if novel proteins are to be introduced in this fashion. Mechanisms are needed to assure that worker and community aeroallergen risks are considered. Requirements are needed for the development of valid assays so that enforcement and post market surveillance activities can be conducted. PMID:11781159

  5. Starlink corn: a risk analysis.

    PubMed

    Bucchini, Luca; Goldman, Lynn R

    2002-01-01

    Modern biotechnology has dramatically increased our ability to alter the agronomic traits of plants. Among the novel traits that biotechnology has made available, an important group includes Bacillus thuringiensis-derived insect resistance. This technology has been applied to potatoes, cotton, and corn. Benefits of Bt crops, and biotechnology generally, can be realized only if risks are assessed and managed properly. The case of Starlink corn, a plant modified with a gene that encodes the Bt protein Cry9c, was a severe test of U.S. regulatory agencies. The U.S. Environmental Protection Agency had restricted its use to animal feed due to concern about the potential for allergenicity. However, Starlink corn was later found throughout the human food supply, resulting in food recalls by the Food and Drug Administration and significant disruption of the food supply. Here we examine the regulatory history of Starlink, the assessment framework employed by the U.S. government, assumptions and information gaps, and the key elements of government efforts to manage the product. We explore the impacts on regulations, science, and society and conclude that only significant advances in our understanding of food allergies and improvements in monitoring and enforcement will avoid similar events in the future. Specifically, we need to develop a stronger fundamental basis for predicting allergic sensitization and reactions if novel proteins are to be introduced in this fashion. Mechanisms are needed to assure that worker and community aeroallergen risks are considered. Requirements are needed for the development of valid assays so that enforcement and post market surveillance activities can be conducted.

  6. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    PubMed

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015. The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists.

  7. RAMS (Risk Analysis - Modular System) methodology

    SciTech Connect

    Stenner, R.D.; Strenge, D.L.; Buck, J.W.

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and "what if" questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  8. Developing an interdisciplinary master's program in applied behavior analysis

    PubMed Central

    Lowenkron, Barry; Mitchell, Lynda

    1995-01-01

    At many universities, faculty interested in behavior analysis are spread across disciplines. This makes difficult the development of behavior-analytically oriented programs, and impedes regular contact among colleagues who share common interests. However, this separation by disciplines can be a source of strength if it is used to develop interdisciplinary programs. In this article we describe how a bottom-up strategy was used to develop two complementary interdisciplinary MS programs in applied behavior analysis, and conclude with a description of the benefits—some obvious, some surprising—that can emerge from the development of such programs. PMID:22478230

  9. Concepts in critical thinking applied to caries risk assessment in dental education.

    PubMed

    Guzman-Armstrong, Sandra; Warren, John J; Cunningham-Ford, Marsha A; von Bergmann, HsingChi; Johnsen, David C

    2014-06-01

    Much progress has been made in the science of caries risk assessment and ways to analyze caries risk, yet dental education has seen little movement toward the development of frameworks to guide learning and assess critical thinking in caries risk assessment. In the absence of previous proactive implementation of a learning framework that takes the knowledge of caries risk and critically applies it to the patient with the succinctness demanded in the clinical setting, the purpose of this study was to develop a model learning framework that combines the science of caries risk assessment with principles of critical thinking from the education literature. This article also describes the implementation of that model at one dental school and presents some preliminary assessment data.

  10. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.

  11. [Failure mode effect analysis applied to preparation of intravenous cytostatics].

    PubMed

    Santos-Rubio, M D; Marín-Gil, R; Muñoz-de la Corte, R; Velázquez-López, M D; Gil-Navarro, M V; Bautista-Paloma, F J

    2016-01-01

    To proactively identify risks in the preparation of intravenous cytostatic drugs, and to prioritise and establish measures to improve safety procedures. Failure Mode Effect Analysis methodology was used. A multidisciplinary team identified potential failure modes of the procedure through a brainstorming session. The impact associated with each failure mode was assessed with the Risk Priority Number (RPN), which involves three variables: occurrence, severity, and detectability. Improvement measures were established for all identified failure modes, with those with RPN>100 considered critical. The final RPN (theoretical) that would result from the proposed measures was also calculated and the process was redesigned. A total of 34 failure modes were identified. The initial accumulated RPN was 3022 (range: 3-252), and after the recommended actions the final RPN was 1292 (range: 3-189). RPN scores >100 were obtained in 13 failure modes; only the dispensing sub-process was free of critical points (RPN>100). A final reduction of RPN>50% was achieved in 9 failure modes. This prospective risk analysis methodology allows the weaknesses of the procedure to be prioritised, optimises the use of resources, and substantially improves the safety of the preparation of cytostatic drugs through the introduction of double checking and intermediate product labelling. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.
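
    The RPN arithmetic that drives the prioritisation is simple enough to show directly. In this sketch the failure modes and their occurrence/severity/detectability scores are invented; the only element taken from the abstract is the RPN > 100 threshold for critical modes.

      failure_modes = {                    # name: (occurrence, severity, detectability)
          "wrong diluent selected":    (4, 9, 5),
          "mislabelled final product": (3, 8, 6),
          "dose calculation error":    (2, 10, 4),
      }

      for name, (occ, sev, det) in sorted(failure_modes.items(),
                                          key=lambda kv: -(kv[1][0] * kv[1][1] * kv[1][2])):
          rpn = occ * sev * det            # RPN = O x S x D
          flag = "CRITICAL" if rpn > 100 else "ok"
          print(f"{name:28s} RPN={rpn:3d} {flag}")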

  12. Applied Examples of Screening Students at Risk of Emotional and Behavioral Disabilities

    ERIC Educational Resources Information Center

    Pierce, Corey D.; Nordness, Philip D.; Epstein, Michael H.; Cullinan, Douglas

    2016-01-01

    Early identification of student behavioral needs allows educators the opportunity to apply appropriate interventions before negative behaviors become more intensive and persistent. A variety of screening tools are available to identify which students are at risk for persistent behavior problems in school. This article provides two examples in…

  13. Two-, Three-, and Four-Factor PCL-R Models in Applied Sex Offender Risk Assessments

    ERIC Educational Resources Information Center

    Weaver, Christopher M.; Meyer, Robert G.; Van Nort, James J.; Tristan, Luciano

    2006-01-01

    The authors compared 2-, 3-, 4-factor, and 2-factor/4-facet Psychopathy Checklist-Revised (PCL-R) models in a previously unpublished sample of 1,566 adult male sex offenders assessed under applied clinical conditions as part of a comprehensive state-mandated community notification risk assessment procedure. "Testlets" significantly…

  14. Two-, Three-, and Four-Factor PCL-R Models in Applied Sex Offender Risk Assessments

    ERIC Educational Resources Information Center

    Weaver, Christopher M.; Meyer, Robert G.; Van Nort, James J.; Tristan, Luciano

    2006-01-01

    The authors compared 2-, 3-, 4-factor, and 2-factor/4-facet Psychopathy Checklist-Revised (PCL-R) models in a previously unpublished sample of 1,566 adult male sex offenders assessed under applied clinical conditions as part of a comprehensive state-mandated community notification risk assessment procedure. "Testlets" significantly…

  15. Analytic concepts for assessing risk as applied to human space flight

    SciTech Connect

    Garrick, B.J.

    1997-04-30

    Quantitative risk assessment (QRA) principles provide an effective framework for quantifying individual elements of risk, including the risk to astronauts and spacecraft of the radiation environment of space flight. The concept of QRA is based on a structured set of scenarios that could lead to different damage states initiated by either hardware failure, human error, or external events. In the context of a spacecraft risk assessment, radiation may be considered as an external event and analyzed in the same basic way as any other contributor to risk. It is possible to turn up the microscope on any particular contributor to risk and ask more detailed questions than might be necessary to simply assess safety. The methods of QRA allow for as much fine structure in the analysis as is desired. For the purpose of developing a basis for comprehensive risk management and considering the tendency to "fear anything nuclear," radiation risk is a prime candidate for examination beyond that necessary to answer the basic question of risk. Thus, rather than considering only the customary damage states of fatalities or loss of a spacecraft, it is suggested that the full range of damage be analyzed to quantify radiation risk. Radiation dose levels in the form of a risk curve accomplish such a result. If the risk curve is the complementary cumulative distribution function, then it answers the extended question of what is the likelihood of receiving a specific dose of radiation or greater. Such results can be converted to specific health effects as desired. Knowing the full range of the radiation risk of a space mission and the contributors to that risk provides the information necessary to take risk management actions [operational, design, scheduling of missions around solar particle events (SPE), etc.] that clearly control radiation exposure.
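
    The risk curve described here is simply the complementary cumulative distribution function of dose. A minimal sketch, assuming an illustrative lognormal dose distribution:

      import numpy as np

      rng = np.random.default_rng(11)
      doses = rng.lognormal(mean=np.log(50), sigma=0.6, size=100_000)  # mSv, assumed

      # CCDF: the likelihood of receiving a specific dose or greater.
      for d in (25, 50, 100, 200):
          print(f"P(dose >= {d:3d} mSv) = {(doses >= d).mean():.3f}")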

  16. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
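
    The deterministic-to-probabilistic shift amounts to sampling the uncertain inputs and propagating each sample through the power model. The sketch below uses a toy solar-array model with invented uncertainty ranges; it is not the SPACE model, only the Monte Carlo pattern the abstract describes.

      import numpy as np

      rng = np.random.default_rng(7)
      N = 100_000

      area = 892.0                              # m2, fixed design value (assumed)
      efficiency = rng.normal(0.145, 0.005, N)  # cell efficiency, uncertain
      irradiance = rng.normal(1367.0, 5.0, N)   # W/m2, solar-constant spread
      degradation = rng.uniform(0.95, 1.00, N)  # aging factor, uncertain
      losses = rng.normal(0.88, 0.02, N)        # conversion/distribution losses

      power = area * efficiency * irradiance * degradation * losses / 1000  # kW
      print(f"mean capability: {power.mean():.0f} kW")
      print(f"5th percentile:  {np.percentile(power, 5):.0f} kW")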

  17. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  18. Differential item functioning analysis by applying multiple comparison procedures.

    PubMed

    Eusebi, Paolo; Kreiner, Svend

    2015-01-01

    Analysis within a Rasch measurement framework aims at the development of valid and objective test scores. One requirement of both validity and objectivity is that items do not show evidence of differential item functioning (DIF). A number of procedures exist for the assessment of DIF, including those based on analysis of contingency tables by Mantel-Haenszel tests and partial gamma coefficients. The aim of this paper is to illustrate Multiple Comparison Procedures (MCP) for the analysis of DIF relative to a variable defining a very large number of groups, with an unclear ordering with respect to the DIF effect. We propose a single-step procedure controlling the false discovery rate for DIF detection. The procedure applies to both dichotomous and polytomous items. In addition to providing evidence against a hypothesis of no DIF, the procedure also provides information on subsets of groups that are homogeneous with respect to the DIF effect. A stepwise MCP procedure for this purpose is also introduced.
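
    The false-discovery-rate control step can be illustrated with the standard Benjamini-Hochberg procedure over per-group p-values. The p-values below are invented; in the paper they would come from Mantel-Haenszel tests or partial gamma coefficients computed per group.

      import numpy as np

      def benjamini_hochberg(pvals, q=0.05):
          """Return a boolean mask of hypotheses rejected at FDR level q."""
          p = np.asarray(pvals)
          order = np.argsort(p)
          m = len(p)
          passed = p[order] <= q * np.arange(1, m + 1) / m
          k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
          rejected = np.zeros(m, dtype=bool)
          rejected[order[:k]] = True
          return rejected

      pvals = [0.001, 0.008, 0.04, 0.06, 0.2, 0.5, 0.9]
      print(benjamini_hochberg(pvals))   # True where DIF is flagged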

  19. Predicting pathogen transport and risk of infection from land-applied biosolids

    NASA Astrophysics Data System (ADS)

    Olson, M. S.; Teng, J.; Kumar, A.; Gurian, P.

    2011-12-01

    Biosolids have been recycled as fertilizer to sustainably improve and maintain productive soils and to stimulate plant growth for over forty years, but may contain low levels of microbial pathogens. The Spreadsheet Microbial Assessment of Risk: Tool for Biosolids ("SMART Biosolids") is an environmental transport, exposure and risk model that compiles knowledge on the occurrence, environmental dispersion and attenuation of biosolids-associated pathogens to estimate microbial risk from biosolids land application. The SMART Biosolids model calculates environmental pathogen concentrations and assesses risk associated with exposure to pathogens from land-applied biosolids through five pathways: 1) inhalation of aerosols from land application sites, 2) consumption of groundwater contaminated by land-applied biosolids, 3) direct ingestion of biosolids-amended soils, 4) ingestion of plants contaminated by land-applied biosolids, and 5) consumption of surface water contaminated by runoff from a land application site. The SMART Biosolids model can be applied under a variety of scenarios, thereby providing insight into effective management practices. This study presents example results of the SMART Biosolids model, focusing on the groundwater and surface water pathways, following biosolids application to a typical site in Michigan. Volumes of infiltration and surface water runoff are calculated following a 100-year storm event. Pathogen transport and attenuation through the subsurface and via surface runoff are modeled, and pathogen concentrations in a downstream well and an adjacent pond are calculated. Risks are calculated for residents of nearby properties. For a 100-year storm event occurring immediately after biosolids application, the surface water pathway produces risks that may be of some concern, but best estimates do not exceed the bounds of what has been considered acceptable risk for recreational water use (Table 1); groundwater risks are very uncertain and at the

  20. Activity anorexia: An interplay between basic and applied behavior analysis

    PubMed Central

    Pierce, W. David; Epling, W. Frank; Dews, Peter B.; Estes, William K.; Morse, William H.; Van Orman, Willard; Herrnstein, Richard J.

    1994-01-01

    The relationship between basic research with nonhumans and applied behavior analysis is illustrated by our work on activity anorexia. When rats are fed one meal a day and allowed to run on an activity wheel, they run excessively, stop eating, and die of starvation. Convergent evidence, from several different research areas, indicates that the behavior of these animals and humans who self-starve is functionally similar. A biobehavioral theory of activity anorexia is presented that details the cultural contingencies, behavioral processes, and physiology of anorexia. Diagnostic criteria and a three-stage treatment program for activity-based anorexia are outlined. The animal model permits basic research on anorexia that for practical and ethical reasons cannot be conducted with humans. Thus, basic research can have applied importance. PMID:22478169

  1. Activity anorexia: An interplay between basic and applied behavior analysis.

    PubMed

    Pierce, W D; Epling, W F; Dews, P B; Estes, W K; Morse, W H; Van Orman, W; Herrnstein, R J

    1994-01-01

    The relationship between basic research with nonhumans and applied behavior analysis is illustrated by our work on activity anorexia. When rats are fed one meal a day and allowed to run on an activity wheel, they run excessively, stop eating, and die of starvation. Convergent evidence, from several different research areas, indicates that the behavior of these animals and humans who self-starve is functionally similar. A biobehavioral theory of activity anorexia is presented that details the cultural contingencies, behavioral processes, and physiology of anorexia. Diagnostic criteria and a three-stage treatment program for activity-based anorexia are outlined. The animal model permits basic research on anorexia that for practical and ethical reasons cannot be conducted with humans. Thus, basic research can have applied importance.

  2. Revealing the underlying drivers of disaster risk: a global analysis

    NASA Astrophysics Data System (ADS)

    Peduzzi, Pascal

    2017-04-01

    Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change and how this may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scale. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied on mortality risk in GAR 2009 and GAR 2011. New models ranging from global assets exposure and global flood hazard models were also recently developed to improve the resolution of the risk analysis and applied through CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL

  3. Sensitivity analysis in quantitative microbial risk assessment.

    PubMed

    Zwietering, M H; van Gerwen, S J

    2000-07-15

    The occurrence of foodborne disease remains a widespread problem in both the developing and the developed world. A systematic and quantitative evaluation of food safety is important to control the risk of foodborne diseases. World-wide, many initiatives are being taken to develop quantitative risk analysis. However, the quantitative evaluation of food safety in all its aspects is very complex, especially since in many cases specific parameter values are not available. Often many variables have large statistical variability while the quantitative effect of various phenomena is unknown. Therefore, sensitivity analysis can be a useful tool to determine the main risk-determining phenomena, as well as the aspects that mainly determine the inaccuracy in the risk estimate. This paper presents three stages of sensitivity analysis. First, deterministic analysis selects the most relevant determinants for risk. Overlooking of exceptional, but relevant cases is prevented by a second, worst-case analysis. This analysis finds relevant process steps in worst-case situations, and shows the relevance of variations of factors for risk. The third, stochastic analysis, studies the effects of variations of factors for the variability of risk estimates. Care must be taken that the assumptions made as well as the results are clearly communicated. Stochastic risk estimates are, like deterministic ones, just as good (or bad) as the available data, and the stochastic analysis must not be used to mask lack of information. Sensitivity analysis is a valuable tool in quantitative risk assessment by determining critical aspects and effects of variations.
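
    The deterministic (one-at-a-time) stage can be sketched with a toy risk chain: each factor is halved and doubled while the others stay at baseline, and the resulting spread shows which factors dominate. The model and all parameter values below are illustrative assumptions.

      import math

      base = dict(r=0.005,      # dose-response parameter
                  C=100.0,      # pathogens per gram before processing
                  V=25.0,       # grams consumed per serving
                  log_red=3.0)  # log10 reduction achieved by processing

      def risk(p):
          dose = p["C"] * p["V"] * 10 ** (-p["log_red"])
          return 1 - math.exp(-p["r"] * dose)

      print(f"baseline risk: {risk(base):.2e}")
      for name in base:
          for factor in (0.5, 2.0):
              p = dict(base, **{name: base[name] * factor})
              print(f"  {name} x{factor}: risk {risk(p):.2e}")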

  4. Cladistic analysis applied to the classification of volcanoes

    NASA Astrophysics Data System (ADS)

    Hone, D. W. E.; Mahony, S. H.; Sparks, R. S. J.; Martin, K. T.

    2007-11-01

    Cladistics is a systematic method of classification that groups entities on the basis of sharing similar characteristics in the most parsimonious manner. Here cladistics is applied to the classification of volcanoes using a dataset of 59 Quaternary volcanoes and 129 volcanic edifices of the Tohoku region, Northeast Japan. Volcano and edifice characteristics recorded in the database include attributes of volcano size, chemical composition, dominant eruptive products, volcano morphology, dominant landforms, volcano age and eruptive history. Without characteristics related to time the volcanic edifices divide into two groups, with characters related to volcano size, dominant composition and edifice morphology being the most diagnostic. Analysis including time-based characteristics yields four groups, with a good correlation between these groups and the two groups from the analysis without time for 108 out of 129 volcanic edifices. Thus when characters are slightly changed the volcanoes still form similar groupings. Analysis of the volcanoes both with and without time yields three groups based on compositional, eruptive products and morphological characters. Spatial clusters of volcanic centres have been recognised in the Tohoku region by Tamura et al. (Earth Planet Sci Lett 197:105-116, 2002). The groups identified by cladistic analysis are distributed unevenly between the clusters, indicating a tendency for individual clusters to form similar kinds of volcanoes with distinctive but coherent styles of volcanism. Uneven distribution of volcano types between clusters can be explained by variations in dominant magma compositions through time, which are reflected in eruption products and volcanic landforms. Cladistic analysis can be a useful tool for elucidating dynamic igneous processes that could be applied to other regions and globally. Our exploratory study indicates that cladistics has promise as a method for classifying volcanoes and potentially elucidating dynamic

  5. Dynamic Blowout Risk Analysis Using Loss Functions.

    PubMed

    Abimbola, Majeed; Khan, Faisal

    2017-08-11

    Most risk analysis approaches are static, failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.
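
    The core construction, risk as an evolving probability times an operational loss function, can be sketched with toy models. The logistic kick-probability model, the quadratic loss, and every number below are illustrative assumptions, not the paper's calibrated functions.

      import math

      def kick_probability(bhp, pore_pressure, k=0.05):
          # Assumed logistic model: kick likelihood rises as bottom-hole
          # pressure falls below the formation pore pressure.
          return 1 / (1 + math.exp(k * (bhp - pore_pressure)))

      def loss(bhp, target=5000.0, scale=1e6):
          # Assumed quadratic loss ($) for deviating from the target pressure.
          return scale * ((bhp - target) / target) ** 2

      pore = 4800.0
      for bhp in (5200, 5000, 4900, 4750):      # psi, as drilling progresses
          r = kick_probability(bhp, pore) * loss(bhp)
          print(f"BHP={bhp} psi  expected loss=${r:,.0f}")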

  6. Synchronisation and coupling analysis: applied cardiovascular physics in sleep medicine.

    PubMed

    Wessel, Niels; Riedl, Maik; Kramer, Jan; Muller, Andreas; Penzel, Thomas; Kurths, Jurgen

    2013-01-01

    Sleep is a physiological process with an internal program of a number of well defined sleep stages and intermediate wakefulness periods. The sleep stages modulate the autonomic nervous system and are thereby accompanied by different regulation regimes for the cardiovascular and respiratory systems. The differences in regulation can be distinguished by new techniques of cardiovascular physics. The number of patients suffering from sleep disorders increases disproportionately with population growth and aging, leading to very high expenses in the public health system. Therefore, the challenge of cardiovascular physics is to develop highly sophisticated methods which are able, on the one hand, to supplement and replace expensive medical devices and, on the other hand, to improve medical diagnostics while decreasing the patient's risk. Methods of cardiovascular physics are used to analyze heart rate, blood pressure and respiration to detect changes of the autonomic nervous system in different diseases. Data-driven modeling analysis, synchronization and coupling analysis and their applications to biosignals in healthy subjects and patients with different sleep disorders are presented. Newly derived methods of cardiovascular physics can help to find indicators for these health risks.

  7. Sensitivity analysis applied to stalled airfoil wake and steady control

    NASA Astrophysics Data System (ADS)

    Patino, Gustavo; Gioria, Rafael; Meneghini, Julio

    2014-11-01

    The sensitivity of an eigenvalue to base flow modifications induced by an external force is applied to the global unstable modes associated with the onset of vortex shedding in the wake of a stalled airfoil. In this work, the flow regime is close to the first instability of the system and its associated eigenvalue/eigenmode is determined. The sensitivity analysis to a general punctual external force allows establishing the regions where control devices must be placed in order to stabilize the global modes. Different types of steady control devices, passive and active, are used in the regions predicted by the sensitivity analysis to check the vortex shedding suppression, i.e., that the primary instability bifurcation is delayed. The new eigenvalue, modified by the action of the device, is also calculated. Finally, the spectral finite element method is employed to determine flow characteristics before and after the bifurcation in order to cross-check the results.

  8. Classical mechanics approach applied to analysis of genetic oscillators.

    PubMed

    Vasylchenkova, Anastasiia; Mraz, Miha; Zimic, Nikolaj; Moskon, Miha

    2016-04-05

    Biological oscillators present a fundamental part of several regulatory mechanisms that control the response of various biological systems. Several analytical approaches for their analysis have been reported recently. They are, however, limited to specific oscillator topologies and/or give only qualitative answers, i.e., whether the dynamics of an oscillator is oscillatory or not for a given parameter space. Here we present a general analytical approach that can be applied to the analysis of biological oscillators. It relies on the projection of biological systems onto classical mechanics systems. The approach is able to provide relatively accurate results regarding the type of behaviour a system exhibits (i.e., oscillatory or not) and the periods of potential oscillations, without the need to conduct expensive numerical simulations. We demonstrate and verify the proposed approach on three different implementations of the amplified negative feedback oscillator.

  9. Compatibility of person-centered planning and applied behavior analysis

    PubMed Central

    Holburn, Steve

    2001-01-01

    In response to Osborne (1999), the aims and practices of person-centered planning (PCP) are compared to the basic principles of applied behavior analysis set forth by Baer, Wolf, and Risley (1968, 1987). The principal goal of PCP is social integration of people with disabilities; it qualifies as a socially important behavior, and its problems have been displayed sufficiently. However, social integration is a complex social problem whose solution requires access to system contingencies that influence lifestyles. Nearly all of the component goals of PCP proposed by O'Brien (1987b) have been reliably quantified, although concurrent measurement of outcomes such as friendship, autonomy, and respect presents a formidable challenge. Behavioral principles such as contingency and contextual control are operative within PCP, but problems in achieving reliable implementation appear to impede an experimental analysis. PMID:22478371

  10. Shape analysis applied in heavy ion reactions near Fermi energy

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Huang, M.; Wada, R.; Liu, X.; Lin, W.; Wang, J.

    2017-03-01

    A new method is proposed to perform shape analyses and to evaluate their validity in heavy ion collisions near the Fermi energy. In order to avoid erroneous values of shape parameters in the calculation, a test particle method is utilized in which each nucleon is represented by n test particles, similar to that used in the Boltzmann–Uehling–Uhlenbeck (BUU) calculations. The method is applied to the events simulated by an antisymmetrized molecular dynamics model. The geometrical shape of fragments is reasonably extracted when n = 100 is used. A significant deformation is observed for all fragments created in the multifragmentation process. The method is also applied to the shape of the momentum distribution for event classification. In the momentum case, the errors in the eigenvalue calculation become much smaller than those of the geometrical shape analysis and the results become similar between those with and without the test particle method, indicating that in intermediate heavy ion collisions the shape analysis of momentum distribution can be used for the event classification without the test particle method.
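
    The eigenvalue step of the shape analysis is compact: smear each nucleon into n test particles, build the 3x3 shape tensor from the resulting point cloud, and read deformation off the eigenvalue ratios. The fragment geometry and smearing width below are invented for illustration; only n = 100 is taken from the abstract.

      import numpy as np

      rng = np.random.default_rng(1)
      A, n = 12, 100                 # 12 nucleons (assumed), n = 100 test particles each
      width = 0.8                    # Gaussian smearing width (fm), assumed

      # A deliberately deformed fragment: axis scales 3:2:1.
      nucleons = rng.normal(scale=[3.0, 2.0, 1.0], size=(A, 3))
      test = (nucleons[:, None, :] +
              rng.normal(scale=width, size=(A, n, 3))).reshape(-1, 3)
      test -= test.mean(axis=0)

      T = test.T @ test / len(test)  # shape tensor T_ij = <x_i x_j>
      eig = np.sort(np.linalg.eigvalsh(T))
      print("axis ratios:", np.round(np.sqrt(eig / eig[-1]), 2))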

  11. Applying MORT maintenance safety analysis in Finnish industry

    NASA Astrophysics Data System (ADS)

    Ruuhilehto, Kaarin; Virolainen, Kimmo

    1992-02-01

    A safety analysis method based on the MORT (Management Oversight and Risk Tree) method, especially on the version developed for safety considerations in the evaluation of maintenance programs, is presented. The MORT maintenance safety analysis is intended especially for use in maintenance safety management. The analysis helps managers evaluate the goals of their safety work and the measures taken to reach them. The analysis is done by a team or teams. The team ought to have expert knowledge of the organization, both vertically and horizontally, in order to be able to identify factors that may contribute to accidents or other interruptions in the maintenance work. Identification is made by using the MORT maintenance key question set as a checklist. The questions check the way safety matters are connected with maintenance planning and management, as well as with the safety management itself. In the second stage, means to eliminate the factors causing problems are developed. New practices are established to improve the safety of maintenance planning and management in the enterprise.

  12. Carbon Fiber Risk Analysis. [conference

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The scope and status of the effort to assess the risks associated with the accidental release of carbon/graphite fibers from civil aircraft is presented. Vulnerability of electrical and electronic equipment to carbon fibers, dispersal of carbon fibers, effectiveness of filtering systems, impact of fiber induced failures, and risk methodology are among the topics covered.

  13. Advances in Risk Analysis with Big Data.

    PubMed

    Choi, Tsan-Ming; Lambert, James H

    2017-08-01

    With cloud computing, Internet-of-things, wireless sensors, social media, fast storage and retrieval, etc., organizations and enterprises have access to unprecedented amounts and varieties of data. Current risk analysis methodology and applications are experiencing related advances and breakthroughs. For example, highway operations data are readily available, and making use of them reduces risks of traffic crashes and travel delays. Massive data of financial and enterprise systems support decision making under risk by individuals, industries, regulators, etc. In this introductory article, we first discuss the meaning of big data for risk analysis. We then examine recent advances in risk analysis with big data in several topic areas. For each area, we identify and introduce the relevant articles that are featured in the special issue. We conclude with a discussion on future research opportunities. © 2017 Society for Risk Analysis.

  14. Automated SEM Modal Analysis Applied to the Diogenites

    NASA Technical Reports Server (NTRS)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  15. Thermographic techniques applied to solar collector systems analysis

    SciTech Connect

    Eden, A.

    1980-02-01

    The use of thermography to analyze large solar collector array systems under dynamic operating conditions is discussed. The research at the Solar Energy Research Institute (SERI) in this area has focused on thermographic techniques and equipment to determine temperature distributions, flow patterns, and air blockages in solar collectors. The results of this extensive study, covering many sites and types of collectors, illustrate the capabilities of infrared (IR) analysis as a qualitative analysis tool and operation and maintenance procedure when applied to large arrays. Thermographic analysis of most collector systems qualitatively showed relative temperature distributions that indicated balanced flow patterns. In three significant cases, blocked or broken collector arrays, which previously had gone undetected, were discovered. Using this analysis, validation studies of large computer codes could examine collector arrays for flow patterns or blockages that could cause disagreement between actual and predicted performance. Initial operation and balancing of large systems could be accomplished without complicated sensor systems not needed for normal operations. Maintenance personnel could quickly check their systems without climbing onto the roof and without complicated sensor systems.

  16. Automated SEM Modal Analysis Applied to the Diogenites

    NASA Technical Reports Server (NTRS)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  17. Topographic Avalanche Risk: DEM Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Nazarkulova, Ainura; Strobl, Josef

    2015-04-01

    GIS-based models are frequently used to assess the risk and trigger probabilities of (snow) avalanche releases, based on parameters and geomorphometric derivatives like elevation, exposure, slope, proximity to ridges and local relief energy. Numerous models, and model-based specific applications and project results, have been published based on a variety of approaches and parametrizations as well as calibrations. Digital Elevation Models (DEM) come with many different resolution (scale) and quality (accuracy) properties, some of these resulting from sensor characteristics and DEM generation algorithms, others from different DEM processing workflows and analysis strategies. This paper explores the impact of using different types and characteristics of DEMs for avalanche risk modeling approaches, and aims at establishing a framework for assessing the uncertainty of results. The research question is derived from simply demonstrating the differences in release risk areas and intensities by applying identical models to DEMs with different properties, and then extending this into a broader sensitivity analysis. For the quantification and calibration of uncertainty parameters different metrics are established, based on simple value ranges, probabilities, as well as fuzzy expressions and fractal metrics. As a specific approach the work on DEM resolution-dependent 'slope spectra' is being considered and linked with the specific application of geomorphometry-based risk assessment. For the purpose of this study focusing on DEM characteristics, factors like land cover, meteorological recordings and snowpack structure and transformation are kept constant, i.e. not considered explicitly. Key aims of the research presented here are the development of a multi-resolution and multi-scale framework supporting the consistent combination of large area basic risk assessment with local mitigation-oriented studies, and the transferability of the latter into areas without availability of
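
    The sensitivity question can be made concrete by deriving slope, a core release parameter, from the same terrain sampled at two grid spacings and comparing the release-area masks. The synthetic hill and the 30-60 degree release band below are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(3)

      def synthetic_dem(n):
          x = np.linspace(0, 1000, n)
          X, Y = np.meshgrid(x, x)
          hill = 800 * np.exp(-((X - 500) ** 2 + (Y - 500) ** 2) / 2e5)
          return hill + rng.normal(0, 2, (n, n))   # small roughness term

      def slope_deg(dem, cell):
          gy, gx = np.gradient(dem, cell)
          return np.degrees(np.arctan(np.hypot(gx, gy)))

      for n in (101, 21):                          # ~10 m vs ~50 m spacing
          cell = 1000 / (n - 1)
          s = slope_deg(synthetic_dem(n), cell)
          release = (s >= 30) & (s <= 60)
          print(f"cell={cell:5.1f} m  mean slope={s.mean():5.1f} deg  "
                f"release fraction={release.mean():.2%}")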

  18. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  19. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Chueng, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. We demonstrate a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  20. Alcohol Consumption and Gastric Cancer Risk: A Meta-Analysis

    PubMed Central

    Ma, Ke; Baloch, Zulqarnain; He, Ting-Ting; Xia, Xueshan

    2017-01-01

    Background We sought to determine by meta-analysis the relationship between drinking alcohol and the risk of gastric cancer. Material/Methods A systematic Medline search was performed to identify all published reports of drinking alcohol and the associated risk of gastric cancer. Initially we retrieved 2,494 studies, but after applying inclusion and exclusion criteria, only ten studies were found to be eligible for our meta-analysis. Results Our meta-analysis showed that alcohol consumption elevated the risk of gastric cancer with an odds ratio (OR) of 1.39 (95% CI 1.20–1.61). Additionally, subgroup analysis showed that only a nested case-control report from Sweden did not support this observation. Subgroup analysis of moderate drinking and heavy drinking also confirmed that drinking alcohol increased the risk of gastric cancer. Publication bias analysis (Begg’s and Egger’s tests) showed p values were more than 0.05, suggesting that the 10 articles included in our analysis did not have a publication bias. Conclusions The results from this meta-analysis support the hypothesis that alcohol consumption can increase the risk of gastric cancer; suggesting that effective moderation of alcohol drinking may reduce the risk of gastric cancer. PMID:28087989
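
    The pooling step behind a summary OR is inverse-variance weighting on the log scale. The three studies below are invented, not the ten from this meta-analysis; each study's standard error is backed out of its reported 95% CI.

      import math

      studies = [(1.52, 1.10, 2.10),   # (OR, CI low, CI high), invented
                 (1.25, 0.95, 1.64),
                 (1.40, 1.05, 1.87)]

      num = den = 0.0
      for or_, lo, hi in studies:
          se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
          w = 1 / se ** 2                # inverse-variance weight
          num += w * math.log(or_)
          den += w

      pooled, se_pooled = num / den, math.sqrt(1 / den)
      ci = (math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))
      print(f"pooled OR = {math.exp(pooled):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")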

  1. Resource allocation using risk analysis

    SciTech Connect

    Bott, T. F.; Eisenhawer, S. W.

    2003-01-01

    Allocating limited resources among competing priorities is an important problem in management. In this paper we describe an approach to resource allocation using risk as a metric. We call this approach the Logic-Evolved Decision (LED) approach because we use logic models to generate an exhaustive set of competing options and to describe the often highly complex model used for evaluating the risk reduction achieved by different resource allocations among these options. The risk evaluation then proceeds using probabilistic or linguistic input data.
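
    A greedy stand-in conveys the flavor of risk-metric allocation, though the LED approach itself builds the option set and the risk model from logic trees. The options, costs, and risk reductions below are invented.

      options = {                    # name: (cost, expected risk reduction)
          "upgrade containment": (40, 12.0),
          "add inspections":     (15, 6.0),
          "operator training":   (10, 3.5),
          "sensor network":      (25, 7.0),
      }

      budget, spent, reduced, chosen = 60, 0, 0.0, []
      # Greedy by risk reduction per unit cost.
      for name, (cost, dr) in sorted(options.items(),
                                     key=lambda kv: -kv[1][1] / kv[1][0]):
          if spent + cost <= budget:
              chosen.append(name)
              spent += cost
              reduced += dr
      print(f"chosen={chosen} spent={spent} risk reduction={reduced}")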

  2. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection

    PubMed Central

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658
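
    The dose-response step of such a QMRA is short: an inhaled dose from an air concentration, a breathing rate, and an exposure time, converted to an infection probability with an exponential model P = 1 - exp(-k * dose). The k value, breathing rate, concentrations, and durations below are all illustrative assumptions, chosen only so that the ordering matches the abstract's finding (toilets highest, then wastewater plants, then landfills).

      import math

      k = 0.414                   # exponential dose-response parameter, assumed
      breathing_rate = 1.5        # m3/h, light activity, assumed

      scenarios = {               # (infectious units per m3, hours), invented
          "toilet":               (50.0, 0.05),
          "wastewater plant":     (0.1, 8.0),
          "solid waste landfill": (0.02, 8.0),
      }

      for name, (conc, hours) in scenarios.items():
          dose = conc * breathing_rate * hours
          p_inf = 1 - math.exp(-k * dose)
          print(f"{name:22s} dose={dose:9.2f}  P(infection)={p_inf:.3f}")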

  3. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection.

    PubMed

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-07-20

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations.

  4. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
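
    The two data structures named here, identification trees and mitigation trees, can be sketched minimally. The node contents below are invented examples in the STRIDE style, not AutSEC's actual trees.

      from dataclasses import dataclass, field

      @dataclass
      class Node:
          label: str
          children: list = field(default_factory=list)

      # Identification tree: a design element fans out to candidate threats.
      ident = Node("data flow crosses trust boundary", [
          Node("tampering in transit"),
          Node("information disclosure"),
      ])

      # Mitigation tree, flattened here to a map: threat -> mitigations
      # ordered by assumed implementation cost.
      mitig = {
          "tampering in transit":   ["TLS with mutual auth", "message signing"],
          "information disclosure": ["encrypt payload", "minimize fields sent"],
      }

      for threat in ident.children:
          print(f"threat: {threat.label}")
          for m in mitig.get(threat.label, []):
              print(f"  candidate mitigation: {m}")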

  5. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
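
    The record names two data structures, identification trees and mitigation trees, used to flag threats in a design and advise countermeasures. The fragment below is a minimal sketch of that idea, not AutSEC itself; the element fields, threat names, and mitigations are all hypothetical.

      from dataclasses import dataclass, field
      from typing import Callable, List, Optional

      @dataclass
      class Node:
          test: Callable[[dict], bool]          # predicate over a design element
          threat: Optional[str] = None          # threat reported when the path matches
          mitigations: List[str] = field(default_factory=list)
          children: List["Node"] = field(default_factory=list)

      def identify(node, element, found):
          # Walk an identification tree, collecting threats whose conditions hold.
          if not node.test(element):
              return
          if node.threat is not None:
              found.append((node.threat, node.mitigations))
          for child in node.children:
              identify(child, element, found)

      # Hypothetical tree fragment: a data flow crossing a trust boundary is a
      # tampering candidate; if it is also unencrypted, disclosure is flagged too.
      tree = Node(
          test=lambda e: e["kind"] == "data_flow" and e["crosses_boundary"],
          threat="tampering", mitigations=["add integrity check"],
          children=[Node(test=lambda e: not e["encrypted"],
                         threat="information disclosure",
                         mitigations=["encrypt the channel"])],
      )

      found = []
      identify(tree, {"kind": "data_flow", "crosses_boundary": True, "encrypted": False}, found)
      print(found)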

  6. Pathogen risk assessment of land applied wastewater and biosolids: A fuzzy set approach

    SciTech Connect

    Dahab, M.F.; Fuerhacker, M.; Zibuschka, F.

    1998-07-01

    There are major concerns associated with land application of wastewater and biosolids, including the potential risk to public health from water-borne pollutants that may enter the food chain and from pathogens that may be present in the wastewater. These risks are of particular concern when wastewater is applied to land where crops are grown as part of the human food chain or when direct human contact with the wastewater may occur. In many communities, toxic chemicals may not be present in the biosolids, or their concentrations may be reduced through source control measures. However, pathogens that enter wastewater from infected individuals cannot be controlled at the source and are often found in wastewater or biosolids applied to land. Public health officials have emphasized that microbial pathogens (or pathogen indicators) should not occur in areas where exposure to humans is likely. Under this criterion, the concept of risk assessment, which requires the characterization of the occurrence of pathogens, almost seems contradictory to basic public health goals. As the understanding of pathogen and pathogen indicator occurrence becomes better refined, the arguments for finding practical application of risk assessment for pathogenic organisms become more compelling.

  7. Carbon Fiber Risk Analysis: Conclusions

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    It was concluded that preliminary estimates indicate that the public risk due to accidental release of carbon fiber from air transport aircraft is small. It was also concluded that further work is required to increase confidence in these estimates.

  8. Supply-Chain Risk Analysis

    DTIC Science & Technology

    2016-06-07

    Slide fragments (full abstract not recoverable): software supply-chain complexity, with a composite inheriting risk from any point in the chain; exploitable design or coding errors measured against the CWE/SANS Top-25; very little data for software supply chains; supply-chain risk categories spanning operational capabilities, knowledge of supplier capabilities, and knowledge of product attributes.

  9. Probabilistic risk assessment of veterinary medicines applied to four major aquaculture species produced in Asia.

    PubMed

    Rico, Andreu; Van den Brink, Paul J

    2014-01-15

    Aquaculture production constitutes one of the main sources of pollution with veterinary medicines into the environment. About 90% of the global aquaculture production is produced in Asia and the potential environmental risks associated with the use of veterinary medicines in Asian aquaculture have not yet been properly evaluated. In this study we performed a probabilistic risk assessment for eight different aquaculture production scenarios in Asia by combining up-to-date information on the use of veterinary medicines and aquaculture production characteristics. The ERA-AQUA model was used to perform mass balances of veterinary medicinal treatments applied to aquaculture ponds and to characterize risks for primary producers, invertebrates, and fish potentially exposed to chemical residues through aquaculture effluents. The mass balance calculations showed that, on average, about 25% of the applied drug mass to aquaculture ponds is released into the environment, although this percentage varies with the chemical's properties, the mode of application, the cultured species density, and the water exchange rates in the aquaculture pond scenario. In general, the highest potential environmental risks were calculated for parasitic treatments, followed by disinfection and antibiotic treatments. Pangasius catfish production in Vietnam, followed by shrimp production in China, constitute possible hot-spots for environmental pollution due to the intensity of the aquaculture production and considerable discharge of toxic chemical residues into surrounding aquatic ecosystems. A risk-based ranking of compounds is provided for each of the evaluated scenarios, which offers crucial information for conducting further chemical and biological field and laboratory monitoring research. In addition, we discuss general knowledge gaps and research priorities for performing refined risk assessments of aquaculture medicines in the near future.
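
    A minimal sketch of the two computations the record describes, a pond mass balance yielding the fraction of applied drug released via effluent and a risk quotient for exposed taxa; the functional form and every number below are illustrative assumptions, not the ERA-AQUA model.

      def released_fraction(applied_mg, uptake_frac, degraded_frac):
          # Drug mass neither taken up by the stock nor degraded in the pond
          # is assumed to leave with the effluent (a gross simplification).
          return applied_mg * (1.0 - uptake_frac - degraded_frac)

      def risk_quotient(pec, pnec):
          # RQ = predicted environmental concentration / predicted no-effect
          # concentration; RQ > 1 flags potential risk for that taxon.
          return pec / pnec

      released = released_fraction(applied_mg=1.0e6, uptake_frac=0.45, degraded_frac=0.30)
      print(released / 1.0e6)                  # 0.25, cf. the ~25% average above
      print(risk_quotient(pec=3.2, pnec=1.0))  # illustrative concentrations, ug/L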

  10. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  11. Drinking trajectories of at-risk groups: Does the theory of the collectivity of drinking apply?

    PubMed

    Norström, Thor; Raninen, Jonas

    2017-08-02

    Alcohol consumption among Swedish adolescents has halved during the last decade. We aim to: (i) investigate whether the overall decrease in drinking may conceal an underlying heterogeneity in drinking trajectories across at-risk groups that differ with respect to risk for drinking; and (ii) assess to what degree alcohol-related harm has responded to this decrease. Data were obtained from the nationally representative annual school survey of alcohol and drug habits among Swedish ninth-grade students covering the period 2000-2012 (n ≈ 5000/year). Respondents were divided into five at-risk groups ranging from low to high based on their relative ranking on a risk scale for drinking. Alcohol consumption was measured by beverage-specific quantity and frequency items summarised into a measure of overall drinking in litres of 100% alcohol per year. Alcohol-related harm was measured by eight items asking whether the respondent had experienced various alcohol-related negative consequences. Drinking and alcohol-related harm decreased in all five at-risk groups. There was a marked relation between the overall consumption and the mean consumption in each of the five at-risk groups. Self-reported alcohol-related harm decreased during the study period to an extent that was expected from the decrease in alcohol consumption. Alcohol consumption among Swedish youth has declined in five groups that were delineated based on their relative ranking on a risk factor index. The findings are consistent with Skog's theory of the collectivity of drinking behaviour. [Norström T, Raninen J. Drinking trajectories of at-risk groups: Does the theory of the collectivity of drinking apply?. Drug Alcohol Rev 2017;00:000-000]. © 2017 Australasian Professional Society on Alcohol and other Drugs.

  12. Receiver function analysis applied to refraction survey data

    NASA Astrophysics Data System (ADS)

    Subaru, T.; Kyosuke, O.; Hitoshi, M.

    2008-12-01

    For the estimation of the thickness of oceanic crust or for petrophysical investigation of subsurface material, refraction or reflection seismic exploration is one of the methods frequently practiced. These explorations use four-component (x, y, z acceleration and pressure) seismometers, but only the compressional wave or the vertical component of the seismometers tends to be used in the analyses. Hence, shear waves or the lateral components of the seismograms are needed for a more precise estimation of the thickness of oceanic crust. A receiver function is a site-specific function that can be used to estimate the depth of velocity interfaces from received teleseismic signals, including shear waves. Receiver function analysis uses both the vertical and horizontal components of seismograms and deconvolves the horizontal with the vertical to estimate the spectral difference of P-S converted waves arriving after the direct P wave. Once the phase information of the receiver function is obtained, one can estimate the depth of the velocity interface. This analysis has the advantage of estimating the depth of velocity interfaces, including the Mohorovicic discontinuity, using two components of seismograms when P-to-S converted waves are generated at the interface. We present results of a preliminary study using synthetic seismograms. First, we use three types of geological models, composed of a single sediment layer, a crust layer, and a sloped Moho, respectively, for underground sources. The receiver function can estimate the depth and shape of the Moho interface precisely for the three models. Second, we applied this method to synthetic refraction survey data generated not by earthquakes but by artificial sources on the ground or sea surface. Compressional seismic waves propagate under the velocity interface and radiate converted shear waves, as well as at the other deep underground layer interfaces. However, the receiver function analysis applied to the...
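
    A common receiver-function recipe consistent with the description above is frequency-domain (water-level) deconvolution of the horizontal component by the vertical; the sketch below applies it to a synthetic trace with one P-to-S conversion. It illustrates the general technique, not necessarily the authors' exact processing.

      import numpy as np

      def receiver_function(radial, vertical, water_level=1e-2):
          # Water-level spectral division R(f)/Z(f); returns the time-domain RF.
          n = len(radial)
          R = np.fft.rfft(radial, n)
          Z = np.fft.rfft(vertical, n)
          denom = (Z * np.conj(Z)).real
          denom = np.maximum(denom, water_level * denom.max())
          return np.fft.irfft(R * np.conj(Z) / denom, n)

      # Synthetic demo: vertical = direct P; radial adds a delayed P-to-S conversion.
      dt, n = 0.01, 1024
      t = np.arange(n) * dt
      pulse = np.exp(-((t - 1.0) / 0.05) ** 2)
      vertical = pulse
      radial = 0.2 * np.roll(pulse, int(1.5 / dt))   # Ps arriving 1.5 s after P
      rf = receiver_function(radial, vertical)
      print(t[np.argmax(rf)])   # ~1.5 s delay, which maps to interface depth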

  13. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
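
    KBANN's core move is mapping a symbolic rule into network weights that are then refined by training. The sketch below shows the standard mapping of one conjunctive rule onto a sigmoid unit (weights of ±ω, with a bias set so the unit fires only when all antecedents hold); the motif rule and indices are invented for illustration.

      import numpy as np

      def rule_to_neuron(antecedents, n_inputs, omega=4.0):
          # Map a conjunctive rule (AND of literals) onto one sigmoid unit.
          # antecedents: {input_index: True for a positive literal, False for negated}.
          w = np.zeros(n_inputs)
          for i, positive in antecedents.items():
              w[i] = omega if positive else -omega
          n_pos = sum(antecedents.values())
          bias = -omega * (n_pos - 0.5)   # KBANN-style threshold
          return w, bias

      def fire(x, w, bias):
          return 1.0 / (1.0 + np.exp(-(x @ w + bias)))

      # Hypothetical motif rule: promoter := base3 AND base7 AND NOT base9
      w, b = rule_to_neuron({3: True, 7: True, 9: False}, n_inputs=12)
      x = np.zeros(12); x[3] = x[7] = 1.0
      print(fire(x, w, b))   # near 1; turning base9 on drives it near 0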

  14. Applied Behavior Analysis is a Science and, Therefore, Progressive.

    PubMed

    Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane

    2016-02-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD.

  15. Evaluating the effectiveness of teacher training in Applied Behaviour Analysis.

    PubMed

    Grey, Ian M; Honan, Rita; McClean, Brian; Daly, Michael

    2005-09-01

    Interventions for children with autism based upon Applied Behaviour Analysis (ABA) have been repeatedly shown to be related both to educational gains and to reductions in challenging behaviours. However, to date, comprehensive training in ABA for teachers and others has been limited. Over 7 months, 11 teachers undertook 90 hours of classroom instruction and supervision in ABA. Each teacher conducted a comprehensive functional assessment and designed a behaviour support plan targeting one behaviour for one child with an autistic disorder. Target behaviours included aggression, non-compliance and specific educational skills. Teachers recorded observational data on the target behaviour for both baseline and intervention sessions. Support plans produced an average 80 percent change in the frequency of occurrence of target behaviours. Questionnaires completed by parents and teachers at the end of the course indicated a beneficial effect for the children and the educational environment. The potential benefits of teacher-implemented behavioural intervention are discussed.

  16. Finite element analysis applied to dentoalveolar trauma: methodology description.

    PubMed

    da Silva, B R; Moreira Neto, J J S; da Silva, F I; de Aguiar, A S W

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated.

  17. Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description

    PubMed Central

    da Silva, B. R.; Moreira Neto, J. J. S.; da Silva, F. I.; de Aguiar, A. S. W.

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated. PMID:21991463

  18. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W. (Dept. of Computer Sciences); Noordewier, M.O. (Dept. of Computer Science)

    1992-01-01

    We are primarily developing a machine learning (ML) system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information, our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, our KBANN algorithm maps inference rules about a given recognition task into a neural network. Neural network training techniques then use the training examples to refine these inference rules. We call these rules a domain theory, following the convention in the machine learning community. We have been applying this approach to several problems in DNA sequence analysis. In addition, we have been extending the capabilities of our learning system along several dimensions. We have also been investigating parallel algorithms that perform sequence alignments in the presence of frameshift errors.

  19. Statistical model applied to motor evoked potentials analysis.

    PubMed

    Ma, Ying; Thakor, Nitish V; Jia, Xiaofeng

    2011-01-01

    Motor evoked potentials (MEPs) convey information regarding the functional integrity of the descending motor pathways. Absence of the MEP has been used as a neurophysiological marker to suggest cortico-spinal abnormalities in the operating room. Due to their high variability and sensitivity, detailed quantitative studies of MEPs are lacking. This paper applies a statistical method to characterize MEPs by estimating the number of motor units and single motor unit potential amplitudes. A clearly increasing trend of single motor unit potential amplitudes in the MEPs after each pulse of the stimulation pulse train is revealed by this method. This statistical method eliminates the effects of anesthesia and provides an objective assessment of MEPs. Consequently, it has high potential to be useful in future quantitative MEP analysis.
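
    The record does not spell out the estimator, so the sketch below assumes a simple binomial-recruitment model (MEP amplitude = single-unit amplitude q times a Binomial(N, p) count) and recovers q and N from the sample mean and variance; this is an assumed stand-in for the paper's statistical method, not its published derivation.

      import numpy as np

      def quantal_estimates(amplitudes, p=0.5):
          # Under amplitude = q * Binomial(N, p):
          #   mean = N*p*q and var = N*p*(1-p)*q**2, so
          #   q = var / (mean * (1 - p)) and N = mean / (p * q).
          m, v = np.mean(amplitudes), np.var(amplitudes)
          q = v / (m * (1.0 - p))
          N = m / (p * q)
          return q, N

      rng = np.random.default_rng(0)
      true_q, true_N = 40.0, 25   # uV per unit, number of motor units (invented)
      sim = true_q * rng.binomial(true_N, 0.5, size=500)
      print(quantal_estimates(sim))   # ~ (40.0, 25)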

  20. RiskSOAP: Introducing and applying a methodology of risk self-awareness in road tunnel safety.

    PubMed

    Chatzimichailidou, Maria Mikela; Dokas, Ioannis M

    2016-05-01

    Complex socio-technical systems, such as road tunnels, can be designed and developed with more or fewer elements that can either positively or negatively affect the capability of their agents to recognise imminent threats or vulnerabilities that possibly lead to accidents. This capability is called risk Situation Awareness (SA) provision. Motivated by the search for better tools for designing and developing systems that are self-aware of their vulnerabilities and react to prevent accidents and losses, this paper introduces the Risk Situation Awareness Provision (RiskSOAP) methodology to the field of road tunnel safety, as a means to measure this capability in such systems. The main objective is to test the soundness and the applicability of RiskSOAP to infrastructure which is advanced in terms of technology, human integration, and the minimum number of safety requirements imposed by international bodies. RiskSOAP is applied to a specific road tunnel in Greece and the accompanying indicator is calculated twice, once for the tunnel design as defined by updated European safety standards and once for the 'as-is' tunnel composition, which complies with the necessary safety requirements but calls for enhancing safety according to what the EU and PIARC further suggest. The derived values indicate the extent to which each tunnel version is capable of comprehending its threats and vulnerabilities based on its elements. The former tunnel version appears to be stronger both in terms of its risk awareness capability and in terms of safety. Another interesting finding is that despite the advanced tunnel safety specifications, there is still room for enriching the safe design and maintenance of the road tunnel.

  1. [Profitability analysis of clinical risk management].

    PubMed

    Banduhn, C; Schlüchtermann, J

    2013-05-01

    Medical treatment entails many risks. Increasingly, the negative impact of these risks on patients' health is being revealed, and corresponding cases are reported to hospital insurers. Systematic clinical risk management can reduce these risks. This analysis is designed to demonstrate the financial profitability of implementing clinical risk management. The decision analysis of clinical risk management includes information from published articles and studies, publicly available data from the Federal Statistical Office, and expert interviews, and was conducted for 2 scenarios. The 2 scenarios result from a maximum and a minimum value of preventable adverse events reported in Germany. The planning horizon was a 1-year period. The analysis was performed from a hospital's perspective. Subsequently, a threshold analysis of the reduction of preventable adverse events as an effect of clinical risk management was executed. Furthermore, a static capital budgeting over a 5-year period was added, complemented by a risk analysis. Under the given assumptions, the implementation of clinical risk management would save about 53 000 € or 175 000 €, respectively, for an average hospital within the first year. Only if the reduction of preventable adverse events is as low as 5.6 or 2.8%, respectively, will the implementation of clinical risk management produce losses. According to a comprehensive risk simulation this happens in less than one out of 1 million cases. The investment in clinical risk management, based on a 5-year period and an interest rate of 5%, yields an annual payoff of 81 000 € or 211 000 €, respectively. The implementation of clinical risk management in a hospital pays off within the first year. In the subsequent years the surplus is even higher due to the elimination of implementation costs. © Georg Thieme Verlag KG Stuttgart · New York.
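
    The capital-budgeting step can be reproduced with a plain net-present-value calculation at the stated 5% rate; the implementation cost below is a made-up figure, and only the 175 000 € savings magnitude comes from the record.

      def npv(rate, cashflows):
          # Net present value; cashflows[0] falls at time zero.
          return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

      # Hypothetical: 120k implementation outlay in year 0, then annual net
      # savings of the magnitude the record reports for the better scenario.
      flows = [-120_000] + [175_000] * 5
      print(round(npv(0.05, flows)))   # positive NPV: the investment pays off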

  2. Bridging the two cultures of risk analysis

    SciTech Connect

    Jasanoff, S.

    1993-04-01

    During the past 15 years, risk analysis has come of age as an interdisciplinary field of remarkable breadth, nurturing connections among fields as diverse as mathematics, biostatistics, toxicology, and engineering on one hand, and law, psychology, sociology, and economics on the other. In this editorial, the author addresses the question: what has the presence of social scientists in the network meant to the substantive development of the field of risk analysis? The answers offered here discuss the substantial progress in bridging the two cultures of risk analysis. Emphasis is placed on the continual need for monitoring risk analysis. Topics include: the micro-worlds of risk assessment; constraining assumptions; and exchange programs. 14 refs.

  3. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from phenotypic measurement. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performances comparable to those of methods used in single-cell growth studies. Compared with qualitative observations, this method of quantitative epistasis enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
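
    Quantitative epistasis is commonly scored as the deviation of the double-perturbation phenotype from the multiplicative expectation of the two singles; the sketch below assumes that definition (the record does not state the exact estimator) and uses invented body-length values normalized to wild type.

      import numpy as np

      def epistasis_score(w_a, w_b, w_ab):
          # eps = W_ab - W_a * W_b for phenotypes normalized to wild type = 1.
          return w_ab - w_a * w_b

      # Hypothetical body-length measurements (fraction of wild-type length)
      w_a  = np.mean([0.92, 0.95, 0.90])   # gene A inactivated
      w_b  = np.mean([0.88, 0.85, 0.86])   # gene B inactivated
      w_ab = np.mean([0.60, 0.63, 0.58])   # double perturbation
      print(epistasis_score(w_a, w_b, w_ab))   # negative: aggravating interaction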

  4. Intelligent Adversary Risk Analysis: A Bioterrorism Risk Management Model (PREPRINT)

    DTIC Science & Technology

    2009-02-20

    Report form and slide fragments (full abstract not recoverable): Society for Risk Analysis, February 20, 2009. Intelligent adversary risk analysis is different than hazard risk analysis.

  5. Applying Atherosclerotic Risk Prevention Guidelines to Elderly Patients: A Bridge Too Far?

    PubMed

    Feldman, Ross D; Harris, Stewart B; Hegele, Robert A; Pickering, J Geoffrey; Rockwood, Kenneth

    2016-05-01

    The primary prevention of atherosclerotic disease is based on optimal management of the major risk factors. For the major risk factors of diabetes, hypertension, and dyslipidemia, management for most patients is guided by well-developed and extensive evidence-based diagnostic and therapeutic guidelines. However, for a growing segment of the population who are at the highest risk for atherosclerotic disease (ie, older adults), the application of these guidelines is problematic. First, few studies that form the evidence base for these primary prevention guidelines actually include substantial numbers of elderly subjects. Second, elderly patients represent a special population from multiple perspectives related to their accumulation of health deficits and their development of frailty. These patients with frailty and multiple comorbidities have been mostly excluded from the primary prevention studies upon which the guidelines are based, yet they comprise a very significant proportion of the very elderly population. Third, elderly people are at greatest risk of adverse drug reactions because of the increasing number of medications prescribed in this patient population. When applying the existing guidelines to elderly people, the limitations of our knowledge must be recognized regarding how best to mitigate the high risk of heart disease in our aging population and how to generalize these recommendations to the management of the largest subgroup of elderly patients (ie, those with multiple comorbidities and frail older adults). Copyright © 2016 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.

  6. A risk assessment tool applied to the study of shale gas resources.

    PubMed

    Veiguela, Miguel; Hurtado, Antonio; Eguilior, Sonsoles; Recreo, Fernando; Roqueñi, Nieves; Loredo, Jorge

    2016-11-15

    The implementation of a risk assessment tool with the capacity to evaluate the risks to health, safety and the environment (HSE) from the extraction of non-conventional fossil fuel resources by the hydraulic fracturing (fracking) technique can be useful for boosting the development and progress of the technology and for winning public trust and acceptance of it. At the early project stages, the lack of data related to the selection of non-conventional gas deposits makes it difficult to use existing approaches to the risk assessment of fluids injected into geologic formations. The qualitative risk assessment tool developed in this work is based on the premise that shale gas exploitation risk depends on both the geologic site and the technological aspects. It follows Oldenburg's 'Screening and Ranking Framework (SRF)', developed to evaluate potential geologic carbon dioxide (CO2) storage sites. Two global characteristics, (1) characteristics centered on the natural aspects of the site and (2) characteristics centered on the technological aspects of the project, are evaluated through user input of Property values, which define Attributes, which in turn define the Characteristics. In order to carry out an individual evaluation of each of the characteristics and the elements of the model, the tool has been implemented in a spreadsheet. The proposed model has been applied to a site with potential for the exploitation of shale gas in Asturias (northwestern Spain) with three different technological options to test the approach.
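
    The Property-to-Attribute-to-Characteristic aggregation described above is, in spreadsheet terms, a nested weighted average. The sketch below shows that roll-up pattern with invented weights and scores; it is not the actual SRF weighting scheme.

      def roll_up(weights_and_scores):
          # Weighted average of (weight, score) pairs, scores on a 0-1 risk scale.
          total_w = sum(w for w, _ in weights_and_scores)
          return sum(w * s for w, s in weights_and_scores) / total_w

      # Hypothetical: properties -> attributes, attributes -> a site characteristic
      fault_density    = roll_up([(1, 0.7), (2, 0.5)])
      depth_to_aquifer = roll_up([(1, 0.3)])
      site_characteristic = roll_up([(3, fault_density), (2, depth_to_aquifer)])
      print(round(site_characteristic, 2))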

  7. Extended Kramers-Moyal analysis applied to optical trapping.

    PubMed

    Honisch, Christoph; Friedrich, Rudolf; Hörner, Florian; Denz, Cornelia

    2012-08-01

    The Kramers-Moyal analysis is a well-established approach to analyze stochastic time series from complex systems. If the sampling frequency of a measured time series is too low, systematic errors occur in the analysis results. These errors are labeled as finite time effects in the literature. In the present article, we present some new insights about these effects and discuss the limitations of a previously published method to estimate Kramers-Moyal coefficients in the presence of finite time effects. To increase the reliability of this method and to avoid misinterpretations, we extend it by the computation of error estimates for estimated parameters using a Monte Carlo error propagation technique. Finally, the extended method is applied to a data set of an optical trapping experiment, yielding estimates of the forces acting on a Brownian particle trapped by optical tweezers. We find an increased Markov-Einstein time scale of the order of the relaxation time of the process, which can be traced back to memory effects caused by the interaction of the particle and the fluid. Above the Markov-Einstein time scale, the process can be very well described by the classical overdamped Markov model for Brownian motion.
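
    The starting point of any Kramers-Moyal analysis is estimating drift and diffusion as conditional moments of the increments. The sketch below implements the naive binned estimators on a simulated Ornstein-Uhlenbeck process; it deliberately omits the finite-time corrections that are the subject of the paper.

      import numpy as np

      def km_coefficients(x, dt, bins=50):
          # Naive estimators D1(x) = <dx | x>/dt and D2(x) = <dx^2 | x>/(2 dt).
          dx = np.diff(x)
          edges = np.histogram_bin_edges(x[:-1], bins)
          centers = 0.5 * (edges[:-1] + edges[1:])
          idx = np.digitize(x[:-1], edges) - 1
          D1 = np.full(bins, np.nan)
          D2 = np.full(bins, np.nan)
          for i in range(bins):
              sel = dx[idx == i]
              if sel.size:
                  D1[i] = sel.mean() / dt
                  D2[i] = (sel ** 2).mean() / (2.0 * dt)
          return centers, D1, D2

      # Ornstein-Uhlenbeck test data: dx = -gamma*x*dt + sqrt(2*D*dt)*xi
      rng = np.random.default_rng(1)
      gamma, D, dt, n = 1.0, 0.5, 1e-3, 100_000
      x = np.zeros(n)
      for i in range(n - 1):
          x[i + 1] = x[i] - gamma * x[i] * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal()
      centers, D1, D2 = km_coefficients(x, dt)
      # Away from finite-time artefacts, D1 ~ -gamma*x and D2 ~ D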

  8. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    NASA Astrophysics Data System (ADS)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.
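
    At its core, RBI schedules the next inspection at the time the computed risk (probability of failure times consequence of failure) crosses a target. The sketch below shows that logic with a toy corrosion-driven PoF model; the degradation model and every parameter are invented, not the DNV/API models the record refers to.

      def prob_of_failure(years, corrosion_rate_mm_y, margin_mm, beta=0.5):
          # Toy PoF that grows as the corrosion allowance is consumed.
          consumed = min(corrosion_rate_mm_y * years / margin_mm, 1.0)
          return beta * consumed ** 2

      def next_inspection(corrosion_rate, margin, cof_usd, risk_target_usd):
          # First year in which risk (PoF x CoF) exceeds the target.
          year = 0
          while prob_of_failure(year, corrosion_rate, margin) * cof_usd < risk_target_usd:
              year += 1
          return year

      print(next_inspection(corrosion_rate=0.2, margin=6.0,
                            cof_usd=5e6, risk_target_usd=1e5))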

  9. Initial Risk Analysis and Decision Making Framework

    SciTech Connect

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
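
    The risk measure described, variability of expected net returns under uncertain technical and financial factors, is naturally computed by Monte Carlo. The distributions and plant figures below are invented placeholders, not the prototype's actual inputs.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      # Hypothetical uncertain factors for a capture retrofit (illustrative only)
      capture_eff  = rng.triangular(0.80, 0.90, 0.95, n)    # fraction CO2 captured
      capex_musd   = rng.normal(450, 60, n)                 # capital cost, M$
      carbon_price = rng.lognormal(np.log(40), 0.3, n)      # $/t CO2
      tonnes_per_year = 3.5e6 * capture_eff

      annual_net = tonnes_per_year * carbon_price / 1e6 - 0.08 * capex_musd  # M$/yr
      print(annual_net.mean(), annual_net.std())   # expected return and its spread
      print((annual_net < 0).mean())               # probability of a loss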

  10. Dealing with Uncertainty in Chemical Risk Analysis

    DTIC Science & Technology

    1988-12-01

    Thesis fragments (full abstract not recoverable): Dealing with Uncertainty in Chemical Risk Analysis, thesis by David S. Clement, Captain, USAF, AFIT/GOR/MA/88D-2. Approved for public release; distribution unlimited.

  11. Quantitative Microbial Risk Assessment Tutorial: Land-applied Microbial Loadings within a 12-Digit HUC

    EPA Science Inventory

    This tutorial reviews screens, icons, and basic functions of the SDMProjectBuilder (SDMPB). It demonstrates how one chooses a 12-digit HUC for analysis, performs an assessment of land-applied microbes by simulating microbial fate and transport using HSPF, and analyzes and visuali...

  12. Quantitative Microbial Risk Assessment Tutorial: Land-applied Microbial Loadings within a 12-Digit HUC

    EPA Science Inventory

    This tutorial reviews screens, icons, and basic functions of the SDMProjectBuilder (SDMPB). It demonstrates how one chooses a 12-digit HUC for analysis, performs an assessment of land-applied microbes by simulating microbial fate and transport using HSPF, and analyzes and visuali...

  13. Applying risk and resilience models to predicting the effects of media violence on development.

    PubMed

    Prot, Sara; Gentile, Douglas A

    2014-01-01

    Although the effects of media violence on children and adolescents have been studied for over 50 years, they remain controversial. Much of this controversy is driven by a misunderstanding of causality that seeks the cause of atrocities such as school shootings. Luckily, several recent developments in risk and resilience theories offer a way out of this controversy. Four risk and resilience models are described, including the cascade model, dose-response gradients, pathway models, and turning-point models. Each is described and applied to the existing media effects literature. Recommendations for future research are discussed with regard to each model. In addition, we examine current developments in theorizing that stressors have sensitizing versus steeling effects and recent interest in biological and gene by environment interactions. We also discuss several of the cultural aspects that have supported the polarization and misunderstanding of the literature, and argue that applying risk and resilience models to the theories and data offers a more balanced way to understand the subtle effects of media violence on aggression within a multicausal perspective.

  14. First Attempt of Applying Factor Analysis in Moving Base Gravimetry

    NASA Astrophysics Data System (ADS)

    Li, X.; Roman, D. R.

    2014-12-01

    For gravimetric observation systems on mobile platforms (land/sea/airborne), the low Signal-to-Noise Ratio (SNR) is the main barrier to achieving an accurate, high-resolution gravity signal. Normally, low-pass filters (Childers et al. 1999, Forsberg et al. 2000, Kwon and Jekeli 2000, Hwang et al. 2006) are applied to smooth or remove the high-frequency "noise", even though some of the high-frequency component is not necessarily noise. This is especially true for aerogravity surveys such as those from the Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project. These gravity survey flights have a spatial resolution of 10 km between tracks but higher resolution along track. The along-track resolution is improved due to the lower flight height (6.1 km), equipment sensitivity, and improved modeling of potential errors. Additionally, these surveys suffer from a loss of signal power due to the increased flight elevation. Hence, application of a low-pass filter removes possible signal sensed in the along-track direction that might otherwise prove useful for various geophysical and geodetic applications. Some cutting-edge developments in wavelets and artificial neural networks have been successfully applied to obtain improved results (Li 2008 and 2011, Liang and Liu 2013). However, a clearer and more fundamental understanding of the error characteristics will further improve the quality of the gravity estimates from these gravimetric systems. Here, instead of using any predefined basis function or any a priori model, the idea of factor analysis is employed for the first time to try to extract the underlying factors of the noise in these systems. Real data sets collected by both land vehicle and aircraft will be processed as examples.
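
    A small sketch of the idea, assuming scikit-learn's FactorAnalysis: several noisy channels share one latent platform disturbance, and the fitted loadings recover which channels carry the common factor. The channel model is synthetic, not GRAV-D data.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(0)
      n = 5000
      latent = rng.standard_normal(n)   # common platform disturbance
      channels = np.column_stack([
          0.9 * latent + 0.2 * rng.standard_normal(n),   # axis 1: strongly affected
          0.7 * latent + 0.5 * rng.standard_normal(n),   # axis 2: moderately affected
          0.1 * latent + 1.0 * rng.standard_normal(n),   # mostly independent channel
      ])

      fa = FactorAnalysis(n_components=1).fit(channels)
      print(fa.components_)       # loadings ~ +/-(0.9, 0.7, 0.1): the shared factor
      print(fa.noise_variance_)   # channel-specific residual variances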

  15. The Evidence-Based Practice of Applied Behavior Analysis.

    PubMed

    Slocum, Timothy A; Detrich, Ronnie; Wilczynski, Susan M; Spencer, Trina D; Lewis, Teri; Wolfe, Katie

    2014-05-01

    Evidence-based practice (EBP) is a model of professional decision-making in which practitioners integrate the best available evidence with client values/context and clinical expertise in order to provide services for their clients. This framework provides behavior analysts with a structure for pervasive use of the best available evidence in the complex settings in which they work. This structure recognizes the need for clear and explicit understanding of the strength of evidence supporting intervention options, the important contextual factors including client values that contribute to decision making, and the key role of clinical expertise in the conceptualization, intervention, and evaluation of cases. Opening the discussion of EBP in this journal, Smith (The Behavior Analyst, 36, 7-33, 2013) raised several key issues related to EBP and applied behavior analysis (ABA). The purpose of this paper is to respond to Smith's arguments and extend the discussion of the relevant issues. Although we support many of Smith's (The Behavior Analyst, 36, 7-33, 2013) points, we contend that Smith's definition of EBP is significantly narrower than definitions that are used in professions with long histories of EBP and that this narrowness conflicts with the principles that drive applied behavior analytic practice. We offer a definition and framework for EBP that aligns with the foundations of ABA and is consistent with well-established definitions of EBP in medicine, psychology, and other professions. In addition to supporting the systematic use of research evidence in behavior analytic decision making, this definition can promote clear communication about treatment decisions across disciplines and with important outside institutions such as insurance companies and granting agencies.

  16. Multitaper Spectral Analysis and Wavelet Denoising Applied to Helioseismic Data

    NASA Technical Reports Server (NTRS)

    Komm, R. W.; Gu, Y.; Hill, F.; Stark, P. B.; Fodor, I. K.

    1999-01-01

    Estimates of solar normal mode frequencies from helioseismic observations can be improved by using Multitaper Spectral Analysis (MTSA) to estimate spectra from the time series, then using wavelet denoising of the log spectra. MTSA leads to a power spectrum estimate with reduced variance and better leakage properties than the conventional periodogram. Under the assumption of stationarity and mild regularity conditions, the log multitaper spectrum has a statistical distribution that is approximately Gaussian, so wavelet denoising is asymptotically an optimal method to reduce the noise in the estimated spectra. We find that a single m-upsilon spectrum benefits greatly from MTSA followed by wavelet denoising, and that wavelet denoising by itself can be used to improve m-averaged spectra. We compare estimates using two different 5-taper estimates (Slepian and sine tapers) and the periodogram estimate, for GONG time series at selected angular degrees l. We compare those three spectra with and without wavelet denoising, both visually and in terms of the mode parameters estimated from the pre-processed spectra using the GONG peak-fitting algorithm. The two multitaper estimates give equivalent results. The number of modes fitted well by the GONG algorithm is 20% to 60% larger (depending on l and the temporal frequency) when applied to the multitaper estimates than when applied to the periodogram. The estimated mode parameters (frequency, amplitude and width) are comparable for the three power spectrum estimates, except for modes with very small mode widths (a few frequency bins), where the multitaper spectra broadened the modes compared with the periodogram. We tested the influence of the number of tapers used and found that narrow modes at low n values are broadened to the extent that they can no longer be fit if the number of tapers is too large. For helioseismic time series of this length and temporal resolution, the optimal number of tapers is less than 10.
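
    A multitaper estimate averages periodograms taken with orthogonal Slepian (DPSS) tapers, which is what reduces the variance relative to a single periodogram. A minimal sketch using scipy's dpss window, applied to a synthetic spectral line in noise rather than helioseismic data:

      import numpy as np
      from scipy.signal.windows import dpss

      def multitaper_psd(x, n_tapers=5, nw=3.0):
          # Average of periodograms from orthogonal, unit-energy Slepian tapers.
          n = len(x)
          tapers = dpss(n, NW=nw, Kmax=n_tapers)        # shape (n_tapers, n)
          spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
          return spectra.mean(axis=0)

      # Demo: a weak line at frequency bin 100 buried in white noise
      rng = np.random.default_rng(0)
      n = 4096
      x = 0.3 * np.sin(2 * np.pi * 100 * np.arange(n) / n) + rng.standard_normal(n)
      psd = multitaper_psd(x)
      print(np.argmax(psd[1:]) + 1)   # ~100, with lower variance than a periodogram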

  17. Analysis of labour risks in the Spanish industrial aerospace sector.

    PubMed

    Laguardia, Juan; Rubio, Emilio; Garcia, Ana; Garcia-Foncillas, Rafael

    2016-01-01

    Labour risk prevention is an activity integrated within Safety and Hygiene at Work in Spain. In 2003, the Electronic Declaration for Accidents at Work, Delt@ (DELTA), was introduced. The industrial aerospace sector is subject to various risks. Our objective is to analyse the Spanish Industrial Aerospace Sector (SIAS) using the ACSOM methodology to assess its labour risks and to prioritise preventive actions. The SIAS and the Services Subsector (SS) were defined and the relevant accident rate data were obtained. The ACSOM method was applied through double contrast (deviation and translocation) of the SIAS or SS risk polygon with the considered pattern, accidents from all sectors (ACSOM G) or the SIAS. A list of risks was obtained, ordered by action phases. In the SIAS vs. ACSOM G analysis, radiation risks were the worst, followed by overstrains. Accidents caused by living beings were also significant in the SS vs. SIAS analysis, which can be used to improve risk prevention. Radiation is the most significant risk in the SIAS and the SS. Preventive actions will be primary and secondary. ACSOM has shown itself to be a valid tool for the analysis of labour risks.

  18. "Applying" Conversation Analysis in Applied Linguistics: Evaluating English as a Second Language Textbook Dialogue.

    ERIC Educational Resources Information Center

    Wong, Jean

    This article examines English-as-a-Second-Language (ESL) textbook telephone dialogues against the backdrop of what is reported about real telephone interaction based on research in conversation analysis (CA). An analysis of eight ESL textbooks reveals that the fit between what conversation analysts say about the nature of natural telephone…

  19. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    PubMed

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.
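
    The persistence argument can be illustrated with simple arithmetic: treating proposals as independent, the chance of going unfunded for several consecutive years follows (1 - p)^n. The four-year tolerance below is an assumption used for illustration, not the authors' fitted model.

      def p_no_award(success_rate, n_proposals):
          # Chance of going 0-for-n, treating proposals as independent trials.
          return (1.0 - success_rate) ** n_proposals

      # If investigators quit after ~4 straight unfunded years (one proposal/year),
      # a 16% success rate leaves (1 - 0.16)**4 ~ 0.50 of them award-less by then,
      # consistent with the ~20% threshold discussed above.
      for rate in (0.10, 0.16, 0.20, 0.30):
          print(rate, round(p_no_award(rate, 4), 2))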

  20. To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits

    PubMed Central

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads. PMID:25738742

  1. CUMULATIVE RISK ANALYSIS FOR ORGANOPHOSPHORUS PESTICIDES

    EPA Science Inventory

    Cumulative Risk Analysis for Organophosphorus Pesticides
    R. Woodrow Setzer, Jr. NHEERL MD-74, USEPA, RTP, NC 27711

    The US EPA has recently completed a risk assessment of the effects of exposure to 33 organophosphorous pesticides (OPs) through the diet, water, and resi...

  2. CUMULATIVE RISK ANALYSIS FOR ORGANOPHOSPHORUS PESTICIDES

    EPA Science Inventory

    Cumulative Risk Analysis for Organophosphorus Pesticides
    R. Woodrow Setzer, Jr. NHEERL MD-74, USEPA, RTP, NC 27711

    The US EPA has recently completed a risk assessment of the effects of exposure to 33 organophosphorous pesticides (OPs) through the diet, water, and resi...

  3. Transportation scenarios for risk analysis.

    SciTech Connect

    Weiner, Ruth F.

    2010-09-01

    Transportation risk, like any risk, is defined by the risk triplet: what can happen (the scenario), how likely it is (the probability), and the resulting consequences. This paper evaluates the development of transportation scenarios, the associated probabilities, and the consequences. The most likely radioactive materials transportation scenario is routine, incident-free transportation, which has a probability indistinguishable from unity. Accident scenarios in radioactive materials transportation are of three different types: accidents in which there is no impact on the radioactive cargo, accidents in which some gamma shielding may be lost but there is no release of radioactive material, and accidents in which radioactive material may potentially be released. Accident frequencies, obtainable from recorded data validated by the U.S. Department of Transportation, are considered equivalent to accident probabilities in this study. Probabilities of different types of accidents are conditional probabilities, conditional on an accident occurring, and are developed from event trees. Development of all of these probabilities and the associated highway and rail accident event trees are discussed in this paper.
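
    A minimal sketch of the event-tree step: conditional branch fractions, given that an accident occurs, are multiplied by the accident frequency to obtain unconditional scenario probabilities. The three branches mirror the accident types named above, but the numbers are invented.

      def event_tree(p_accident, branches):
          # branches: {outcome: P(outcome | accident)}; fractions must sum to 1.
          assert abs(sum(branches.values()) - 1.0) < 1e-9
          scenarios = {out: p_accident * frac for out, frac in branches.items()}
          scenarios["incident-free"] = 1.0 - p_accident
          return scenarios

      # Hypothetical per-shipment highway figures, for illustration only
      print(event_tree(2e-6, {
          "no impact on cargo": 0.94,
          "shielding loss, no release": 0.05,
          "potential release": 0.01,
      }))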

  4. The ABC’s of Suicide Risk Assessment: Applying a Tripartite Approach to Individual Evaluations

    PubMed Central

    Harris, Keith M.; Syu, Jia-Jia; Lello, Owen D.; Chew, Y. L. Eileen; Willcox, Christopher H.; Ho, Roger H. M.

    2015-01-01

    There is considerable need for accurate suicide risk assessment for clinical, screening, and research purposes. This study applied the tripartite affect-behavior-cognition theory, the suicidal barometer model, classical test theory, and item response theory (IRT), to develop a brief self-report measure of suicide risk that is theoretically-grounded, reliable and valid. An initial survey (n = 359) employed an iterative process to an item pool, resulting in the six-item Suicidal Affect-Behavior-Cognition Scale (SABCS). Three additional studies tested the SABCS and a highly endorsed comparison measure. Studies included two online surveys (Ns = 1007 and 713) and one prospective clinical survey (n = 72; Time 2, n = 54). Factor analyses demonstrated SABCS construct validity through unidimensionality. Internal reliability was high (α = .86-.93, split-half = .90-.94). The scale was predictive of future suicidal behaviors and suicidality (r = .68, .73, respectively), showed convergent validity, and the SABCS-4 demonstrated clinically relevant sensitivity to change. IRT analyses revealed the SABCS captured more information than the comparison measure, and better defined participants at low, moderate, and high risk. The SABCS is the first suicide risk measure to demonstrate no differential item functioning by sex, age, or ethnicity. In all comparisons, the SABCS showed incremental improvements over a highly endorsed scale through stronger predictive ability, reliability, and other properties. The SABCS is in the public domain, with this publication, and is suitable for clinical evaluations, public screening, and research. PMID:26030590
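
    The internal-reliability figures quoted (alpha, split-half) come from classical test theory; the sketch below computes Cronbach's alpha on synthetic item scores driven by one latent trait, which is the unidimensional structure the factor analyses support. The data and loadings are simulated, not SABCS responses.

      import numpy as np

      def cronbach_alpha(items):
          # items: (n_respondents, n_items) score matrix
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_vars / total_var)

      rng = np.random.default_rng(0)
      trait = rng.standard_normal(300)
      # Six hypothetical items all loading on one latent trait
      items = np.column_stack([trait + 0.6 * rng.standard_normal(300) for _ in range(6)])
      print(round(cronbach_alpha(items), 2))   # high alpha, cf. .86-.93 in the record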

  5. Risk analysis of dust explosion scenarios using Bayesian networks.

    PubMed

    Yuan, Zhi; Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-02-01

    In this study, a methodology has been proposed for risk analysis of dust explosion scenarios based on Bayesian networks. Our methodology also benefits from a bow-tie diagram to better represent the logical relationships existing among contributing factors and consequences of dust explosions. In this study, the risks of dust explosion scenarios are evaluated, taking into account common cause failures and dependencies among root events and possible consequences. Using a diagnostic analysis, dust particle properties, oxygen concentration, and safety training of staff are identified as the most critical root events leading to dust explosions. The probability adaptation concept is also used for sequential updating and thus learning from past dust explosion accidents, which is of great importance in dynamic risk assessment and management. We also apply the proposed methodology to a case study to model dust explosion scenarios, to estimate the envisaged risks, and to identify the vulnerable parts of the system that need additional safety measures.
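
    A miniature version of the diagnostic computation: a hand-rolled Bayesian network over explosive dust cloud, ignition, and explosion, queried by enumeration for P(explosion) and the posterior P(ignition | explosion). The structure and conditional probabilities are invented and far smaller than the paper's bow-tie network.

      from itertools import product

      # Hypothetical priors and CPT (not the paper's values)
      p_dust = 0.05                 # P(explosive dust cloud present)
      p_ign = 0.02                  # P(ignition source present)
      p_exp = {(True, True): 0.8, (True, False): 0.01,
               (False, True): 0.0, (False, False): 0.0}

      def joint(d, i, e):
          # P(dust=d, ignition=i, explosion=e) under the network factorization
          pd = p_dust if d else 1.0 - p_dust
          pi = p_ign if i else 1.0 - p_ign
          pe = p_exp[(d, i)] if e else 1.0 - p_exp[(d, i)]
          return pd * pi * pe

      p_explosion = sum(joint(d, i, True) for d, i in product([True, False], repeat=2))
      p_ign_given_exp = sum(joint(d, True, True) for d in [True, False]) / p_explosion
      print(p_explosion, p_ign_given_exp)   # diagnostic update toward root causes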

  6. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... degree of protection for the data, e.g., unencrypted, plain text; (6) Time the data has been out of VA...

  7. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... degree of protection for the data, e.g., unencrypted, plain text; (6) Time the data has been out of VA...

  8. Cassini nuclear risk analysis with SPARRC

    NASA Astrophysics Data System (ADS)

    Ha, Chuong T.; Deane, Nelson A.

    1998-01-01

    The nuclear risk analysis of the Cassini mission is one of the most comprehensive risk analyses ever conducted for a space nuclear mission. The complexity of postulated accident scenarios and source term definitions, from launch to Earth swingby, has necessitated an extensive series of analyses in order to provide best-estimates of potential consequence results and bounding uncertainty intervals. The Space Accident Radiological Release and Consequence (SPARRC) family of codes, developed by Lockheed Martin to analyze polydispersed source terms and a combination of different atmospheric transport patterns, have been used for the Cassini Final Safety Analysis Report (FSAR). By identifying dominant contributors, the nuclear risk of each mission segment is understood with a high level of confidence. This paper provides the overall analysis process and insights developed from the risk analysis.

  9. Cassini nuclear risk analysis with SPARRC

    SciTech Connect

    Ha, Chuong T.; Deane, Nelson A.

    1998-01-15

    The nuclear risk analysis of the Cassini mission is one of the most comprehensive risk analyses ever conducted for a space nuclear mission. The complexity of postulated accident scenarios and source term definitions, from launch to Earth swingby, has necessitated an extensive series of analyses in order to provide best-estimates of potential consequence results and bounding uncertainty intervals. The Space Accident Radiological Release and Consequence (SPARRC) family of codes, developed by Lockheed Martin to analyze polydispersed source terms and a combination of different atmospheric transport patterns, have been used for the Cassini Final Safety Analysis Report (FSAR). By identifying dominant contributors, the nuclear risk of each mission segment is understood with a high level of confidence. This paper provides the overall analysis process and insights developed from the risk analysis.

  10. Cassini nuclear risk analysis with SPARRC

    SciTech Connect

    Ha, C.T.; Deane, N.A.

    1998-01-01

    The nuclear risk analysis of the Cassini mission is one of the most comprehensive risk analyses ever conducted for a space nuclear mission. The complexity of postulated accident scenarios and source term definitions, from launch to Earth swingby, has necessitated an extensive series of analyses in order to provide best-estimates of potential consequence results and bounding uncertainty intervals. The Space Accident Radiological Release and Consequence (SPARRC) family of codes, developed by Lockheed Martin to analyze polydispersed source terms and a combination of different atmospheric transport patterns, have been used for the Cassini Final Safety Analysis Report (FSAR). By identifying dominant contributors, the nuclear risk of each mission segment is understood with a high level of confidence. This paper provides the overall analysis process and insights developed from the risk analysis. © 1998 American Institute of Physics.

  11. Risk analysis approach. [of carbon fiber release

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    The assessment of the carbon fiber hazard is outlined. Program objectives, requirements of the risk analysis, and elements associated with the physical phenomena of the accidental release are described.

  12. Two-, three-, and four-factor PCL-R models in applied sex offender risk assessments.

    PubMed

    Weaver, Christopher M; Meyer, Robert G; Van Nort, James J; Tristan, Luciano

    2006-06-01

    The authors compared 2-, 3-, 4-factor, and 2-factor/4-facet Psychopathy Checklist-Revised (PCL-R) models in a previously unpublished sample of 1,566 adult male sex offenders assessed under applied clinical conditions as part of a comprehensive state-mandated community notification risk assessment procedure. "Testlets" significantly improved the performance of all models. The 3-factor model provided the best fit to the current data, followed by the 2-factor/4-facet model. The 2-factor model was not supported.

  13. Risk assessment of land-applied biosolids-borne triclocarban (TCC).

    PubMed

    Snyder, Elizabeth Hodges; O'Connor, George A

    2013-01-01

    Triclocarban (TCC) is monitored under the USEPA High Production Volume (HPV) chemical program and is predominantly used as the active ingredient in select antibacterial bar soaps and other personal care products. The compound commonly occurs at parts-per-million concentrations in processed wastewater treatment residuals (i.e. biosolids), which are frequently land-applied as fertilizers and soil conditioners. Human and ecological risk assessment parameters measured by the authors in previous studies were integrated with existing data to perform a two-tiered human health and ecological risk assessment of land-applied biosolids-borne TCC. The 14 exposure pathways identified in the Part 503 Biosolids Rule were expanded, and conservative screening-level hazard quotients (HQ values) were first calculated to estimate risk to humans and a variety of terrestrial and aquatic organisms (Tier 1). The majority of biosolids-borne TCC exposure pathways resulted in no screening-level HQ values indicative of significant risks to exposed organisms (including humans), even under worst-case land application scenarios. The two pathways for which the conservative screening-level HQ values exceeded one (i.e. Pathway 10: biosolids➔soil➔soil organism➔predator, and Pathway 16: biosolids➔soil➔surface water➔aquatic organism) were then reexamined using modified parameters and scenarios (Tier 2). Adjusted HQ values remained greater than one for Exposure Pathway 10, with the exception of the final adjusted HQ values under a one-time 5 Mg ha(-1) (agronomic) biosolids loading rate scenario for the American woodcock (Scolopax minor) and short-tailed shrew (Blarina brevicauda). Results were used to prioritize recommendations for future biosolids-borne TCC research, which include additional measurements of toxicological effects and TCC concentrations in environmental matrices at the field level.

  14. RISK ANALYSIS, ANALYSIS OF VARIANCE: GETTING MORE FROM OUR DATA

    USDA-ARS?s Scientific Manuscript database

    Analysis of variance (ANOVA) and regression are common statistical techniques used to analyze agronomic experimental data and determine significant differences among yields due to treatments or other experimental factors. Risk analysis provides an alternate and complementary examination of the same...
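
    As a rough illustration of how the two views complement each other, the sketch below runs a one-way ANOVA on simulated yields with scipy and then reads a simple risk measure (the probability of falling below a yield threshold) off the same data; all values are hypothetical.

```python
# Illustrative one-way ANOVA on hypothetical yield data for three treatments,
# using scipy.stats.f_oneway; a risk analysis complements this by asking how
# often each treatment falls below a yield threshold.

import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
treat_a = rng.normal(5.0, 0.8, size=12)   # t/ha, hypothetical
treat_b = rng.normal(5.6, 0.8, size=12)
treat_c = rng.normal(4.7, 0.8, size=12)

f_stat, p_value = f_oneway(treat_a, treat_b, treat_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# A simple "risk" view of the same data: probability of yield below 4.5 t/ha.
for name, y in [("A", treat_a), ("B", treat_b), ("C", treat_c)]:
    print(f"treatment {name}: P(yield < 4.5) ~= {np.mean(y < 4.5):.2f}")
```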

  15. Applying microscopy to the analysis of nuclear structure and function.

    PubMed

    Iborra, Francisco; Cook, Peter R; Jackson, Dean A

    2003-02-01

    One of the ultimate goals of biological research is to understand mechanisms of cell function within living organisms. With this in mind, many sophisticated technologies that allow us to inspect macromolecular structure in exquisite detail have been developed. Although knowledge of structure derived from techniques such as X-ray crystallography and nuclear magnetic resonance is of vital importance, these approaches cannot reveal the remarkable complexity of molecular interactions that exists in vivo. This review therefore focuses on the use of microscopy techniques to analyze cell structure and function. We describe the different basic microscopic methodologies and how the routine techniques are best applied to particular biological problems. We also emphasize the specific capabilities and uses of light and electron microscopy and highlight their individual advantages and disadvantages. For completion, we also comment on the alternative possibilities provided by a variety of advanced imaging technologies. We hope that this brief analysis of the undoubted power of microscopy techniques will be enough to stimulate a wider participation in this rapidly developing area of biological discovery.

  16. Applying importance-performance analysis to patient safety culture.

    PubMed

    Lee, Yii-Ching; Wu, Hsin-Hung; Hsieh, Wan-Lin; Weng, Shao-Jen; Hsieh, Liang-Po; Huang, Chih-Hsuan

    2015-01-01

    The Sexton et al. (2006) safety attitudes questionnaire (SAQ) has been widely used to assess staff attitudes towards patient safety in healthcare organizations. To date, however, few studies have discussed the perceptions of patient safety of both hospital staff and upper management. The purpose of this paper is to improve and develop better strategies regarding patient safety in healthcare organizations. The Chinese version of the SAQ, based on the Taiwan Joint Commission on Hospital Accreditation, is used to evaluate the perceptions of hospital staff. The current study then applies the importance-performance analysis technique to identify the major strengths and weaknesses of the safety culture. The results show that teamwork climate, safety climate, job satisfaction, stress recognition, and working conditions are major strengths and should be maintained in order to provide a better patient safety culture. On the contrary, perceptions of management and hospital handoffs and transitions are important weaknesses and should be improved immediately. Research limitations/implications - The research is restricted in generalizability. The assessment of hospital staff in patient safety culture covers physicians and registered nurses. It would be interesting to further evaluate other staff's (e.g. technicians, pharmacists and others) opinions regarding patient safety culture in the hospital. Few studies have clearly evaluated the perceptions of healthcare organization management regarding patient safety culture. By investigating key characteristics (either strengths or weaknesses) that healthcare organizations should focus on, healthcare managers are enabled to take more effective actions to improve the level of patient safety.
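
    A minimal sketch of the importance-performance classification is given below: each SAQ dimension is assigned to a quadrant by comparing hypothetical importance and performance scores against the grand means, so the resulting labels are illustrative rather than the paper's findings.

```python
# Minimal importance-performance analysis (IPA) sketch: each dimension is
# placed in a quadrant by comparing its importance and performance scores
# with the grand means. Scores are hypothetical illustrations.

dimensions = {
    # dimension: (importance, performance) on a 1-5 scale
    "teamwork climate":          (4.5, 4.2),
    "safety climate":            (4.4, 4.1),
    "job satisfaction":          (4.3, 4.0),
    "stress recognition":        (3.8, 3.9),
    "working conditions":        (4.1, 4.0),
    "perceptions of management": (4.3, 3.2),
    "handoffs and transitions":  (4.2, 3.1),
}

mean_imp = sum(i for i, _ in dimensions.values()) / len(dimensions)
mean_perf = sum(p for _, p in dimensions.values()) / len(dimensions)

for name, (imp, perf) in dimensions.items():
    if imp >= mean_imp and perf >= mean_perf:
        quadrant = "keep up the good work"
    elif imp >= mean_imp:
        quadrant = "concentrate here"     # important but underperforming
    elif perf >= mean_perf:
        quadrant = "possible overkill"
    else:
        quadrant = "low priority"
    print(f"{name}: {quadrant}")
```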

  17. Applying DNA computation to intractable problems in social network analysis.

    PubMed

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields such as Web 2.0 for Web applications and product developments in industries, etc. However, some definitions of SNA, such as finding a clique, N-clique, N-clan, N-club and K-plex, are NP-complete problems, which are not easily solved via traditional computer architecture. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. Their accuracy and feasible time complexities discussed in the paper will demonstrate that DNA computing can be used to facilitate the development of SNA. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
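
    For contrast with the DNA-computing approach, the sketch below enumerates maximal cliques on a toy social network with networkx; it is exactly this kind of exhaustive search that becomes intractable on conventional architectures as networks grow.

```python
# Conventional enumeration of maximal cliques on a toy social network using
# networkx. The friendship data are invented for illustration.

import networkx as nx

friendships = [("ann", "bob"), ("ann", "cat"), ("bob", "cat"),
               ("cat", "dan"), ("dan", "eve"), ("bob", "dan")]
G = nx.Graph(friendships)

for clique in nx.find_cliques(G):   # maximal cliques
    print(sorted(clique))
```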

  18. Local Bifurcation and Instability Theory Applied to Formability Analysis

    NASA Astrophysics Data System (ADS)

    Buco, B. D.; Oliveira, M. C.; Alves, J. L.; Menezes, L. F.; Ito, K.; Mori, N.

    2010-06-01

    Sheet metal forming components are typically studied with the aid of finite element method based virtual tryout tools, since they save money, time, and effort in the design, production, and process set-up of deep drawn parts. In all these development phases the analysis of defects is performed with the aid of the material forming limit diagram (FLD), since it allows defining a safe region that reduces the occurrence of: (i) necking; (ii) wrinkling; and (iii) large deformation. It is known that the FLD represented in the strain space presents some disadvantages. The local bifurcation criterion proposed by Ito and Goya defines the critical state for a local bifurcation to set in as a function of the stress level to work-hardening rate ratio. Thus, the main advantage is that the FLD represented in the stress plane is completely objective [1]. In this work the Ito and Goya model is used to evaluate formability, as well as fracture mode and direction, along different strain paths: (i) uniaxial tension; (ii) equibiaxial stretch; and (iii) plane strain. All numerical simulations are performed with the in-house codes DD3IMP [2, 3] and NXT [4], in which the Ito and Goya model is implemented, and the model is applied to analyze the results.

  19. Can Link Analysis Be Applied to Identify Behavioral Patterns in Train Recorder Data?

    PubMed

    Strathie, Ailsa; Walker, Guy H

    2016-03-01

    A proof-of-concept analysis was conducted to establish whether link analysis could be applied to data from on-train recorders to detect patterns of behavior that could act as leading indicators of potential safety issues. On-train data recorders capture data about driving behavior on thousands of routine journeys every day and offer a source of untapped data that could be used to offer insights into human behavior. Data from 17 journeys undertaken by six drivers on the same route over a 16-hr period were analyzed using link analysis, and four key metrics were examined: number of links, network density, diameter, and sociometric status. The results established that link analysis can be usefully applied to data captured from on-vehicle recorders. The four metrics revealed key differences in normal driver behavior. These differences have promising construct validity as leading indicators. Link analysis is one method that could be usefully applied to exploit data routinely gathered by on-vehicle data recorders. It facilitates a proactive approach to safety based on leading indicators, offers a clearer understanding of what constitutes normal driving behavior, and identifies trends at the interface of people and systems, which is currently a key area of strategic risk. These research findings have direct applications in the field of transport data monitoring. They offer a means of automatically detecting patterns in driver behavior that could act as leading indicators of problems during operation and that could be used in the proactive monitoring of driver competence, risk management, and even infrastructure design. © 2015, Human Factors and Ergonomics Society.
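
    The four metrics are straightforward to compute with a standard graph library. The sketch below builds a toy event-transition network (the node names are invented, not taken from the on-train recorder data) and evaluates links, density, diameter, and a simple sociometric status.

```python
# Sketch of the four link-analysis metrics on a toy driver-event network,
# using networkx. Nodes are hypothetical driving events; edges are observed
# transitions between consecutive events in a journey.

import networkx as nx

transitions = [
    ("power on", "accelerate"), ("accelerate", "coast"),
    ("coast", "brake"), ("brake", "coast"), ("coast", "horn"),
    ("brake", "power off"),
]
G = nx.DiGraph(transitions)

n = G.number_of_nodes()
print("number of links:", G.number_of_edges())
print("network density:", round(nx.density(G), 3))
# diameter is defined here on the connected undirected view of the network
print("diameter:", nx.diameter(G.to_undirected()))
# sociometric status taken as normalised total degree (in + out) per event
status = {v: (G.in_degree(v) + G.out_degree(v)) / (n - 1) for v in G}
print("sociometric status:", status)
```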

  20. Filling Terrorism Gaps: VEOs, Evaluating Databases, and Applying Risk Terrain Modeling to Terrorism

    SciTech Connect

    Hagan, Ross F.

    2016-08-29

    This paper aims to address three issues: the lack of literature differentiating terrorism and violent extremist organizations (VEOs), terrorism incident databases, and the applicability of Risk Terrain Modeling (RTM) to terrorism. Current open source literature and publicly available government sources do not differentiate between terrorism and VEOs; furthermore, they fail to define them. Addressing the lack of a comprehensive comparison of existing terrorism data sources, a matrix comparing a dozen terrorism databases is constructed, providing insight toward the array of data available. RTM, a method for spatial risk analysis at a micro level, has some applicability to terrorism research, particularly for studies looking at risk indicators of terrorism. Leveraging attack data from multiple databases, combined with RTM, offers one avenue for closing existing research gaps in terrorism literature.

  1. Adversarial risk analysis for counterterrorism modeling.

    PubMed

    Rios, Jesus; Rios Insua, David

    2012-05-01

    Recent large-scale terrorist attacks have raised interest in models for resource allocation against terrorist threats. The unifying theme in this area is the need to develop methods for the analysis of allocation decisions when risks stem from the intentional actions of intelligent adversaries. Most approaches to these problems have a game-theoretic flavor although there are also several interesting decision-analytic-based proposals. One of them is the recently introduced framework for adversarial risk analysis, which deals with decision-making problems that involve intelligent opponents and uncertain outcomes. We explore how adversarial risk analysis addresses some standard counterterrorism models: simultaneous defend-attack models, sequential defend-attack-defend models, and sequential defend-attack models with private information. For each model, we first assess critically what would be a typical game-theoretic approach and then provide the corresponding solution proposed by the adversarial risk analysis framework, emphasizing how to coherently assess a predictive probability model of the adversary's actions, in a context in which we aim at supporting decisions of a defender versus an attacker. This illustrates the application of adversarial risk analysis to basic counterterrorism models that may be used as basic building blocks for more complex risk analysis of counterterrorism problems.
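
    A stripped-down sequential defend-attack calculation in the ARA spirit is sketched below: the defender maximises expected utility against a predictive probability model of the attacker's response. The defences, probabilities, and utilities are hypothetical.

```python
# Toy defend-attack sketch: the defender chooses the option maximising
# expected utility under a predictive probability model of the attacker's
# response to each defence. All numbers are hypothetical.

defences = {
    # defence: {attack: (predictive probability given the defence,
    #                    defender utility if that attack happens)}
    "harden site A": {"attack A": (0.2, -10), "attack B": (0.5, -6), "none": (0.3, 0)},
    "harden site B": {"attack A": (0.6, -8),  "attack B": (0.1, -9), "none": (0.3, 0)},
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes.values())

best = max(defences, key=lambda d: expected_utility(defences[d]))
for d, outcomes in defences.items():
    print(f"{d}: EU = {expected_utility(outcomes):.2f}")
print("choose:", best)
```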

  2. Applying Geostatistical Analysis to Crime Data: Car-Related Thefts in the Baltic States

    PubMed Central

    Kerry, Ruth; Goovaerts, Pierre; Haining, Robert P.; Ceccato, Vania

    2011-01-01

    Geostatistical methods have rarely been applied to area-level offense data. This article demonstrates their potential for improving the interpretation and understanding of crime patterns using previously analyzed data about car-related thefts for Estonia, Latvia, and Lithuania in 2000. The variogram is used to inform about the scales of variation in offense, social, and economic data. Area-to-area and area-to-point Poisson kriging are used to filter the noise caused by the small number problem. The latter is also used to produce continuous maps of the estimated crime risk (expected number of crimes per 10,000 habitants), thereby reducing the visual bias of large spatial units. In seeking to detect the most likely crime clusters, the uncertainty attached to crime risk estimates is handled through a local cluster analysis using stochastic simulation. Factorial kriging analysis is used to estimate the local- and regional-scale spatial components of the crime risk and explanatory variables. Then regression modeling is used to determine which factors are associated with the risk of car-related theft at different scales. PMID:22190762
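
    The variogram computation underlying this kind of analysis is compact enough to sketch by hand. The example below estimates an empirical semivariogram on synthetic areal data; it is only the first step, not the Poisson-kriging machinery used in the paper.

```python
# Hand-rolled empirical semivariogram sketch (no geostatistics library):
# gamma(h) = (1 / (2 * |N(h)|)) * sum over pairs at lag h of (z_i - z_j)^2.
# Coordinates and rates are synthetic stand-ins for areal crime-rate data.

import numpy as np

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(80, 2))        # unit centroids, km
z = rng.gamma(shape=2.0, scale=5.0, size=80)  # crime rate per 10,000

d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
sq = (z[:, None] - z[None, :]) ** 2
bins = np.arange(0, 60, 10)                   # lag bins, km

for lo, hi in zip(bins[:-1], bins[1:]):
    mask = np.triu((d >= lo) & (d < hi), k=1)  # each pair counted once
    if mask.any():
        gamma = sq[mask].sum() / (2 * mask.sum())
        print(f"lag {lo:2.0f}-{hi:2.0f} km: gamma = {gamma:6.2f} ({mask.sum()} pairs)")
```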

  3. Fire Risk Implications in Safety Analysis Reports

    SciTech Connect

    Blanchard, A.

    1999-03-31

    Fire can be a significant risk for facilities that store and handle radiological material. Such events must be evaluated as part of a comprehensive safety analysis. SRS has been developing methods to evaluate radiological fire risk in such facilities. These methods, combined with the analysis techniques proposed by DOE-STD-3009-94, have provided a better understanding of how fire risks in nuclear facilities should be managed. To ensure that these new insights are properly disseminated, the DOE Savannah River Office and the Defense Nuclear Facilities Safety Board (DNFSB) requested that Westinghouse Savannah River Company (WSRC) prepare this paper.

  4. Fuzzy risk analysis for nuclear safeguards

    SciTech Connect

    Zardecki, A.

    1993-01-01

    Analysis of a safeguards system based on the notion of fuzzy sets and linguistic variables addresses concerns such as complexity and inherent imprecision in estimating the possibility of loss or compromise. The automated risk analysis allows the risk to be determined for an entire system based on estimates for the lowest level components and the component proportion. In addition, for each component (asset) the most effective combination of protection mechanisms against a given set of threats is determined. A distinction between bare and featured risk is made.

  5. Causal modelling applied to the risk assessment of a wastewater discharge.

    PubMed

    Paul, Warren L; Rokahr, Pat A; Webb, Jeff M; Rees, Gavin N; Clune, Tim S

    2016-03-01

    Bayesian networks (BNs), or causal Bayesian networks, have become quite popular in ecological risk assessment and natural resource management because of their utility as a communication and decision-support tool. Since their development in the field of artificial intelligence in the 1980s, however, Bayesian networks have evolved and merged with structural equation modelling (SEM). Unlike BNs, which are constrained to encode causal knowledge in conditional probability tables, SEMs encode this knowledge in structural equations, which is thought to be a more natural language for expressing causal information. This merger has clarified the causal content of SEMs and generalised the method such that it can now be performed using standard statistical techniques. As it was with BNs, the utility of this new generation of SEM in ecological risk assessment will need to be demonstrated with examples to foster an understanding and acceptance of the method. Here, we applied SEM to the risk assessment of a wastewater discharge to a stream, with a particular focus on the process of translating a causal diagram (conceptual model) into a statistical model which might then be used in the decision-making and evaluation stages of the risk assessment. The process of building and testing a spatial causal model is demonstrated using data from a spatial sampling design, and the implications of the resulting model are discussed in terms of the risk assessment. It is argued that a spatiotemporal causal model would have greater external validity than the spatial model, enabling broader generalisations to be made regarding the impact of a discharge, and greater value as a tool for evaluating the effects of potential treatment plant upgrades. Suggestions are made on how the causal model could be augmented to include temporal as well as spatial information, including suggestions for appropriate statistical models and analyses.

  6. Relative risk regression analysis of epidemiologic data.

    PubMed

    Prentice, R L

    1985-11-01

    Relative risk regression methods are described. These methods provide a unified approach to a range of data analysis problems in environmental risk assessment and in the study of disease risk factors more generally. Relative risk regression methods are most readily viewed as an outgrowth of Cox's regression and life model. They can also be viewed as a regression generalization of more classical epidemiologic procedures, such as that due to Mantel and Haenszel. In the context of an epidemiologic cohort study, relative risk regression methods extend conventional survival data methods and binary response (e.g., logistic) regression models by taking explicit account of the time to disease occurrence while allowing arbitrary baseline disease rates, general censorship, and time-varying risk factors. This latter feature is particularly relevant to many environmental risk assessment problems wherein one wishes to relate disease rates at a particular point in time to aspects of a preceding risk factor history. Relative risk regression methods also adapt readily to time-matched case-control studies and to certain less standard designs. The uses of relative risk regression methods are illustrated and the state of development of these procedures is discussed. It is argued that asymptotic partial likelihood estimation techniques are now well developed in the important special case in which the disease rates of interest have interpretations as counting process intensity functions. Estimation of relative risk processes corresponding to disease rates falling outside this class has, however, received limited attention. The general area of relative risk regression model criticism has, as yet, not been thoroughly studied, though a number of statistical groups are studying such features as tests of fit, residuals, diagnostics and graphical procedures. Most such studies have been restricted to exponential form relative risks as have simulation studies of relative risk estimation
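
    As a concrete instance of the Cox-type models discussed above, the sketch below fits a proportional hazards regression with the lifelines package on synthetic survival data; the exp(coef) column of the summary gives the estimated relative risks.

```python
# Hedged sketch of relative risk (Cox proportional hazards) regression using
# the lifelines package; the data frame below is entirely synthetic.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 200
exposure = rng.binomial(1, 0.4, n)            # 1 = exposed to risk factor
age = rng.normal(55, 8, n)
# exposure and age raise the hazard; times drawn from an exponential model
rate = 0.01 * np.exp(0.7 * exposure + 0.03 * (age - 55))
time = rng.exponential(1 / rate)
event = time < 10                             # administrative censoring at 10 y
df = pd.DataFrame({"T": np.minimum(time, 10), "E": event.astype(int),
                   "exposure": exposure, "age": age})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
cph.print_summary()   # exp(coef) column is the estimated relative risk
```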

  7. Risk of boron and heavy metal pollution from agro-industrial wastes applied for plant nutrition.

    PubMed

    Seçer, Müzeyyen; Ceylan, Safak; Elmaci, Omer Lütfü; Akdemir, Hüseyin

    2010-09-01

    In this study, the effects of various agro-industrial wastes, applied to soil alone or in combination with chemical fertilizers, were investigated with regard to the risks of boron and heavy metal pollution of soils and plants. Nine combinations of production residues from various agro-industries, urban wastes, and mineral fertilizers were applied to potatoes in a field experiment. The content of available boron in the soil differed significantly (p < 0.05) among the applications. Generally, B values were found to be slightly higher when soapstock, prina, and blood were used alone or in combination. Although total Co, Cd, and Pb contents of soils showed no significant differences between the applications, Cr content differed significantly (p < 0.05). No pollution risk was observed in soil with respect to total Co, Cd, Pb, and Cr contents. The amounts of boron and heavy metals in leaves showed no significant differences among the applications. Cobalt, Cd, and Pb in leaves were at normal levels, whereas Cr was slightly above normal but well under the critical level. Boron was low in tubers and varied significantly between applications, as did Co and Cd. The Co content of tubers was high, Cd and Cr contents were below average, and Pb content was between the given values.

  8. A method for determining weights for excess relative risk and excess absolute risk when applied in the calculation of lifetime risk of cancer from radiation exposure.

    PubMed

    Walsh, Linda; Schneider, Uwe

    2013-03-01

    Radiation-related risks of cancer can be transported from one population to another population at risk, for the purpose of calculating lifetime risks from radiation exposure. Transfer via excess relative risks (ERR) or excess absolute risks (EAR) or a mixture of both (i.e., from the life span study (LSS) of Japanese atomic bomb survivors) has been done in the past based on qualitative weighting. Consequently, the values of the weights applied and the method of application of the weights (i.e., as additive or geometric weighted means) have varied both between reports produced at different times by the same regulatory body and also between reports produced at similar times by different regulatory bodies. Since the gender and age patterns are often markedly different between EAR and ERR models, it is useful to have an evidence-based method for determining the relative goodness of fit of such models to the data. This paper identifies a method, using Akaike model weights, which could aid expert judgment and be applied to help to achieve consistency of approach and quantitative evidence-based results in future health risk assessments. The results of applying this method to recent LSS cancer incidence models are that the relative EAR weighting by cancer solid cancer site, on a scale of 0-1, is zero for breast and colon, 0.02 for all solid, 0.03 for lung, 0.08 for liver, 0.15 for thyroid, 0.18 for bladder and 0.93 for stomach. The EAR weighting for female breast cancer increases from 0 to 0.3, if a generally observed change in the trend between female age-specific breast cancer incidence rates and attained age, associated with menopause, is accounted for in the EAR model. Application of this method to preferred models from a study of multi-model inference from many models fitted to the LSS leukemia mortality data, results in an EAR weighting of 0. From these results it can be seen that lifetime risk transfer is most highly weighted by EAR only for stomach cancer. However
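
    The Akaike-weight computation itself is compact once the AIC values are in hand, as sketched below with placeholder AIC values (not those of the LSS fits).

```python
# Sketch of Akaike model weights as an evidence-based ERR/EAR weighting:
# w_i = exp(-0.5 * delta_i) / sum_j exp(-0.5 * delta_j),
# where delta_i = AIC_i - min AIC. AIC values below are placeholders.

import numpy as np

aic = {"ERR model": 1012.3, "EAR model": 1017.5}  # hypothetical fits
deltas = np.array(list(aic.values())) - min(aic.values())
weights = np.exp(-0.5 * deltas) / np.exp(-0.5 * deltas).sum()

for name, w in zip(aic, weights):
    print(f"{name}: Akaike weight = {w:.3f}")
```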

  9. Risk analysis for plant-made vaccines.

    PubMed

    Kirk, Dwayne D; McIntosh, Kim; Walmsley, Amanda M; Peterson, Robert K D

    2005-08-01

    The production of vaccines in transgenic plants was first proposed in 1990; however, no product has yet reached commercialization. There are several risks during the production and delivery stages of this technology, with potential impact on the environment and on human health. Risks to the environment include gene transfer and exposure to antigens or selectable marker proteins. Risks to human health include oral tolerance, allergenicity, inconsistent dosage, worker exposure, and unintended exposure to antigens or selectable marker proteins in the food chain. These risks are controllable through appropriate regulatory measures at all stages of production and distribution of a potential plant-made vaccine. Successful use of this technology is highly dependent on stewardship and active risk management by the developers of this technology, and through quality standards for production, which will be set by regulatory agencies. Regulatory agencies can also negatively affect the future viability of this technology by requiring that all risks must be controlled, or by applying conventional regulations which are overly cumbersome for a plant production and oral delivery system. The value of new or replacement vaccines produced in plant cells and delivered orally must be considered alongside the probability and severity of potential risks in their production and use, and the cost of not deploying this technology--the risk of continuing with the status quo alternative.

  10. Use, fate and ecological risks of antibiotics applied in tilapia cage farming in Thailand.

    PubMed

    Rico, Andreu; Oliveira, Rhaul; McDonough, Sakchai; Matser, Arrienne; Khatikarn, Jidapa; Satapornvanit, Kriengkrai; Nogueira, António J A; Soares, Amadeu M V M; Domingues, Inês; Van den Brink, Paul J

    2014-08-01

    The use, environmental fate and ecological risks of antibiotics applied in tilapia cage farming were investigated in the Tha Chin and Mun rivers in Thailand. Information on antibiotic use was collected through interviewing 29 farmers, and the concentrations of the most commonly used antibiotics, oxytetracycline (OTC) and enrofloxacin (ENR), were monitored in river water and sediment samples. Moreover, we assessed the toxicity of OTC and ENR on tropical freshwater invertebrates and performed a risk assessment for aquatic ecosystems. All interviewed tilapia farmers reported to routinely use antibiotics. Peak water concentrations for OTC and ENR were 49 and 1.6 μg/L, respectively. Antibiotics were most frequently detected in sediments with concentrations up to 6908 μg/kg d.w. for OTC, and 2339 μg/kg d.w. for ENR. The results of this study indicate insignificant short-term risks for primary producers and invertebrates, but suggest that the studied aquaculture farms constitute an important source of antibiotic pollution. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. From pathways to people: applying the adverse outcome pathway (AOP) for skin sensitization to risk assessment.

    PubMed

    MacKay, Cameron; Davies, Michael; Summerfield, Vicki; Maxwell, Gavin

    2013-01-01

    Consumer safety risk assessment of skin sensitization requires information on both consumer exposure to the ingredient through product use and the hazardous properties of the ingredient. Significant progress has been made in determining the hazard potential of ingredients without animal testing. However, hazard identification is insufficient for risk assessment, and an understanding of the dose-response is needed. Obtaining such knowledge without animal testing is challenging and requires applying available mechanistic knowledge to both assay development and the integration of these data. The recent OECD report "The Adverse Outcome Pathway for Skin Sensitization Initiated by Covalent Binding to Proteins" presents the available mechanistic knowledge of the sensitization response within an adverse outcome pathway (AOP). We propose to use this AOP as the mechanistic basis for physiologically- and mechanistically-based toxicokinetic-toxicodynamic models of the sensitization response. The approach would be informed by non-animal data, provide predictions of the dose-response required for risk assessment, and would be evaluated against human clinical data.

  12. Risk Analysis Training within the Army: Current Status, Future Trends,

    DTIC Science & Technology

    risk analysis. Since risk analysis training in the Army is...become involved in risk analysis training. He reviews all risk analysis-related training done in any course at the Center. Also provided is information...expected to use the training. Then the future trend in risk analysis training is presented. New courses, course changes, and hardware/software changes that will make risk analysis more palatable are

  13. Terrestrial ecological risk evaluation for triclosan in land-applied biosolids.

    PubMed

    Fuchsman, Phyllis; Lyndall, Jennifer; Bock, Michael; Lauren, Darrel; Barber, Timothy; Leigh, Katrina; Perruchon, Elyse; Capdevielle, Marie

    2010-07-01

    Triclosan is an antimicrobial compound found in many consumer products including soaps and personal care products. Most triclosan is disposed of down household drains, whereupon it is conveyed to wastewater treatment plants. Although a high percentage of triclosan biodegrades during wastewater treatment, most of the remainder is adsorbed to sludge, which may ultimately be applied to land as biosolids. We evaluated terrestrial ecological risks related to triclosan in land-applied biosolids for soil microbes, plants, soil invertebrates, mammals, and birds. Exposures are estimated using a probabilistic fugacity-based model. Triclosan concentrations in biosolids and reported biosolids application rates are compiled to support estimation of triclosan concentrations in soil. Concentrations in biota tissue are estimated using an equilibrium partitioning model for plants and worms and a steady-state model for small mammals; the resulting tissue concentrations are used to model mammalian and avian dietary exposures. Toxicity benchmarks are identified from a review of published and proprietary studies. The results indicate that adverse effects related to soil fertility (i.e., disruption of nitrogen cycling) would be expected only under "worst-case" exposures, under certain soil conditions and would likely be transient. The available data indicate that adverse effects on plants, invertebrates, birds, and mammals due to triclosan in land-applied biosolids are unlikely. © 2010 SETAC.

  14. Risk-stratified imputation in survival analysis.

    PubMed

    Kennedy, Richard E; Adragni, Kofi P; Tiwari, Hemant K; Voeks, Jenifer H; Brott, Thomas G; Howard, George

    2013-08-01

    Censoring that is dependent on covariates associated with survival can arise in randomized trials due to changes in recruitment and eligibility criteria to minimize withdrawals, potentially leading to biased treatment effect estimates. Imputation approaches have been proposed to address censoring in survival analysis; while these approaches may provide unbiased estimates of treatment effects, imputation of a large number of outcomes may over- or underestimate the associated variance based on the imputation pool selected. We propose an improved method, risk-stratified imputation, as an alternative to address withdrawal related to the risk of events in the context of time-to-event analyses. Our algorithm performs imputation from a pool of replacement subjects with similar values of both treatment and covariate(s) of interest, that is, from a risk-stratified sample. This stratification prior to imputation addresses the requirement of time-to-event analysis that censored observations are representative of all other observations in the risk group with similar exposure variables. We compared our risk-stratified imputation to case deletion and bootstrap imputation in a simulated dataset in which the covariate of interest (study withdrawal) was related to treatment. A motivating example from a recent clinical trial is also presented to demonstrate the utility of our method. In our simulations, risk-stratified imputation gives estimates of treatment effect comparable to bootstrap and auxiliary variable imputation while avoiding inaccuracies of the latter two in estimating the associated variance. Similar results were obtained in analysis of clinical trial data. Risk-stratified imputation has little advantage over other imputation methods when covariates of interest are not related to treatment. Risk-stratified imputation is intended for categorical covariates and may be sensitive to the width of the matching window if continuous covariates are used. The use of the risk
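
    A minimal sketch of the idea, assuming a donor-sampling implementation: each censored subject receives an outcome drawn from uncensored subjects in the same treatment-by-covariate stratum whose event times exceed the withdrawal time. The data and the sampling rule are illustrative, not the authors' algorithm.

```python
# Illustrative risk-stratified imputation sketch: a withdrawn (censored)
# subject's time is imputed from subjects in the same treatment-by-covariate
# stratum who are still at risk at the withdrawal time. Synthetic data.

import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 500
df = pd.DataFrame({
    "treat": rng.binomial(1, 0.5, n),
    "risk_cov": rng.binomial(1, 0.3, n),   # covariate tied to withdrawal
    "time": rng.exponential(5, n).round(2),
    "event": 1,
})
withdrawn = rng.random(n) < 0.1 * (1 + df["risk_cov"])  # covariate-dependent
df.loc[withdrawn, "event"] = 0

for (t, c), idx in df[df["event"] == 0].groupby(["treat", "risk_cov"]).groups.items():
    pool = df[(df["treat"] == t) & (df["risk_cov"] == c) & (df["event"] == 1)]
    for i in idx:
        donors = pool[pool["time"] > df.at[i, "time"]]  # donors still at risk
        if len(donors):
            donor = donors.sample(1, random_state=int(rng.integers(1_000_000))).iloc[0]
            df.at[i, "time"], df.at[i, "event"] = donor["time"], 1
print("event fraction after imputation:", df["event"].mean())
```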

  15. Moving beyond regression techniques in cardiovascular risk prediction: applying machine learning to address analytic challenges.

    PubMed

    Goldstein, Benjamin A; Navar, Ann Marie; Carter, Rickey E

    2016-07-19

    Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for the development of risk prediction models. Typically presented as black box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider trying to predict mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction for those working on risk modelling to approach the diffuse field of machine learning.
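
    The sketch below illustrates the kind of comparison the review walks through: a logistic regression versus a tree ensemble on simulated data with 13 "laboratory markers" carrying a non-linear, interacting signal. The data-generating process is invented for illustration.

```python
# Minimal sketch contrasting logistic regression with a tree ensemble for
# mortality prediction from routine laboratory markers. Data are simulated;
# the paper's EHR data are not reproduced here.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, p = 2000, 13                      # 13 lab markers, as in the review
X = rng.normal(size=(n, p))
# simulated non-linear, interacting signal that plain regression may miss
logit = 0.8 * X[:, 0] * X[:, 1] + np.where(X[:, 2] > 1, 1.5, 0.0) - 1.0
y = rng.random(n) < 1 / (1 + np.exp(-logit))

for name, model in [("logistic", LogisticRegression(max_iter=1000)),
                    ("boosting", GradientBoostingClassifier())]:
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: CV AUC = {auc:.3f}")
```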

  16. Phase plane analysis: applying chaos theory in health care.

    PubMed

    Priesmeyer, H R; Sharp, L F

    1995-01-01

    This article applies the new science of nonlinearity to administrative issues and accounts receivable management in health care, and it provides a new perspective on common operating and quality control measures.

  17. Risk Interfaces to Support Integrated Systems Analysis and Development

    NASA Technical Reports Server (NTRS)

    Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark; Anton, Wilma; Havenhill, Maria

    2016-01-01

    Objectives for the systems analysis capability: develop an integrated understanding of how a complex human physiological-socio-technical mission system behaves in spaceflight. Why? To support development of integrated solutions that prevent unwanted outcomes (implementable approaches to minimize mission resources such as mass, power, and crew time), and to support development of tools for autonomy, a need for exploration (assess and maintain resilience of individuals, teams, and the integrated system). Outputs of this exercise: a representation of interfaces based on Human System Risk Board (HSRB) Risk Summary information and simple status based on the Human Research Roadmap; consolidated HSRB information applied to support communication; a point of departure for HRP Element planning; and the ability to track and communicate the status of collaborations.

  18. Confronting deep uncertainties in risk analysis.

    PubMed

    Cox, Louis Anthony

    2012-10-01

    How can risk analysts help to improve policy and decision making when the correct probabilistic relation between alternative acts and their probable consequences is unknown? This practical challenge of risk management with model uncertainty arises in problems from preparing for climate change to managing emerging diseases to operating complex and hazardous facilities safely. We review constructive methods for robust and adaptive risk analysis under deep uncertainty. These methods are not yet as familiar to many risk analysts as older statistical and model-based methods, such as the paradigm of identifying a single "best-fitting" model and performing sensitivity analyses for its conclusions. They provide genuine breakthroughs for improving predictions and decisions when the correct model is highly uncertain. We demonstrate their potential by summarizing a variety of practical risk management applications.

  19. [The precautionary principle applied to blood transfusion. What is its impact on practices and risk management?].

    PubMed

    Hergon, E; Moutel, G; Duchange, N; Bellier, L; Hervé, C; Rouger, P

    2004-07-01

    The precautionary principle rose to prominence in the French public health sector through blood transfusion. There has been, however, no prior reflection on the definition, objectives, methods of application, or consequences of this principle. The question of the pertinence of its application remains unanswered. This study, based on interviews with blood transfusion practitioners, aims to establish their perceptions of the precautionary principle's application in this specific field and of its consequences in terms of risk management and patients' rights. The pros and cons of this application are analysed based on these perceptions. According to our analysis, the precautionary principle seems to be born of confusion. It is seen more as a way to protect decision makers than patients and, if taken to extremes, could prejudice medical logic. Nevertheless, it also brings measures which renew and encourage evolution in transfusion risk management.

  20. Foodborne zoonoses due to meat: a quantitative approach for a comparative risk assessment applied to pig slaughtering in Europe.

    PubMed

    Fosse, Julien; Seegers, Henri; Magras, Catherine

    2008-01-01

    Foodborne zoonoses have a major health impact in industrialised countries. New European food safety regulations were issued to apply risk analysis to the food chain. The severity of foodborne zoonoses and the exposure of humans to biological hazards transmitted by food must be assessed. For meat, inspection at the slaughterhouse is historically the main means of control to protect consumers. However, the levels of detection of biological hazards during meat inspection have not been established in quantitative terms yet. Pork is the most frequently consumed meat in Europe. The aim of this study was to provide elements for quantifying levels of risk for pork consumers and lack of detection by meat inspection. Information concerning hazard identification and characterisation was obtained by the compilation and statistical analysis of data from 440 literature references. The incidence and severity of human cases due to pork consumption in Europe were assessed in order to calculate risk scores. A ratio of non-control was calculated for each biological hazard identified as currently established in Europe, i.e. the incidence of human cases divided by the prevalence of hazards on pork. Salmonella enterica, Yersinia enterocolitica and Campylobacter spp. were characterised by high incidence rates. Listeria monocytogenes, Clostridium botulinum and Mycobacterium spp. showed the highest severity scores. The three main high risk hazards involved in foodborne infections, Y. enterocolitica, S. enterica and Campylobacter spp. are characterised by high non-control ratios and cannot be detected by macroscopic examination of carcasses. New means of hazard control are needed to complement the classical macroscopic examination.
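
    The non-control ratio is simple arithmetic once incidence and prevalence are tabulated; the sketch below computes it for the three high-risk hazards with placeholder figures.

```python
# Back-of-envelope sketch of the paper's non-control ratio:
# ratio = incidence of human cases / prevalence of the hazard on pork.
# Figures are placeholders for illustration only.

hazards = {
    # hazard: (human incidence per 100,000, prevalence on pork carcasses, %)
    "Salmonella enterica":     (35.0, 10.0),
    "Yersinia enterocolitica": (2.0, 35.0),
    "Campylobacter spp.":      (45.0, 30.0),
}

for name, (incidence, prevalence) in hazards.items():
    print(f"{name}: non-control ratio = {incidence / prevalence:.2f}")
```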

  1. Multi-Criteria Analysis for Biomass Utilization Applying Data Envelopment Analysis

    NASA Astrophysics Data System (ADS)

    Morimoto, Hidetsugu; Hoshino, Satoshi; Kuki, Yasuaki

    This paper considers material recycling, global-warming prevention, and economic efficiency in 195 existing and planned Biomass Towns by applying DEA (Data Envelopment Analysis), which can evaluate the operational efficiency of entities such as private companies or projects. The results show that although Biomass Towns recycle material efficiently, global-warming prevention and business profitability are neglected in Biomass Town designs. Moreover, from the viewpoint of operational efficiency, we applied DEA to suggest improvements to Biomass Town scale for greater efficiency. We found that DEA captured more potential improvements and indicators than cost-benefit analysis or cost-effectiveness analysis.
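
    A minimal input-oriented CCR DEA model can be solved as a small linear program per decision-making unit, as sketched below with scipy and three hypothetical DMUs (two inputs, one output); an efficiency of 1.0 marks the frontier.

```python
# Sketch of an input-oriented CCR DEA model solved with scipy.optimize.linprog:
# for each decision-making unit (DMU), minimise theta such that a non-negative
# combination of all DMUs uses at most theta times its inputs while producing
# at least its outputs. Data are hypothetical Biomass-Town-like DMUs.

import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 2.0], [6.0, 3.0], [8.0, 5.0]])   # inputs: cost, labour
Y = np.array([[2.0], [3.0], [3.5]])                  # output: recycled biomass
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

for k in range(n):
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta
    # inputs:  sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.c_[-X[k], X.T]
    # outputs: -sum_j lambda_j * y_rj <= -y_rk
    A_out = np.c_[np.zeros(s), -Y.T]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(0, None)] * (n + 1), method="highs")
    print(f"DMU {k}: efficiency = {res.x[0]:.3f}")
```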

  2. Common Methods for Security Risk Analysis

    DTIC Science & Technology

    2007-11-02

    Workshops was particularly influential among Canadian tool-designers in the late 1980s. These models generally favour a software tool solution simply...tools that have too small a market to justify extensive software development. Also, most of the risk management standards that came out at this...companies developing specialized risk analysis tools, such as the Vulcanizer project of DOMUS Software Inc. The latter incorporated fuzzy logic to

  3. Risk-Based Explosive Safety Analysis

    DTIC Science & Technology

    2016-11-30

    safety siting of energetic liquids and propellants can be greatly aided by the use of risk-based methodologies. The low probability of exposed personnel and the... based analysis of scenario 2 would likely determine that the hazard of death or injury to any single person is low due to the separation distance

  4. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enables development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. The quantitative assessment approach provides useful risk mitigation information.

  5. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and therefore, it is necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical, and probabilistic bases. The basic element is the attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, (40)K contaminant measurements are adapted for risk assessment application after derivation of the necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the form of maps and three-dimensional surfaces.
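
    The basic element described above, attaching a pdf to a measurement sequence and reading risk off its tail, is sketched below with synthetic (40)K activities, a lognormal fit, and a hypothetical action level.

```python
# Hedged sketch: fit a probability distribution to a contaminant measurement
# sequence and estimate exceedance risk from its tail. Synthetic (40)K
# activities stand in for real measurements.

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
k40 = rng.lognormal(mean=5.5, sigma=0.4, size=120)   # Bq/kg, synthetic

shape, loc, scale = stats.lognorm.fit(k40, floc=0)   # fit a lognormal pdf
threshold = 500.0                                    # hypothetical action level
p_exceed = stats.lognorm.sf(threshold, shape, loc=loc, scale=scale)
print(f"P(activity > {threshold} Bq/kg) = {p_exceed:.3f}")
```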

  6. Applying simulation modeling to problems in toxicology and risk assessment--a short perspective.

    PubMed

    Andersen, M E; Clewell, H J; Frederick, C B

    1995-08-01

    The goals of this perspective have been to examine areas where quantitative simulation models may be useful in toxicology and related risk assessment fields and to offer suggestions for preparing manuscripts that describe these models. If developments in other disciplines serve as a bellwether, the use of mathematical models in toxicology will continue to increase, partly, at least, because the new generations of scientists are being trained in an electronic environment where computation of all kinds is learned at an early age. Undoubtedly, however, the utility of these models will be directly tied to the skills of investigators in accurately describing models in their research papers. These publications should convey descriptions of both the insights obtained and the opportunities provided by these models to integrate existing data bases and suggest new and useful experiments. We hope these comments serve to facilitate the expansion of good modeling practices as applied to toxicological problems.
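
    In the spirit of the simulation models discussed, the sketch below integrates a one-compartment toxicokinetic model, dC/dt = (dose rate)/V - k*C, with scipy; the parameter values are illustrative only.

```python
# Hedged sketch of a one-compartment toxicokinetic simulation with a constant
# intake for 24 h followed by elimination. Parameters are illustrative.

import numpy as np
from scipy.integrate import solve_ivp

V, k = 42.0, 0.1          # distribution volume (L) and elimination rate (1/h)
infusion = 10.0           # constant intake rate (mg/h) for the first 24 h

def dcdt(t, c):
    dose_rate = infusion if t < 24 else 0.0
    return [dose_rate / V - k * c[0]]

sol = solve_ivp(dcdt, t_span=(0, 72), y0=[0.0], max_step=0.5)
print(f"peak concentration ~ {sol.y[0].max():.2f} mg/L at "
      f"t = {sol.t[np.argmax(sol.y[0])]:.1f} h")
```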

  7. Acquaintance Rape: Applying Crime Scene Analysis to the Prediction of Sexual Recidivism.

    PubMed

    Lehmann, Robert J B; Goodwill, Alasdair M; Hanson, R Karl; Dahle, Klaus-Peter

    2016-10-01

    The aim of the current study was to enhance the assessment and predictive accuracy of risk assessments for sexual offenders by utilizing detailed crime scene analysis (CSA). CSA was conducted on a sample of 247 male acquaintance rapists from Berlin (Germany) using a nonmetric, multidimensional scaling (MDS) Behavioral Thematic Analysis (BTA) approach. The age of the offenders at the time of the index offense ranged from 14 to 64 years (M = 32.3; SD = 11.4). The BTA procedure revealed three behavioral themes of hostility, criminality, and pseudo-intimacy, consistent with previous CSA research on stranger rape. The construct validity of the three themes was demonstrated through correlational analyses with known sexual offending measures and criminal histories. The themes of hostility and pseudo-intimacy were significant predictors of sexual recidivism. In addition, the pseudo-intimacy theme led to a significant increase in the incremental validity of the Static-99 actuarial risk assessment instrument for the prediction of sexual recidivism. The results indicate the potential utility and validity of crime scene behaviors in the applied risk assessment of sexual offenders. © The Author(s) 2015.

  8. 76 FR 30705 - Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-26

    ... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of availability. SUMMARY: EPA is announcing...

  9. Applying thematic analysis theory to practice: a researcher's experience.

    PubMed

    Tuckett, Anthony G

    2005-01-01

    This article describes an experience of thematic analysis. In order to answer the question 'What does analysis look like in practice?' it describes in brief how the methodology of grounded theory, the epistemology of social constructionism, and the theoretical stance of symbolic interactionism inform analysis. Additionally, analysis is examined by evidencing the systematic processes--here termed organising, coding, writing, theorising, and reading--that led the researcher to develop a final thematic schema.

  10. Applying quality criteria to exposure in asbestos epidemiology increases the estimated risk.

    PubMed

    Burdorf, Alex; Heederik, Dick

    2011-07-01

    Mesothelioma deaths due to environmental exposure to asbestos in The Netherlands led to parliamentary concern that exposure guidelines were not strict enough. The Health Council of the Netherlands was asked for advice. Its report has recently been published. The question of quality of the exposure estimates was studied more systematically than in previous asbestos meta-analyses. Five criteria of quality of exposure information were applied, and cohort studies that failed to meet these were excluded. For lung cancer, this decreased the number of cohorts included from 19 to 3 and increased the risk estimate 3- to 6-fold, with the requirements for good historical data on exposure and job history having the largest effects. It also suggested that the apparent differences in lung cancer potency between amphiboles and chrysotile may be produced by lower quality studies. A similar pattern was seen for mesothelioma. As a result, the Health Council has proposed that the occupational exposure limit be reduced from 10 000 fibres m(-3) (all types) to 250 f m(-3) (amphiboles), 1300 f m(-3) (mixed fibres), and 2000 f m(-3) (chrysotile). The process illustrates the importance of evaluating quality of exposure in epidemiology since poor quality of exposure data will lead to underestimated risk.

  11. From aviation to medicine: applying concepts of aviation safety to risk management in ambulatory care

    PubMed Central

    Wilf-Miron, R; Lewenhoff, I; Benyamini, Z; Aviram, A

    2003-01-01

    The development of a medical risk management programme based on the aviation safety approach and its implementation in a large ambulatory healthcare organisation is described. The following key safety principles were applied: (1) errors inevitably occur and usually derive from faulty system design, not from negligence; (2) accident prevention should be an ongoing process based on open and full reporting; (3) major accidents are only the "tip of the iceberg" of processes that indicate possibilities for organisational learning. Reporting physicians were granted immunity, which encouraged open reporting of errors. A telephone "hotline" served the medical staff for direct reporting and receipt of emotional support and medical guidance. Any adverse event which had learning potential was debriefed, while focusing on the human cause of error within a systemic context. Specific recommendations were formulated to rectify processes conducive to error when failures were identified. During the first 5 years of implementation, the aviation safety concept and tools were successfully adapted to ambulatory care, fostering a culture of greater concern for patient safety through risk management while providing support to the medical staff. PMID:12571343

  12. From aviation to medicine: applying concepts of aviation safety to risk management in ambulatory care.

    PubMed

    Wilf-Miron, R; Lewenhoff, I; Benyamini, Z; Aviram, A

    2003-02-01

    The development of a medical risk management programme based on the aviation safety approach and its implementation in a large ambulatory healthcare organisation is described. The following key safety principles were applied: (1) errors inevitably occur and usually derive from faulty system design, not from negligence; (2) accident prevention should be an ongoing process based on open and full reporting; (3) major accidents are only the "tip of the iceberg" of processes that indicate possibilities for organisational learning. Reporting physicians were granted immunity, which encouraged open reporting of errors. A telephone "hotline" served the medical staff for direct reporting and receipt of emotional support and medical guidance. Any adverse event which had learning potential was debriefed, while focusing on the human cause of error within a systemic context. Specific recommendations were formulated to rectify processes conducive to error when failures were identified. During the first 5 years of implementation, the aviation safety concept and tools were successfully adapted to ambulatory care, fostering a culture of greater concern for patient safety through risk management while providing support to the medical staff.

  13. Climate change, land slide risks and sustainable development, risk analysis and decision support process tool

    NASA Astrophysics Data System (ADS)

    Andersson-sköld, Y. B.; Tremblay, M.

    2011-12-01

    Climate change is expected in most parts of Sweden to result in increased precipitation and increased sea water levels, causing flooding, erosion, slope instability, and related secondary consequences. Landslide risks are expected to increase with climate change in large parts of Sweden due to increased annual precipitation, more intense precipitation, and increased flows combined with drier summers. In response to the potential climate related risks, and on the commission of the Ministry of Environment, the Swedish Geotechnical Institute (SGI) is at present performing a risk analysis project for the most prominent landslide risk area in Sweden: the Göta river valley. As part of this, a methodology for landslide ex-ante consequence analysis, today and in a future climate, has been developed and applied in the Göta river valley. Human life, settlements, industry, contaminated sites, and infrastructure of national importance are inventoried and assessed as important elements at risk. The goal of the consequence analysis is to produce a map of geographically distributed expected losses, which can be combined with a corresponding map displaying landslide probability to describe the risk (the combination of probability and consequence of a (negative) event). The risk analysis is GIS-aided, presenting and visualising the risk and using existing databases for quantification of the consequences, represented by ex-ante estimated monetary losses. The results will be used at national and regional levels, and as an indication of the risk at local level, to assess the need for measures to mitigate the risk. The costs and environmental and social impacts of mitigating the risk are expected to be very high, but the costs and impacts of a severe landslide are expected to be even higher. Therefore, civil servants have pronounced a need for tools to assess both the vulnerability and a more holistic picture of impacts of climate change adaptation measures. At SGI a tool for the inclusion of sustainability

  14. Applied Missing Data Analysis. Methodology in the Social Sciences Series

    ERIC Educational Resources Information Center

    Enders, Craig K.

    2010-01-01

    Walking readers step by step through complex concepts, this book translates missing data techniques into something that applied researchers and graduate students can understand and utilize in their own research. Enders explains the rationale and procedural details for maximum likelihood estimation, Bayesian estimation, multiple imputation, and…

  15. Geometric Error Analysis in Applied Calculus Problem Solving

    ERIC Educational Resources Information Center

    Usman, Ahmed Ibrahim

    2017-01-01

    The paper investigates geometric errors students made as they tried to use their basic geometric knowledge in the solution of the Applied Calculus Optimization Problem (ACOP). Inaccuracies related to the drawing of geometric diagrams (visualization skills) and those associated with the application of basic differentiation concepts into ACOP…

  17. Duration Analysis Applied to the Adoption of Knowledge.

    ERIC Educational Resources Information Center

    Vega-Cervera, Juan A.; Gordillo, Isabel Cuadrado

    2001-01-01

    Analyzes knowledge acquisition in a sample of 264 pupils in 9 Spanish elementary schools, using time as a dependent variable. Introduces psycho-pedagogical, pedagogical, and social variables into a hazard model applied to the reading process. Auditory discrimination (not intelligence or visual perception) most significantly influences learning to…
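
    The hazard-model step could look like the following minimal sketch, assuming the Python lifelines library and invented pupil-level data (the study's own variables, software and coefficients are not reproduced here):

        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical data: weeks until a pupil acquires the reading skill, whether
        # acquisition was observed before the study ended, and candidate covariates
        # such as auditory discrimination and IQ scores.
        df = pd.DataFrame({
            "weeks":    [12, 30, 22, 40, 18, 25, 35, 15],
            "acquired": [1, 1, 1, 0, 1, 1, 0, 1],   # 0 = censored at end of study
            "auditory": [85, 60, 75, 50, 90, 70, 55, 88],
            "iq":       [100, 95, 110, 98, 105, 102, 97, 108],
        })

        # Cox proportional-hazards model with time as the dependent variable.
        cph = CoxPHFitter()
        cph.fit(df, duration_col="weeks", event_col="acquired")
        cph.print_summary()  # hazard ratio > 1 means faster acquisition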

  18. An integrated risk analysis methodology in a multidisciplinary design environment

    NASA Astrophysics Data System (ADS)

    Hampton, Katrina Renee

    Design of complex, one-of-a-kind systems, such as space transportation systems, is characterized by high uncertainty and, consequently, high risk. It is necessary to account for these uncertainties in the design process to produce systems that are more reliable. Systems designed by including uncertainties and managing them, as well, are more robust and less prone to poor operations as a result of parameter variability. The quantification, analysis and mitigation of uncertainties are challenging tasks as many systems lack historical data. In such an environment, risk or uncertainty quantification becomes subjective because input data is based on professional judgment. Additionally, there are uncertainties associated with the analysis tools and models. Both the input data and the model uncertainties must be considered for a multi disciplinary systems level risk analysis. This research synthesizes an integrated approach for developing a method for risk analysis. Expert judgment methodology is employed to quantify external risk. This methodology is then combined with a Latin Hypercube Sampling - Monte Carlo simulation to propagate uncertainties across a multidisciplinary environment for the overall system. Finally, a robust design strategy is employed to mitigate risk during the optimization process. This type of approach to risk analysis is conducive to the examination of quantitative risk factors. The core of this research methodology is the theoretical framework for uncertainty propagation. The research is divided into three stages or modules. The first two modules include the identification/quantification and propagation of uncertainties. The third module involves the management of uncertainties or response optimization. This final module also incorporates the integration of risk into program decision-making. The risk analysis methodology, is applied to a launch vehicle conceptual design study at NASA Langley Research Center. The launch vehicle multidisciplinary
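
    A minimal sketch of the Latin Hypercube Sampling - Monte Carlo propagation step, assuming a toy response function and SciPy's quasi-Monte Carlo module (the thesis's actual models and bounds are not reproduced here):

        import numpy as np
        from scipy.stats import qmc

        # Toy system response: vehicle dry mass as a function of two uncertain inputs.
        def response(thrust_margin, material_factor):
            return 1000.0 + 50.0 * thrust_margin + 200.0 * material_factor

        # Latin Hypercube Sampling stratifies each input dimension, so fewer runs
        # are needed than with plain Monte Carlo for similar coverage of input space.
        sampler = qmc.LatinHypercube(d=2, seed=1)
        unit = sampler.random(n=500)                   # samples in [0, 1)^2
        lo, hi = [0.8, 0.9], [1.2, 1.3]                # expert-judgment input bounds
        x = qmc.scale(unit, lo, hi)

        # Propagate the input uncertainty through the system model.
        y = response(x[:, 0], x[:, 1])
        print(f"mean={y.mean():.1f}, std={y.std():.1f}, p95={np.percentile(y, 95):.1f}")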

  19. Dynamic Positioning System (DPS) Risk Analysis Using Probabilistic Risk Assessment (PRA)

    NASA Technical Reports Server (NTRS)

    Thigpen, Eric B.; Stewart, Michael A.; Boyer, Roger L.; Fougere, Pete

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Safety & Mission Assurance (S&MA) directorate at the Johnson Space Center (JSC) has applied its knowledge and experience with Probabilistic Risk Assessment (PRA) to projects in industries ranging from spacecraft to nuclear power plants. PRA is a comprehensive and structured process for analyzing risk in complex engineered systems and/or processes. The PRA process enables the user to identify potential risk contributors such as hardware and software failure, human error, and external events. Recent developments in the oil and gas industry have presented opportunities for NASA to lend its PRA expertise to both ongoing and developmental projects within the industry. This paper provides an overview of the PRA process and demonstrates how it was applied in estimating the probability that a Mobile Offshore Drilling Unit (MODU) operating in the Gulf of Mexico and equipped with a generically configured Dynamic Positioning System (DPS) loses location and needs to initiate an emergency disconnect. The PRA described in this paper is intended to be generic such that the vessel meets the general requirements of an International Maritime Organization (IMO) Maritime Safety Committee (MSC)/Circ. 645 Class 3 dynamically positioned vessel. The results of this analysis are not intended to be applied to any specific drilling vessel, although provisions were made to allow the analysis to be configured to a specific vessel if required.
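
    A minimal fault-tree-style sketch of how a PRA combines component probabilities through AND/OR gates; all numbers are hypothetical placeholders, not values from the MODU study:

        import numpy as np

        p_ref_a, p_ref_b = 1e-3, 1e-3   # redundant position references fail (assumed independent)
        p_thrusters = 5e-4              # critical thruster group fails
        p_human = 2e-3                  # DP operator error drives the vessel off station

        # AND gate: both redundant references must fail for total reference loss.
        p_ref_loss = p_ref_a * p_ref_b

        # OR gate: any initiating event leads to loss of position.
        initiators = [p_ref_loss, p_thrusters, p_human]
        p_loss_of_position = 1.0 - np.prod([1.0 - p for p in initiators])
        print(f"P(loss of position, per operation) ~ {p_loss_of_position:.2e}")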

  20. How Has Applied Behavior Analysis and Behavior Therapy Changed?: An Historical Analysis of Journals

    ERIC Educational Resources Information Center

    O'Donohue, William; Fryling, Mitch

    2007-01-01

    Applied behavior analysis and behavior therapy are now nearly a half century old. It is interesting to ask if and how these disciplines have changed over time, particularly regarding some of their key internal controversies (e.g., role of cognitions). We examined the first five years and the 2000-2004 five year period of the "Journal of Applied…

  1. Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation

    ERIC Educational Resources Information Center

    Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom

    2014-01-01

    Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…

  3. Matrix Factorization Methods Applied in Microarray Data Analysis

    PubMed Central

    Kossenkov, Andrew V.

    2010-01-01

    Numerous methods have been applied to microarray data to group genes into clusters that show similar expression patterns. These methods assign each gene to a single group, which does not reflect the widely held view among biologists that most, if not all, genes in eukaryotes are involved in multiple biological processes and therefore will be multiply regulated. Here, we review several methods that have been developed that are capable of identifying patterns of behavior in transcriptional response and assigning genes to multiple patterns. Broadly speaking, these methods define a series of mathematical approaches to matrix factorization with different approaches to the fitting of the model to the data. We focus on these methods in contrast to traditional clustering methods applied to microarray data, which assign one gene to one cluster. PMID:20376923
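
    A minimal sketch of one such factorization (non-negative matrix factorization), assuming scikit-learn and a synthetic expression matrix; the reviewed methods differ in how the model is fitted, but share this X ~ W H structure:

        import numpy as np
        from sklearn.decomposition import NMF

        # Hypothetical expression matrix: 100 genes x 12 arrays, built from two
        # overlapping patterns so some genes belong to both (multiple regulation).
        rng = np.random.default_rng(0)
        pattern = rng.random((2, 12))
        weights = rng.random((100, 2))
        X = weights @ pattern + 0.01 * rng.random((100, 12))

        # X ~ W H: rows of H are expression patterns; W gives every gene a
        # (possibly non-zero) weight in each pattern, unlike hard clustering,
        # which forces one gene into one cluster.
        model = NMF(n_components=2, init="nndsvda", random_state=0, max_iter=500)
        W = model.fit_transform(X)
        H = model.components_
        print(W.shape, H.shape)  # (100, 2) (2, 12)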

  4. The Preschool: Setting for Applied Behavior Analysis Research.

    ERIC Educational Resources Information Center

    Essa, Eva L.

    1978-01-01

    This review covers four general areas: modification of undesirable behavior; increasing competence; generalization and maintenance; and behavior analysis methods in assessing preschool procedures. These studies support the usefulness of behavioral analysis; however, a major problem is the lack of systematic follow-up or procedures encouraging…

  5. Applying Financial Portfolio Analysis to Government Program Portfolios

    DTIC Science & Technology

    2007-06-01

    In stock valuation, for instance, the Capital Asset Pricing Model (CAPM) often provides the returns used in portfolio theory analysis. The report develops portfolio optimization models, including an EVM Optimization Model and a Constrained EVM Optimization Model, and presents data analysis and results.
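
    A minimal sketch of the CAPM return calculation mentioned in the excerpt; all inputs are illustrative:

        # CAPM: E[R_i] = R_f + beta_i * (E[R_m] - R_f)
        risk_free = 0.03       # risk-free rate
        market_return = 0.08   # expected market return
        beta = 1.2             # asset's sensitivity to market moves

        expected_return = risk_free + beta * (market_return - risk_free)
        print(f"CAPM expected return: {expected_return:.1%}")  # 9.0%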

  6. Modal analysis applied to circular, rectangular, and coaxial waveguides

    NASA Technical Reports Server (NTRS)

    Hoppe, D. J.

    1988-01-01

    Recent developments in the analysis of various waveguide components and feedhorns using Modal Analysis (Mode Matching Method) are summarized. A brief description of the theory is presented, and the important features of the method are pointed out. Specific examples in circular, rectangular, and coaxial waveguides are included, with comparisons between the theory and experimental measurements. Extensions to the methods are described.

  7. Multimodel Bayesian analysis of data-worth applied to unsaturated fractured tuffs

    NASA Astrophysics Data System (ADS)

    Lu, Dan; Ye, Ming; Neuman, Shlomo P.; Xue, Liang

    2012-01-01

    To manage water resource and environmental systems effectively requires suitable data. The worth of collecting such data depends on their potential benefit and cost, including the expected cost (risk) of failing to take an appropriate decision. Evaluating this risk calls for a probabilistic approach to data-worth assessment. Recently we [39] developed a multimodel approach to optimum value-of-information or data-worth analysis based on model averaging within a maximum likelihood Bayesian framework. Adopting a two-dimensional synthetic example, we implemented our approach using Monte Carlo (MC) simulations with and without lead order approximations, finding that the former approach was almost equally accurate but computationally more efficient. Here we apply our methodology to pneumatic permeability data from vertical and inclined boreholes drilled into unsaturated fractured tuff near Superior, Arizona. In an attempt to improve computational efficiency, we introduce three new approximations that require less computational effort and compare results with those obtained by the original Monte Carlo method. The first approximation disregards uncertainty in model parameter estimates, the second does so for estimates of potential new data, and the third disregards both uncertainties. We find that only the first approximation yields reliable quantitative assessments of reductions in predictive uncertainty brought about by the collection of new data. We conclude that, whereas parameter uncertainty may sometimes be disregarded for purposes of analyzing data worth, the same does not generally apply to uncertainty in estimates of potential new data.

  8. Applying the Heuristic to the Risk Assessment within the Automotive Industry Supply Chain

    NASA Astrophysics Data System (ADS)

    Marasova, Daniela; Andrejiova, Miriam; Grincova, Anna

    2017-03-01

    Risk management facilitates risk identification, evaluation and control and, by means of an appropriate set of measures, risk reduction or complete elimination. Risk management therefore becomes a strategic factor for a company's success. A properly implemented risk management system does not represent a tool to avoid risk; it is used to understand the risk and provide a basis for strategic decision-making. Risk management represents a key factor for supply chain operations: managing the risks is crucial for achieving customer satisfaction and thus also a company's success. The subject-matter of the article is the assessment of the supply chain in the automotive industry in terms of risks. The topicality of this problem is even higher because, after the economic crisis, it is necessary to re-evaluate the readiness of the supply chain for prospective risk conditions. One advantage of this article is the use of the Saaty method as a tool for risk management within the supply chain.
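
    A minimal sketch of the Saaty method's core computation (principal-eigenvector weights plus a consistency check) on a hypothetical pairwise-comparison matrix for three supply-chain risks; the paper's actual criteria and judgments are not reproduced here:

        import numpy as np

        # Saaty 1-9 scale comparisons of supplier failure, transport delay, demand shock.
        A = np.array([
            [1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0],
        ])

        # Priority weights = principal eigenvector of A, normalized to sum to 1.
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()

        # Saaty consistency check: CR < 0.1 is conventionally acceptable.
        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)
        cr = ci / 0.58                      # 0.58 = Saaty random index for n = 3
        print(w.round(3), f"CR={cr:.3f}")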

  9. RISK ASSESSMENT AND EPIDEMIOLOGICAL INFORMATION FOR PATHOGENIC MICROORGANISMS APPLIED TO SOIL

    EPA Science Inventory

    There is increasing interest in the development of a microbial risk assessment methodology for regulatory and operational decision making. Initial interests in microbial risk assessments focused on drinking, recreational, and reclaimed water issues. More recently risk assessmen...

  11. Spectral Analysis and Filter Theory in Applied Geophysics

    NASA Astrophysics Data System (ADS)

    Buttkus, Burkhard; Newcomb, R. C.

    This book gives a comprehensive picture of the present stage of development of spectral analysis and filter theory in geophysics. The principles and theories behind classical and modern methods are described and the effectiveness of these methods is assessed; selected examples of their practical application in geophysics are discussed. The modern methods include, for example, spectral analysis by fitting random models to the data, the maximum-entropy and maximum-likelihood spectral analysis procedures, the Wiener and Kalman filters, homomorphic deconvolution, and adaptive procedures for non-stationary processes. This book represents a valuable aid in education and research and for solving practical problems in geophysics and related disciplines.
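
    As a minimal sketch of the classical spectral-estimation methods the book covers, the following computes an averaged-periodogram (Welch) estimate of a synthetic geophysical record; sampling rate and signal content are illustrative:

        import numpy as np
        from scipy import signal

        # Toy record: two sinusoids buried in noise, sampled at 100 Hz.
        fs = 100.0
        t = np.arange(0, 60, 1 / fs)
        rng = np.random.default_rng(0)
        x = (np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
             + rng.normal(0, 1, t.size))

        # Welch estimate: averaged, windowed periodograms reduce the variance of
        # the raw periodogram at the cost of frequency resolution.
        f, pxx = signal.welch(x, fs=fs, nperseg=1024)
        print(f"dominant spectral peak near {f[np.argmax(pxx)]:.1f} Hz")  # ~5 Hz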

  12. Neutron-activation analysis applied to copper ores and artifacts

    NASA Technical Reports Server (NTRS)

    Linder, N. F.

    1970-01-01

    Neutron activation analysis is used for quantitative identification of trace metals in copper. Establishing a unique fingerprint of impurities in Michigan copper would enable identification of artifacts made from this copper.

  13. PIXE measurement applied to trace elemental analysis of human tissues

    NASA Astrophysics Data System (ADS)

    Tanaka, M.; Matsugi, E.; Miyasaki, K.; Yamagata, T.; Inoue, M.; Ogata, H.; Shimoura, S.

    1987-03-01

    PIXE measurement was applied for trace elemental analyses of 40 autoptic human kidneys. To investigate the reproducibility of the PIXE data, 9 targets obtained from one human liver were examined. The targets were prepared by wet-digestion using nitric and sulfuric acid. Yttrium was used as an internal standard. The extracted elemental concentrations for K, Fe, Cu, Zn, and Cd were in reasonable agreement with those obtained by atomic absorption spectrometry (AAS) and flame photometry (FP). Various correlations among the elements K, Ca, Cr, Mn, Fe, Ni, Cu, Zn, Rb, and Cd were examined individually for the renal cortex and renal medulla.

  14. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  15. GPS ensemble analysis applied to Antarctic vertical velocities

    NASA Astrophysics Data System (ADS)

    Petrie, E. J.; Clarke, P. J.; King, M. A.; Williams, S. D. P.

    2014-12-01

    GPS data are used to provide estimates of vertical land motion caused by, e.g., glacial isostatic adjustment (GIA) and hydrologic loading. The vertical velocities estimated from the GPS data are often assimilated into GIA models or used for comparison purposes. GIA models are very important as they provide the time-variable gravity corrections needed to estimate ice mass change over Greenland and Antarctica. While state-of-the-art global GPS analysis has previously been performed for many Antarctic sites, formal errors in the resulting site velocities are typically obtained from noise analysis of each individual time series without consideration of processing or metadata issues. Here we present analysis of the results from two full global runs including a variety of parameter and reference frame alignment choices, and compare the results to previous work with a view to assessing whether the size of the formal errors from the standard method is truly representative.

  16. Orbit Response Matrix Analysis Applied at PEP-II

    SciTech Connect

    Steier, C.; Wolski, A.; Ecklund, S.; Safranek, J.A.; Tenenbaum, P.; Terebilo, A.; Turner, J.L.; Yocky, G.; /SLAC

    2005-05-17

    The analysis of orbit response matrices has been used very successfully to measure and correct the gradient and skew gradient distribution in many accelerators. It allows determination of an accurately calibrated model of the coupled machine lattice, which then can be used to calculate the corrections necessary to improve coupling, dynamic aperture and ultimately luminosity. At PEP-II, the Matlab version of LOCO has been used to analyze coupled response matrices for both the LER and the HER. The large number of elements in PEP-II and the very complicated interaction region present unique challenges to the data analysis. All necessary tools to make the analysis method useable at PEP-II have been implemented and LOCO can now be used as a routine tool for lattice diagnostic.

  17. Joint regression analysis and AMMI model applied to oat improvement

    NASA Astrophysics Data System (ADS)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board: a sample of 22 different genotypes grown during the years 2002, 2003 and 2004 in six locations. In Ferreira et al. (2006) the authors state the relevance of the regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in R for the AMMI model analysis.
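
    As a rough illustration of the AMMI decomposition named above (additive ANOVA main effects plus an SVD of the interaction residuals), here is a minimal sketch on a made-up genotype-by-environment yield table; it is not the authors' Zigzag or agricolae implementation:

        import numpy as np

        # Hypothetical mean-yield table: 4 genotypes x 3 environments.
        Y = np.array([[5.1, 4.8, 6.0],
                      [4.2, 4.9, 5.1],
                      [5.8, 5.2, 6.4],
                      [4.5, 4.4, 4.9]])

        # Additive main effects (the ANOVA part of AMMI).
        mu = Y.mean()
        g = Y.mean(axis=1) - mu            # genotype effects
        e = Y.mean(axis=0) - mu            # environment effects
        additive = mu + g[:, None] + e[None, :]

        # Multiplicative interaction part: SVD of the residual (interaction) matrix.
        U, s, Vt = np.linalg.svd(Y - additive, full_matrices=False)
        ipca1_genotype = U[:, 0] * np.sqrt(s[0])    # IPCA1 scores used in biplots
        ipca1_environment = Vt[0] * np.sqrt(s[0])
        print(ipca1_genotype.round(3), ipca1_environment.round(3))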

  18. Applying the Framingham risk score for prediction of metabolic syndrome: The Kerman Coronary Artery Disease Risk Study, Iran

    PubMed Central

    Yousefzadeh, Gholamreza; Shokoohi, Mostafa; Najafipour, Hamid; Shadkamfarokhi, Mitra

    2015-01-01

    BACKGROUND There have been few studies on the predictability of metabolic syndrome (MetS) based on the Framingham risk score (FRS), a tool for predicting the 10-year risk of cardiovascular disease (CVD), in the Iranian population. The aim of this study was to compare the risk stratification obtained with the FRS and MetS in a cohort of the Iranian population. METHODS In this population-based study (the Kerman Coronary Artery Disease Risk Study, Iran), MetS was diagnosed as defined by the revised National Cholesterol Education Program criteria (ATP III), and the FRS was calculated using a computer program implementing a previously reported algorithm. RESULTS Overall, the distribution of 10-year CVD risk for patients with MetS differed significantly from that of those without MetS (74.3 vs. 86.4% for low-risk, 18.1 vs. 12.3% for intermediate-risk, and 7.6 vs. 1.3% for high-risk individuals) (P < 0.001). The frequency of intermediate and high 10-year CVD risk in men with MetS (39.5 and 18.3%, respectively) was considerably higher than in women with MetS (3.2 and 0.1%, respectively). Using multiple logistic regression, the odds ratio of MetS in the intermediate-risk and high-risk FRS groups was 1.7 and 6.7, respectively (P < 0.001). CONCLUSION A significant association between the presence of MetS and high CVD risk based on the FRS was revealed in both men and women, indicating good concordance between MetS and the FRS in predicting the risk of CVD. However, the odds ratio of the development of cardiovascular risk among women with MetS was higher than among men. PMID:26405450
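
    As a minimal sketch of how such odds ratios fall out of a multiple logistic regression, the following uses simulated data (not the study's) and the statsmodels library:

        import numpy as np
        import statsmodels.api as sm

        # Simulated subjects: FRS category (0=low, 1=intermediate, 2=high) and MetS
        # status, with the true log-odds of MetS rising with FRS category.
        rng = np.random.default_rng(0)
        n = 500
        frs = rng.integers(0, 3, n)
        logit = -1.0 + 0.55 * frs
        mets = rng.random(n) < 1 / (1 + np.exp(-logit))

        # Dummy-code FRS categories against the low-risk reference group.
        X = sm.add_constant(np.column_stack([(frs == 1), (frs == 2)]).astype(float))
        fit = sm.Logit(mets.astype(float), X).fit(disp=0)
        print(np.exp(fit.params[1:]))  # odds ratios for intermediate- and high-risk FRS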

  19. Systems design analysis applied to launch vehicle configuration

    NASA Astrophysics Data System (ADS)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  1. Risk assessment and its application to flight safety analysis

    SciTech Connect

    Keese, D.L.; Barton, W.R.

    1989-12-01

    Potentially hazardous test activities have historically been a part of Sandia National Labs' mission to design, develop, and test new weapons systems. These test activities include high speed air drops for parachute development, sled tests for component and system level studies, multiple stage rocket experiments, and artillery firings of various projectiles. Due to the nature of Sandia's test programs, the risk associated with these activities can never be totally eliminated. However, a consistent set of policies should be available to provide guidance on the level of risk that is acceptable in these areas. This report presents a general set of guidelines for addressing safety issues related to rocket flight operations at Sandia National Laboratories. Even though the majority of this report deals primarily with rocket flight safety, the same principles could be applied to other hazardous test activities. The basic concepts of risk analysis have a wide range of applications in many of Sandia's current operations. 14 refs., 1 tab.

  2. Best practices: applying management analysis of excellence to immunization.

    PubMed

    Wishner, Amy; Aronson, Jerold; Kohrt, Alan; Norton, Gary

    2005-01-01

    The authors applied business management tools to analyze and promote excellence and to evaluate differences between average and above-average immunization performers in private practices. The authors conducted a pilot study of 10 private practices in Pennsylvania using tools common in management to assess practices' organizational climate and managerial style. Authoritative and coaching styles of physician leaders were common to both groups. Managerial styles that emphasized higher levels of clarity and responsibility were evident in the large practices, while rewards and flexibility styles were higher in the small above-average practices. The findings of this pilot study match results seen in high performers in other industries. The study concludes that the authoritative style appears to have the most impact on performance. It has interesting implications for training/behavior change to improve immunization rates, along with traditional medical interventions.

  3. A value analysis model applied to the management of amblyopia.

    PubMed Central

    Beauchamp, G R; Bane, M C; Stager, D R; Berry, P M; Wright, W W

    1999-01-01

    PURPOSE: To assess the value of amblyopia-related services by utilizing a health value model (HVM). Cost and quality criteria are evaluated in accordance with the interests of patients, physicians, and purchasers. METHODS: We applied an HVM to a hypothetical statistical ("median") child with amblyopia whose visual acuity is 20/80 and to a group of children with amblyopia who are managed by our practice. We applied the model to calculate the value of these services by evaluating the responses of patients and physicians and relating these responses to clinical outcomes. RESULTS: The consensus value of care for the hypothetical median child was calculated to be 0.406 (of 1.000). For those children managed in our practice, the calculated value is 0.682. Clinically, 79% achieved 20/40 or better visual acuity, and the mean final visual acuity was 0.2 logMAR (20/32). Value appraisals revealed significant concerns about the financial aspects of amblyopia-related services, particularly among physicians. Patients rated services more positively than did physicians. CONCLUSIONS: Amblyopia care is difficult, sustained, and important work that requires substantial sensitivity to and support of children and families. Compliance and early detection are essential to success. The value of amblyopia services is rated significantly higher by patients than by physicians. Relative to the measured value, amblyopia care is undercompensated. The HVM is useful to appraise clinical service delivery and its variation. The costs of failure and the benefits of success are high; high-value amblyopia care yields substantial dividends and should be commensurately compensated in the marketplace. PMID:10703133

  4. Metabolic and Dynamic Profiling for Risk Assessment of Fluopyram, a Typical Phenylamide Fungicide Widely Applied in Vegetable Ecosystem

    NASA Astrophysics Data System (ADS)

    Wei, Peng; Liu, Yanan; Li, Wenzhuo; Qian, Yuan; Nie, Yanxia; Kim, Dongyeop; Wang, Mengcen

    2016-09-01

    Fluopyram, a typical phenylamide fungicide, was widely applied to protect fruit vegetables from yield loss caused by fungal pathogens. Highly linked to ecological and dietary risks, its residual and metabolic profiles in the fruit vegetable ecosystem still remained obscure. Here, an approach using modified QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) extraction combined with GC-MS/MS analysis was developed to investigate the fate of fluopyram in typical fruit vegetables, including tomato, cucumber and pepper, under the greenhouse environment. Fluopyram dissipated in accordance with the first-order rate dynamics equation, with a maximum half-life of 5.7 d. Cleavage of fluopyram into 2-trifluoromethyl benzamide and subsequent formation of 3-chloro-5-(trifluoromethyl) pyridine-2-acetic acid and 3-chloro-5-(trifluoromethyl) picolinic acid was elucidated to be its ubiquitous metabolic pathway. Moreover, the residue of fluopyram at pre-harvest intervals (PHIs) of 7-21 d was between 0.0108 and 0.1603 mg/kg, and the Hazard Quotients (HQs) were calculated to be less than 1, indicating temporary safety of consumption of the fruit vegetables incurred with fluopyram, irrespective of the uncertain toxicity of the metabolites. Taken together, our findings reveal the residual profile of fluopyram in a typical agricultural ecosystem and advance further insight into the ecological risk posed by this fungicide and its metabolites.
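
    A minimal sketch of the first-order dissipation arithmetic behind those numbers, using the reported maximum half-life; the initial deposit and the reference dose in the HQ step are hypothetical:

        import math

        # First-order dissipation: C(t) = C0 * exp(-k t), with k = ln(2) / t_half.
        t_half = 5.7                       # days (maximum half-life from the paper)
        k = math.log(2) / t_half           # first-order rate constant, 1/d
        c0 = 1.0                           # mg/kg, illustrative initial residue

        for phi in (7, 14, 21):            # pre-harvest intervals, days
            residue = c0 * math.exp(-k * phi)
            hq = residue / 0.5             # HQ vs a hypothetical safe-dose level
            print(f"PHI {phi:2d} d: residue {residue:.4f} mg/kg, HQ {hq:.3f} (<1 acceptable)")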

  6. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  7. Applying Adult Learning Theory through a Character Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to analyze the behavior of a character, Celie, in a movie, 'The Color Purple," through the lens of two adult learning theorists to determine the relationships the character has with each theory. The development and portrayal of characters in movies can be explained and understood by the analysis of adult learning…

  8. Applying Skinner's Analysis of Verbal Behavior to Persons with Dementia

    ERIC Educational Resources Information Center

    Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann

    2011-01-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…

  9. Action, Content and Identity in Applied Genre Analysis for ESP

    ERIC Educational Resources Information Center

    Flowerdew, John

    2011-01-01

    Genres are staged, structured, communicative events, motivated by various communicative purposes, and performed by members of specific discourse communities (Swales 1990; Bhatia 1993, 2004; Berkenkotter & Huckin 1995). Since its inception, with the two seminal works on the topic by Swales (1990) and Bhatia (1993), genre analysis has taken pride of…

  10. Applying an Activity System to Online Collaborative Group Work Analysis

    ERIC Educational Resources Information Center

    Choi, Hyungshin; Kang, Myunghee

    2010-01-01

    This study determines whether an activity system provides a systematic framework to analyse collaborative group work. Using an activity system as a unit of analysis, the research examined learner behaviours, conflicting factors and facilitating factors while students engaged in collaborative work via asynchronous computer-mediated communication.…

  13. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short-term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTMs) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTMs using a graphical specification language. The ability to manipulate TTMs in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines…

  14. Nutrient Status and Contamination Risks from Digested Pig Slurry Applied on a Vegetable Crops Field

    PubMed Central

    Zhang, Shaohui; Hua, Yumei; Deng, Liangwei

    2016-01-01

    The effects of applied digested pig slurry on a vegetable crops field were studied. The study included a 3-year investigation on nutrient characteristics, heavy metals contamination and hygienic risks of a vegetable crops field in Wuhan, China. The results showed that, after anaerobic digestion, abundant N, P and K remained in the digested pig slurry while fecal coliforms, ascaris eggs, schistosoma eggs and hookworm eggs were highly reduced. High Cr, Zn and Cu contents in the digested pig slurry were found in spring. Digested pig slurry application to the vegetable crops field led to improved soil fertility. Plant-available P in the fertilized soils increased due to considerable increase in total P content and decrease in low-availability P fraction. The As content in the fertilized soils increased slightly but significantly (p = 0.003) compared with control. The Hg, Zn, Cr, Cd, Pb, and Cu contents in the fertilized soils did not exceed the maximum permissible contents for vegetable crops soils in China. However, high Zn accumulation should be of concern due to repeated applications of digested pig slurry. No fecal coliforms, ascaris eggs, schistosoma eggs or hookworm eggs were detected in the fertilized soils. PMID:27058548

  15. Advanced Risk Analysis for High-Performing Organizations

    DTIC Science & Technology

    2006-01-01

    The operational environment for many types of organizations is changing. Changes in operational environments are driving the need for advanced risk analysis techniques. Many types of risk prevalent in today's operational environments (e.g., event risks, inherited risk) are not readily identified using traditional risk analysis techniques. Mission Assurance Analysis Protocol (MAAP) is one technique that high performers can use to identify and mitigate the risks arising from operational complexity.

  16. Audio spectrum analysis of umbilical artery Doppler ultrasound signals applied to a clinical material.

    PubMed

    Thuring, Ann; Brännström, K Jonas; Jansson, Tomas; Maršál, Karel

    2014-12-01

    Analysis of umbilical artery flow velocity waveforms characterized by the pulsatility index (PI) is used to evaluate fetoplacental circulation in high-risk pregnancies. However, an experienced sonographer may be able to further differentiate between various timbres of Doppler audio signals. Recently, we have developed a method for objective audio signal characterization; the method has been tested in an animal model. In the present pilot study, the method was for the first time applied to human pregnancies. Doppler umbilical artery velocimetry was performed in 13 preterm fetuses before and after two doses of 12 mg betamethasone. The auditory measure defined by the frequency band where the spectral energy had dropped 15 dB from its maximum level (MAXpeak-15 dB) increased two days after betamethasone administration (p = 0.001), in parallel with a less pronounced decrease in PI (p = 0.04). The new auditory parameter MAXpeak-15 dB reflected the changes more sensitively than the PI did.
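
    A minimal sketch of how a MAXpeak-15 dB style measure can be read off a power spectrum, on a synthetic audio signal; the sampling rate, signal shaping and band handling are assumptions, not the authors' exact procedure:

        import numpy as np
        from scipy import signal

        # Synthetic stand-in for a Doppler audio signal: a tone plus noise.
        fs = 8000.0
        t = np.arange(0, 2, 1 / fs)
        rng = np.random.default_rng(0)
        audio = np.sin(2 * np.pi * 600 * t) + 0.2 * rng.normal(size=t.size)

        # Spectral level in dB, then the last frequency still within 15 dB of the peak.
        f, pxx = signal.welch(audio, fs=fs, nperseg=2048)
        level_db = 10 * np.log10(pxx)
        threshold = level_db.max() - 15.0
        max_peak_15 = f[np.where(level_db >= threshold)[0][-1]]
        print(f"MAXpeak-15dB ~ {max_peak_15:.0f} Hz")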

  17. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  18. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    SciTech Connect

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  19. Nested sampling applied in Bayesian room-acoustics decay analysis.

    PubMed

    Jasa, Tomislav; Xiang, Ning

    2012-11-01

    Room-acoustic energy decays often exhibit single-rate or multiple-rate characteristics in a wide variety of rooms/halls. Both the energy decay order and decay parameter estimation are of practical significance in architectural acoustics applications, representing two different levels of Bayesian probabilistic inference. This paper discusses a model-based sound energy decay analysis within a Bayesian framework utilizing the nested sampling algorithm. The nested sampling algorithm is specifically developed to evaluate the Bayesian evidence required for determining the energy decay order with decay parameter estimates as a secondary result. Taking the energy decay analysis in architectural acoustics as an example, this paper demonstrates that two different levels of inference, decay model-selection and decay parameter estimation, can be cohesively accomplished by the nested sampling algorithm.
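
    As a rough illustration of the algorithm, here is a toy nested sampler for a single-rate exponential decay model, sketching how the Bayesian evidence used for decay-order selection is accumulated; practical samplers replace the naive rejection step with likelihood-constrained sampling, and the final live-point contribution to the evidence is omitted here:

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 50)
        data = np.exp(-3.0 * t) + 0.01 * rng.normal(size=t.size)

        def loglike(rate):
            resid = data - np.exp(-rate * t)
            return -0.5 * np.sum((resid / 0.01) ** 2)

        n_live = 100
        live = rng.uniform(0.0, 10.0, n_live)          # prior: rate ~ U(0, 10)
        logl = np.array([loglike(r) for r in live])
        log_z = -np.inf
        for i in range(1, 600):
            worst = int(np.argmin(logl))
            # Shell prior mass: X_{i-1} - X_i with X_i = exp(-i / n_live).
            log_w = -i / n_live + np.log(np.expm1(1.0 / n_live))
            log_z = np.logaddexp(log_z, logl[worst] + log_w)
            while True:                                # prior draw above L_worst
                cand = rng.uniform(0.0, 10.0)
                if loglike(cand) > logl[worst]:
                    live[worst], logl[worst] = cand, loglike(cand)
                    break
        print(f"log-evidence ~ {log_z:.1f}, decay rate ~ {live.mean():.2f}")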

  20. Risk Analysis of the Supply-Handling Conveyor System.

    DTIC Science & Technology

    The report documents the risk analysis that was performed on a supply-handling conveyor system. The risk analysis was done to quantify the risks involved for project development, in addition to compliance with the draft AMC regulation on risk analysis. The conveyor system is in the final phase of…

  1. Applying Cognitive Work Analysis to Time Critical Targeting Functionality

    DTIC Science & Technology

    2004-10-01

    Figure 4-27 shows the task steps involved in achieving Goal 7, with the Dynamic Target List/Dynamic Target Queue (DTL/DTQ) in the same place. A GUI working group was used to brainstorm the order of columns in the DTL/DTQ table, a critical component of the TCTF CUI, with successful results. Abbreviations: CWA, Cognitive Work Analysis; DTD, Display Task Description; DTL/DTQ, Dynamic Target List/Dynamic Target Queue; FDO, Fighter Duty Officer; FEBA, Forward Edge of the Battle Area.

  2. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using categorical risk scoring of the occurrence, detection and severity of failure modes and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
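
    A minimal sketch of the proposed modification: occurrence and detection become estimated relative frequencies while severity stays categorical; all failure-mode numbers are invented, not from the NIR example:

        failure_modes = {
            # name: (p_occurrence, p_detection, severity 1-10)
            "wrong NIR reference spectrum": (0.02, 0.90, 8),
            "sample mislabelled":           (0.01, 0.50, 9),
            "instrument drift":             (0.05, 0.95, 5),
        }

        # Frequency of undetected failure per mode, and for the full procedure
        # (assuming independent failure modes).
        p_none = 1.0
        for name, (p_occ, p_det, sev) in failure_modes.items():
            p_undet = p_occ * (1.0 - p_det)   # mode occurs and goes undetected
            p_none *= 1.0 - p_undet
            print(f"{name}: P(undetected) = {p_undet:.4f}, severity = {sev}")

        print(f"P(any undetected failure per run) = {1.0 - p_none:.4f}")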

  3. Applying thiouracil (TU)-tagging for mouse transcriptome analysis

    PubMed Central

    Gay, Leslie; Karfilis, Kate V.; Miller, Michael R.; Doe, Chris Q.; Stankunas, Kryn

    2014-01-01

    Transcriptional profiling is a powerful approach to study mouse development, physiology, and disease models. Here, we describe a protocol for mouse thiouracil-tagging (TU-tagging), a transcriptome analysis technology that includes in vivo covalent labeling, purification, and analysis of cell type-specific RNA. TU-tagging enables 1) the isolation of RNA from a given cell population of a complex tissue, avoiding transcriptional changes induced by cell isolation trauma, and 2) the identification of actively transcribed RNAs and not pre-existing transcripts. Therefore, in contrast to other cell-specific transcriptional profiling methods based on purification of tagged ribosomes or nuclei, TU-tagging provides a direct examination of transcriptional regulation. We describe how to: 1) deliver 4-thiouracil to transgenic mice to thio-label cell lineage-specific transcripts, 2) purify TU-tagged RNA and prepare libraries for Illumina sequencing, and 3) follow a straightforward bioinformatics workflow to identify cell type-enriched or differentially expressed genes. Tissue containing TU-tagged RNA can be obtained in one day, RNA-Seq libraries generated within two days, and, following sequencing, an initial bioinformatics analysis completed in one additional day. PMID:24457332

  4. Low-thrust mission risk analysis.

    NASA Technical Reports Server (NTRS)

    Yen, C. L.; Smith, D. B.

    1973-01-01

    A computerized multi-stage failure process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust-subsystem burn operation, the system failure processes, and the retargeting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to Comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that the system component failure rate is the limiting factor in attaining high mission reliability. However, it is also shown that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.

  5. Response deprivation and reinforcement in applied settings: A preliminary analysis

    PubMed Central

    Konarski, Edward A.; Johnson, Moses R.; Crowell, Charles R.; Whitman, Thomas L.

    1980-01-01

    First-grade children engaged in seatwork behaviors under reinforcement schedules established according to the Premack Principle and the Response Deprivation Hypothesis. Across two experiments, schedules were presented to the children in a counter-balanced fashion which fulfilled the conditions of one, both, or neither of the hypotheses. Duration of on-task math and coloring in Experiment 1 and on-task math and reading in Experiment 2 were the dependent variables. A modified ABA-type withdrawal design, including a condition to control for the noncontingent effects of a schedule, indicated an increase of on-task instrumental responding only in those schedules where the condition of response deprivation was present but not where it was absent, regardless of the probability differential between the instrumental and contingent responses. These results were consistent with laboratory findings supporting the necessity of response deprivation for producing the reinforcement effect in single response, instrumental schedules. However, the results of the control procedure were equivocal so the contribution of the contingent relationship between the responses to the increases in instrumental behavior could not be determined. Nevertheless, these results provided tentative support for the Response Deprivation Hypothesis as a new approach to establishing reinforcement schedules while indicating the need for further research in this area. The possible advantages of this technique for applied use were identified and discussed. PMID:16795635

  6. Quantified MS analysis applied to combinatorial heterogeneous catalyst libraries.

    PubMed

    Wang, Hua; Liu, Zhongmin; Shen, Jianghan

    2003-01-01

    A high-throughput screening system for secondary catalyst libraries has been developed by incorporation of an 80-pass reactor and a quantified multistream mass spectrometer screening (MSMSS) technique. With a low-melting alloy as the heating medium, a uniform reaction temperature could be obtained in the multistream reactor (maximum temperature differences are less than 1 K at 673 K). Quantification of the results was realized by combining a gas chromatograph with the MSMSS, which could provide the product selectivities of each catalyst in a heterogeneous catalyst library. Because the catalyst loading of each reaction tube is comparable to that of a conventional microreaction system and because the parallel reactions could be operated under identical conditions (homogeneous temperature, same pressure and WHSV), the reaction results of a promising catalyst selected from the library could be reasonably applied to the further scale-up of the system. The aldol condensation of acetone, with obvious differences in the product distribution over different kinds of catalysts, was selected as a model reaction to validate the screening system.

  7. A comprehensive risk analysis of coastal zones in China

    NASA Astrophysics Data System (ADS)

    Wang, Guanghui; Liu, Yijun; Wang, Hongbing; Wang, Xueying

    2014-03-01

    Although coastal zones occupy an important position in world development, they face high risks and vulnerability to natural disasters because of their special locations and high population density. In order to estimate their capability for crisis response, various models have been established. However, those studies mainly focused on natural factors or conditions, which could not reflect the social vulnerability and regional disparities of coastal zones. Drawing lessons from the experiences of the United Nations Environment Programme (UNEP), this paper presents a comprehensive assessment strategy based on the mechanism of the Risk Matrix Approach (RMA), which includes two aspects that are further composed of five second-class indicators. The first aspect, the probability phase, consists of indicators of economic conditions, social development, and living standards, while the second one, the severity phase, is comprised of geographic exposure and natural disasters. After weighting all of the above indicators by applying the Analytic Hierarchy Process (AHP) and the Delphi Method, the paper uses the comprehensive assessment strategy to analyze the risk indices of 50 coastal cities in China. The analytical results are presented in ESRI ArcGIS 10.1, which generates six different risk maps covering the aspects of economy, society, life, environment, disasters, and an overall assessment of the five areas. Furthermore, the study also investigates the spatial pattern of these risk maps, with detailed discussion and analysis of different risks in coastal cities.

  8. Risk analysis for renewable energy projects due to constraints arising

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.

    2016-02-01

    Starting from the European Union (EU) target of a binding 20% share of renewable energy in final energy consumption by 2020, this article illustrates the identification of risks in the implementation of wind energy projects in Romania, which could entail complex technical, social and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain were identified, along with the reasonable time periods that may arise. Renewable energy technologies have to face a number of constraints that delay the scaling-up of their production process, their transport process, the equipment reliability, etc., so implementing these types of projects requires a complex specialized team, whose coordination also involves specific risks. The research team applied an analytical risk approach to identify the major risks encountered within a wind farm project developed in Romania in isolated regions with different particularities, configured for different geographical areas (hill and mountain locations in Romania). Identification of the major risks was based on the conceptual model set up for the entire project implementation process, throughout which the specific constraints of the process were identified. Integration risks were examined through an empirical study based on the HAZOP (Hazard and Operability) method. The discussion describes the analysis of our results in the context of implementing renewable energy projects in Romania and creates a framework for assessing energy supply to any entity from renewable sources.

  9. An integrated approach of analytical network process and fuzzy based spatial decision making systems applied to landslide risk mapping

    NASA Astrophysics Data System (ADS)

    Abedi Gheshlaghi, Hassan; Feizizadeh, Bakhtiar

    2017-09-01

    Landslides in mountainous areas cause major damage to residential areas, roads, and farmlands. Hence, one of the basic measures to reduce the possible damage is identifying landslide-prone areas through landslide mapping by different models and methods. The purpose of this study is to evaluate the efficacy of a combination of two models, the analytical network process (ANP) and fuzzy logic, in landslide risk mapping in the Azarshahr Chay basin in northwest Iran. After field investigations and a review of the research literature, factors affecting the occurrence of landslides, including slope, slope aspect, altitude, lithology, land use, vegetation density, rainfall, distance to faults, distance to roads, and distance to rivers, along with a map of the distribution of occurred landslides, were prepared in a GIS environment. Then, fuzzy logic was used for weighting sub-criteria, and the ANP was applied to weight the criteria. Next, they were integrated based on GIS spatial analysis methods and the landslide risk map was produced. Evaluating the results of this study using receiver operating characteristic curves shows that the hybrid model, with an area under the curve of 0.815, has good accuracy. Also, according to the prepared map, a total of 23.22% of the area, amounting to 105.38 km2, is in the high and very high risk classes. The results of this research are of great importance for regional planning tasks, and the landslide prediction map can be used for spatial planning and for the mitigation of future hazards in the study area.

  10. An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis

    DTIC Science & Technology

    2013-03-01

    Thesis, Air Force Institute of Technology, AFIT-ENP-13-M-02: An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis, by Anum Barki, BS. Approved by Dr. Ronald F. Tuttle (Chairman) and Dr. Kimberly Kendricks. The work is not subject to copyright protection in the United States.

  11. Applying temporal network analysis to the venture capital market

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.
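
    A toy version of this snapshot-based approach is sketched below, assuming networkx and a handful of invented co-investment ties; it builds one graph per year and tracks a single firm's degree centrality across the temporal snapshots.

```python
# Minimal sketch (not the authors' code): yearly snapshots of a
# syndication network, with one firm's centrality tracked over time.
import networkx as nx

# (year, firm_a, firm_b) co-investment ties -- illustrative data only
ties = [
    (2008, "VC_A", "VC_B"), (2008, "VC_B", "VC_C"),
    (2009, "VC_A", "VC_C"), (2009, "VC_C", "VC_D"),
    (2010, "VC_A", "VC_D"), (2010, "VC_B", "VC_D"),
]

snapshots = {}
for year, a, b in ties:
    snapshots.setdefault(year, nx.Graph()).add_edge(a, b)

# Centrality trajectory of one firm across the temporal snapshots
for year in sorted(snapshots):
    cent = nx.degree_centrality(snapshots[year])
    print(year, "VC_A centrality:", round(cent.get("VC_A", 0.0), 3))
```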

  12. Applied analysis/computational mathematics. Final report 1993

    SciTech Connect

    Lax, P.; Berger, M.

    1993-12-01

    This is the final report for the Courant Mathematics and Computing Laboratory (CMCL) research program for the years 1991--1993. Our research efforts encompass the formulation of physical problems in terms of mathematical models (both old and new), the mathematical analysis of such models, and their numerical resolution. This last step involves the development and implementation of efficient methods for large scale computation. Our analytic and numerical work often goes hand in hand; new theoretical approaches often have numerical counterparts, while numerical experimentation often suggests avenues for analytical investigation.

  13. Inconsistencies in the harmonic analysis applied to pulsating stars

    NASA Astrophysics Data System (ADS)

    Pascual-Granado, J.; Garrido, R.; Suárez, J. C.

    2015-05-01

    Harmonic analysis is the fundamental mathematical method used for the identification of pulsation frequencies in asteroseismology and other fields of physics. Here we introduce a test to evaluate the validity of the hypothesis on which the Fourier theorem is based: the convergence of the expansion series. The many difficulties found in interpreting the periodograms of pulsating stars observed by the CoRoT and Kepler satellites led us to test whether the function underlying these time series is analytic or not. Surprisingly, the main result is that these time series originate from non-analytic functions; therefore, the condition for Parseval's theorem is not guaranteed.
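
    The convergence condition at issue can be illustrated numerically. The sketch below (not the authors' test) checks Parseval's identity for a well-behaved sampled signal under numpy's FFT convention; the point of the paper is that, for series generated by non-analytic functions, equality of this kind cannot be taken for granted.

```python
# Illustrative check of Parseval's identity for a sampled time series.
# With numpy's unnormalized FFT, sum |x[n]|^2 == (1/N) sum |X[k]|^2.
import numpy as np

n = 4096
t = np.arange(n) / n
x = np.sin(2 * np.pi * 31.7 * t) + 0.5 * np.sin(2 * np.pi * 90.1 * t)

X = np.fft.fft(x)
time_energy = np.sum(np.abs(x) ** 2)
freq_energy = np.sum(np.abs(X) ** 2) / n
print(f"time-domain energy: {time_energy:.6f}")
print(f"freq-domain energy: {freq_energy:.6f}")  # equal for well-behaved data
```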

  14. Tea consumption and leukemia risk: a meta-analysis.

    PubMed

    Zhong, Shanliang; Chen, Zhiyuan; Yu, Xinnian; Chen, Weixian; Lv, Mengmeng; Ma, Tengfei; Zhao, Jianhua

    2014-06-01

    Epidemiologic findings concerning the association between tea consumption and leukemia risk have yielded mixed results. We aimed to investigate the association by performing a meta-analysis of all available studies. One cohort study and six case-control studies with 1,019 cases were identified using PubMed, Web of Science, and EMBASE. We computed summary relative risks (RRs) and 95 % confidence intervals (CIs) using a random-effects model applied to the relative risk associated with ever, moderate, or highest drinkers vs. non/lowest drinkers. Subgroup analyses were performed based on country (China and USA). Compared with non/lowest drinkers, the combined RR for ever drinkers was 0.76 (95 % CI=0.65-0.89). In subgroup analyses, significant inverse associations were found for both Chinese and US studies. The summary RR was 0.57 (95 % CI=0.41-0.78) for highest drinkers; this result held only in the Chinese studies. No significant associations were found for moderate drinkers in the overall analysis or in subgroup analyses. There was some evidence of publication bias. In conclusion, this meta-analysis suggests a significant inverse association between high tea consumption and leukemia risk. Results should be interpreted cautiously given the potential publication bias.
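
    The pooled estimates reported here are the kind produced by a random-effects meta-analysis. A minimal DerSimonian-Laird sketch follows, with made-up per-study relative risks and standard errors standing in for the actual seven studies.

```python
# Sketch of a random-effects (DerSimonian-Laird) pooled relative risk,
# the computation behind summary RRs such as 0.76 (0.65-0.89).
# Study values below are invented for illustration.
import numpy as np

rr = np.array([0.70, 0.85, 0.60, 0.90, 0.75, 0.80, 0.72])   # per-study RR
se = np.array([0.15, 0.20, 0.25, 0.18, 0.22, 0.16, 0.19])   # SE of log(RR)

y = np.log(rr)                     # work on the log scale
w = 1.0 / se**2                    # fixed-effect weights
ybar = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - ybar) ** 2)    # Cochran's Q heterogeneity statistic
k = len(y)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1.0 / (se**2 + tau2)        # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
se_pooled = np.sqrt(1.0 / np.sum(w_re))

lo, hi = np.exp(pooled - 1.96 * se_pooled), np.exp(pooled + 1.96 * se_pooled)
print(f"pooled RR {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```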

  15. Differential Network Analysis Applied to Preoperative Breast Cancer Chemotherapy Response

    PubMed Central

    Warsow, Gregor; Struckmann, Stephan; Kerkhoff, Claus; Reimer, Toralf; Engel, Nadja; Fuellen, Georg

    2013-01-01

    In silico approaches are increasingly considered to improve breast cancer treatment. One of these treatments, neoadjuvant TFAC chemotherapy, is used in cases where application of preoperative systemic therapy is indicated. Estimating response to treatment enables or improves clinical decision-making, and this, in turn, may be based on a good understanding of the underlying molecular mechanisms. Ever-increasing amounts of high-throughput data are becoming available for integration into functional networks. In this study, we applied our software tool ExprEssence to identify specific mechanisms relevant for TFAC therapy response from a gene/protein interaction network. We contrasted the resulting active subnetwork to the subnetworks of two other such methods, OptDis and KeyPathwayMiner. We could show that the ExprEssence subnetwork is more related to the mechanistic functional principles of TFAC therapy than the subnetworks of the other two methods, despite the simplicity of ExprEssence. We were able to validate our method by recovering known mechanisms, and as an application example, we identified a mechanism that may further explain the synergism between paclitaxel and doxorubicin in TFAC treatment: paclitaxel may attenuate MELK gene expression, resulting in lower levels of its target MYBL2, already associated with doxorubicin synergism in hepatocellular carcinoma cell lines. We tested our hypothesis in three breast cancer cell lines, confirming it in part. In particular, the predicted effect on MYBL2 could be validated, and a synergistic effect of paclitaxel and doxorubicin could be demonstrated in the breast cancer cell lines SKBR3 and MCF-7. PMID:24349128

  16. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.

  17. Dynamical systems analysis applied to working memory data.

    PubMed

    Gasimova, Fidan; Robitzsch, Alexander; Wilhelm, Oliver; Boker, Steven M; Hu, Yueqin; Hülür, Gizem

    2014-01-01

    In the present paper we investigate weekly fluctuations in working memory capacity (WMC) assessed over a period of 2 years. We use dynamical systems analysis, specifically a second-order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating (MU) performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement on the MU task are associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time-delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions.
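
    A hedged sketch of the underlying modeling idea follows: treat the weekly score as a damped linear oscillator, x'' = eta*x + zeta*x', and recover the (negative) frequency parameter eta by regressing estimated derivatives on the state. This mirrors only the core logic, not the paper's multilevel, B-spline-imputed fit.

```python
# Illustrative latent-differential-equation fit: simulate a damped
# oscillator, estimate derivatives numerically, and recover eta/zeta
# by least squares. Parameters are invented, not the paper's estimates.
import numpy as np

dt = 1.0                                    # one week between assessments
t = np.arange(0, 104) * dt                  # two years of weekly data
eta_true, zeta_true = -0.35, -0.05
x = np.exp(zeta_true * t / 2) * np.cos(np.sqrt(-eta_true - zeta_true**2 / 4) * t)
x += np.random.default_rng(1).normal(0, 0.02, t.size)

# Central-difference estimates of first and second derivatives
dx = np.gradient(x, dt)
d2x = np.gradient(dx, dt)

# Least-squares fit of d2x = eta * x + zeta * dx
coef, *_ = np.linalg.lstsq(np.column_stack([x, dx]), d2x, rcond=None)
print("estimated eta (frequency parameter):", round(coef[0], 3))
print("estimated zeta (damping):", round(coef[1], 3))
```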

  18. Principles of micellar electrokinetic capillary chromatography applied in pharmaceutical analysis.

    PubMed

    Hancu, Gabriel; Simon, Brigitta; Rusu, Aura; Mircia, Eleonora; Gyéresi, Arpád

    2013-01-01

    Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques had rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences in the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution in concentrations above their critical micellar concentration; consequently, micelles are formed, and these micelles undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between two phases: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects regarding the separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis.
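
    The differential partitioning that drives the separation is commonly quantified by the MEKC retention factor; a standard form (not quoted in this abstract) is shown below, where t_0, t_R, and t_mc are the migration times of an unretained marker, the analyte, and the micellar-phase marker, respectively.

```latex
% MEKC retention (capacity) factor:
%   t_0   migration time of an unretained (EOF) marker
%   t_R   migration time of the analyte
%   t_mc  migration time of the micellar-phase marker
k' = \frac{t_R - t_0}{t_0 \left( 1 - t_R / t_{mc} \right)}
```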

  19. Operational modal analysis applied to the concert harp

    NASA Astrophysics Data System (ADS)

    Chomette, B.; Le Carrou, J.-L.

    2015-05-01

    Operational modal analysis (OMA) methods are useful for extracting the modal parameters of operating systems. These methods seem particularly interesting for investigating the modal basis of string instruments during operation, avoiding certain disadvantages of conventional methods. However, the excitation in the case of string instruments is not optimal for OMA due to the presence of damped harmonic components and low noise in the disturbance signal. Therefore, the present study investigates the least-square complex exponential (LSCE) method and a modified least-square complex exponential method in the case of a string instrument, in order to identify the modal parameters of the instrument while it is played. The efficiency of the approach is experimentally demonstrated on a concert harp excited by some of its strings, and the two methods are compared to a conventional modal analysis. The results show that OMA allows us to identify the modes particularly present in the instrument's response with good estimates, especially those close to the excitation frequency, when the modified LSCE method is used.

  20. Unsupervised feature relevance analysis applied to improve ECG heartbeat clustering.

    PubMed

    Rodríguez-Sotelo, J L; Peluffo-Ordoñez, D; Cuesta-Frau, D; Castellanos-Domínguez, G

    2012-10-01

    The computer-assisted analysis of biomedical records has become an essential tool in clinical settings. However, current devices provide a growing amount of data that often exceeds the processing capacity of normal computers. As this amount of information rises, new demands for more efficient data extraction methods appear. This paper addresses the task of data mining in physiological records using a feature selection scheme. An unsupervised method based on relevance analysis is described. This scheme uses a least-squares optimization of the input feature matrix in a single iteration. The output of the algorithm is a feature weighting vector. The performance of the method was assessed using a heartbeat clustering test on real ECG records. The quantitative cluster validity measures yielded a correctly classified heartbeat rate of 98.69% (specificity), 85.88% (sensitivity) and 95.04% (general clustering performance), which is even higher than the performance achieved by other similar ECG clustering studies. The number of features was reduced on average from 100 to 18, and the temporal cost was 43% lower than in previous ECG clustering schemes.

  1. Integrated Reliability and Risk Analysis System (IRRAS)

    SciTech Connect

    Russell, K. D.; McKay, M. K.; Sattison, M. B.; Skinner, N. L.; Wood, S. T.; Rasmuson, D. M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance.
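
    The cut set generation step that IRRAS automates can be illustrated with a toy top-down (MOCUS-style) expansion; the tree, gates, and events below are invented, and real PRA tools add quantification and far more capable algorithms.

```python
# Toy illustration of fault-tree minimal cut set generation:
# top-down expansion of OR/AND gates followed by absorption.
from itertools import product

# Fault tree: gates map to ('AND'|'OR', [children]); leaves are basic events.
tree = {
    "TOP": ("OR", ["G1", "E3"]),
    "G1": ("AND", ["E1", "G2"]),
    "G2": ("OR", ["E2", "E3"]),
}

def cut_sets(node):
    """Return the list of cut sets (frozensets of basic events) for a node."""
    if node not in tree:                       # basic event
        return [frozenset([node])]
    kind, children = tree[node]
    child_sets = [cut_sets(c) for c in children]
    if kind == "OR":                           # union of children's cut sets
        return [cs for sets in child_sets for cs in sets]
    combined = []                              # AND: cross-product of cut sets
    for combo in product(*child_sets):
        combined.append(frozenset().union(*combo))
    return combined

def minimize(sets):
    """Drop any cut set that strictly contains another (absorption)."""
    return [s for s in sets if not any(o < s for o in sets)]

for cs in minimize(cut_sets("TOP")):
    print(sorted(cs))                          # -> ['E1', 'E2'] and ['E3']
```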

  2. Sinkhole risk modelling applied to transportation infrastructures. A case study from the Ebro valley evaporite karst (NE Spain)

    NASA Astrophysics Data System (ADS)

    Galve, Jorge P.; Remondo, Juan; Gutiérrez, Francisco; Guerrero, Jesús; Bonachea, Jaime; Lucha, Pedro

    2010-05-01

    Sinkholes disrupt transportation route serviceability, causing significant direct and indirect economic losses. Additionally, catastrophic collapse sinkholes may lead to accidents producing loss of human lives. Sinkhole risk modelling allows the estimation of the expectable losses in different portions of infrastructure and the identification of the sections where the application of corrective measures would have the best cost-benefit ratio. An example of sinkhole risk analysis applied to a motorway under construction in a mantled evaporite karst area with a very high probability of occurrence of cover collapse sinkholes is presented. Firstly, sinkhole susceptibility models were obtained, and independently evaluated, on the basis of a probabilistic method which combines the distance to the nearest sinkhole with other conditioning factors. The most reliable susceptibility model was then transformed into several sinkhole hazard models using empirical functions. These functions describe the relationships between the frequency of sinkholes and (1) sinkhole dimensions, (2) terrain susceptibility and (3) land cover. Although more information on temporal occurrences would be needed to evaluate the hazard models, the quality and quantity of the data on which the models are based, and the distribution of the latest sinkholes of considerable magnitude in the study area, indicate that the models are sound. Two collapse sinkholes 4 m across that formed after the production of the models coincide with the zone of highest hazard, which occupies 15% of the study area. Finally, on the basis of the hazard models obtained, sinkhole risk models were generated for a motorway under construction with the aim of quantitatively estimating the expected losses in different sections of the infrastructure in a given period of time. To produce the risk models, the vulnerability of the motorway was estimated considering the cost of the structure, sinkhole magnitude and frequency and the expectable

  3. Examples of Communicating Uncertainty Applied to Earthquake Hazard and Risk Products

    NASA Astrophysics Data System (ADS)

    Wald, D. J.

    2013-12-01

    When is communicating scientific modeling uncertainty effective? One viewpoint is that the answer depends on whether one is communicating hazard or risk: hazards have quantifiable uncertainties (which, granted, are often ignored), yet risk uncertainties compound uncertainties inherent in the hazard with those of the risk calculations, and are thus often larger. Larger, yet more meaningful: since risk entails societal impact of some form, consumers of such information tend to have a better grasp of the potential uncertainty ranges for loss information than they do for less-tangible hazard values (like magnitude, peak acceleration, or stream flow). I present two examples that compare and contrast communicating uncertainty for earthquake hazard and risk products. The first example is the U.S. Geological Survey's (USGS) ShakeMap system, which portrays the uncertain, best estimate of the distribution and intensity of shaking over the potentially impacted region. The shaking intensity is well constrained at seismograph locations yet is uncertain elsewhere, so shaking uncertainties are quantified and presented spatially. However, with ShakeMap, it seems that users tend to believe what they see is accurate in part because (1) considering the shaking uncertainty complicates the picture, and (2) it would not necessarily alter their decision-making. In contrast, when it comes to making earthquake-response decisions based on uncertain loss estimates, actions tend to be made only after analysis of the confidence in (or source of) such estimates. Uncertain ranges of loss estimates instill tangible images for users, and when such uncertainties become large, intuitive reality-check alarms go off, for example, when the range of losses presented become too wide to be useful. The USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, which in near-real time alerts users to the likelihood of ranges of potential fatalities and economic impact, is aimed at

  4. On applying continuous wavelet transform in wheeze analysis.

    PubMed

    Taplidou, Styliani A; Hadjileontiadis, Leontios J; Kitsas, Ilias K; Panoulas, Konstantinos I; Penzel, Thomas; Gross, Volker; Panas, Stavros M

    2004-01-01

    The identification of continuous abnormal lung sounds, like wheezes, within the total breathing cycle is of great importance in the diagnosis of obstructive airway pathologies. In this vein, the current work introduces an efficient method for the detection of wheezes based on the time-scale representation of breath sound recordings. The continuous wavelet transform employed proves to be a valuable tool in this direction when combined with scale-dependent thresholding. Analysis of lung sound recordings from 'wheezing' patients shows promising performance in the detection and extraction of wheezes from the background noise and reveals the method's potential for data-volume reduction in long-term wheezing screening, such as in sleep laboratories.
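
    A rough sketch of a CWT-plus-scale-dependent-threshold detector is given below, using PyWavelets rather than whatever implementation the authors used; the sampling rate, wavelet choice, threshold rule, and synthetic "wheeze" are all illustrative assumptions.

```python
# Sketch of scale-dependent thresholding on a continuous wavelet
# transform, in the spirit of the paper's wheeze detector.
import numpy as np
import pywt

fs = 4000                                    # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1 / fs)
noise = np.random.default_rng(2).normal(0, 0.3, t.size)
wheeze = 0.8 * np.sin(2 * np.pi * 400 * t) * ((t > 0.8) & (t < 1.3))
signal = noise + wheeze                      # breath noise + wheeze episode

scales = np.arange(2, 64)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)

# Scale-dependent threshold: flag coefficients well above each scale's level
mag = np.abs(coeffs)
thresh = mag.mean(axis=1, keepdims=True) + 3 * mag.std(axis=1, keepdims=True)
detections = mag > thresh

band = (freqs > 300) & (freqs < 500)         # scales spanning the wheeze pitch
flagged = detections[band].any(axis=0)
if flagged.any():
    print(f"wheeze-like energy from {t[flagged].min():.2f}s "
          f"to {t[flagged].max():.2f}s")
```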

  5. Applying Machine Learning to GlueX Data Analysis

    NASA Astrophysics Data System (ADS)

    Boettcher, Thomas

    2014-03-01

    GlueX is a high energy physics experiment with the goal of collecting data necessary for understanding confinement in quantum chromodynamics. Beginning in 2015, GlueX will collect huge amounts of data describing billions of particle collisions. In preparation for data collection, efforts are underway to develop a methodology for analyzing these large data sets. One of the primary challenges in GlueX data analysis is isolating events of interest from a proportionally large background. GlueX has recently begun approaching this selection problem using machine learning algorithms, specifically boosted decision trees. Preliminary studies indicate that these algorithms have the potential to offer vast improvements in both signal selection efficiency and purity over more traditional techniques.
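
    A toy version of this signal-versus-background selection task, using scikit-learn's gradient-boosted trees on synthetic two-feature events (GlueX's real analysis uses far richer event records), might look like this:

```python
# Toy boosted-decision-tree event selection on synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 5000
signal = rng.normal(loc=[1.0, 0.5], scale=0.6, size=(n, 2))
background = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(4 * n, 2))

X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(4 * n)])   # large background
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
bdt.fit(X_tr, y_tr)

pred = bdt.predict(X_te)
purity = (y_te[pred == 1] == 1).mean()        # fraction of selected that is signal
efficiency = (pred[y_te == 1] == 1).mean()    # fraction of signal kept
print(f"purity {purity:.2f}, efficiency {efficiency:.2f}")
```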

  6. Applying Skinner's analysis of verbal behavior to persons with dementia.

    PubMed

    Dixon, Mark; Baker, Jonathan C; Sadowski, Katherine Ann

    2011-03-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility in teaching language to children with autism and various other disorders. However, learned language can be forgotten, as is the case for many elderly people suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may facilitate not only the acquisition of language but also the ability to recall items or objects that appeared to be "forgotten." The present study examined the utility of having a series of adults in long-term care emit tacts, echoics, or intraverbals upon presentation of various visual stimuli. Compared to a no-verbal-response condition, it appears that the incorporation of Skinner's verbal operants can in fact improve recall for this population. Implications for the retraining of lost language are presented.

  7. Geostatistical analysis as applied to two environmental radiometric time series.

    PubMed

    Dowdall, Mark; Lind, Bjørn; Gerland, Sebastian; Rudjord, Anne Liv

    2003-03-01

    This article details the results of an investigation into the application of geostatistical data analysis to two environmental radiometric time series. The data series employed consist of 99Tc values for seaweed (Fucus vesiculosus) and seawater samples taken as part of a marine monitoring program conducted on the coast of northern Norway by the Norwegian Radiation Protection Authority. Geostatistical methods were selected in order to provide information on values of the variables at unsampled times and to investigate the temporal correlation exhibited by the data sets. This information is of use in the optimisation of future sampling schemes and for providing information on the temporal behaviour of the variables in question that may not be obtained during a cursory analysis. The results indicate a high degree of temporal correlation within the data sets, the correlation for the seawater and seaweed data being modelled with an exponential and linear function, respectively. The semi-variogram for the seawater data indicates a temporal range of correlation of approximately 395 days with no apparent random component to the overall variance structure and was described best by an exponential function. The temporal structure of the seaweed data was best modelled by a linear function with a small nugget component. Evidence of drift was present in both semi-variograms. Interpolation of the data sets using the fitted models and a simple kriging procedure were compared, using a cross-validation procedure, with simple linear interpolation. Results of this exercise indicate that, for the seawater data, the kriging procedure outperformed the simple interpolation with respect to error distribution and correlation of estimates with actual values. Using the unbounded linear model with the seaweed data produced estimates that were only marginally better than those produced by the simple interpolation.
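
    The variogram-modelling step can be sketched as follows: compute an empirical semivariogram from a regularly sampled series and fit an exponential model with nugget, sill, and range. The data and parameters below are synthetic; the paper's seawater series gave a correlation range of roughly 395 days.

```python
# Empirical semivariogram of a time series plus an exponential model fit.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)
days = np.arange(0, 1200, 10)                       # sampling every 10 days
z = np.sin(2 * np.pi * days / 800) + 0.2 * rng.normal(size=days.size)

def semivariogram(values, max_lag):
    lags = np.arange(1, max_lag)
    gamma = [0.5 * np.mean((values[h:] - values[:-h]) ** 2) for h in lags]
    return lags, np.array(gamma)

def exp_model(h, nugget, sill, a):
    return nugget + sill * (1.0 - np.exp(-h / a))

lags, gamma = semivariogram(z, max_lag=60)          # lags in 10-day units
popt, _ = curve_fit(exp_model, lags, gamma, p0=[0.0, 1.0, 10.0], maxfev=10_000)
nugget, sill, a = popt
print(f"nugget {nugget:.3f}, sill {sill:.3f}, "
      f"practical range {3 * a * 10:.0f} days")     # ~3a for the exp model
```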

  8. [Improvement of legislation basis for occupational risk analysis in occupational hygiene and work safety].

    PubMed

    Zaitseva, N V; Shur, P Z; Alekseev, V B; Andreeva, E E; Sliapniakov, D M

    2014-01-01

    One of the priority trends in health care in the Russian Federation and abroad is the minimization of occupational risks. The authors present an evaluation of the legislative basis for occupational risk analysis. The most promising trend in the improvement of national legislation is its development on the basis of internationally accepted documents, which provides a legislative basis for the analysis of workers' health risks. The findings are that a complete evaluation of occupational risk requires combining data on work conditions with data from occupational monitoring, and sometimes with the results of special research. Further improvement is needed to justify hygienic norms by applying criteria of allowable risk to workers' health. The development of risk analysis methodology now enables quantitative evaluation of health risk via mathematical models, including those describing risk evolution.

  9. Applying Lessons Learned from Space Safety to Unmanned Aerial Vehicle Risk Assessments

    NASA Astrophysics Data System (ADS)

    Devoid, Wayne E.

    2013-09-01

    This paper will examine the application of current orbital launch risk methodology to assessing risk for unmanned aerial vehicle flights over populated areas. Major differences, such as the added complexity of lifting bodies, accounting for pilots-in-the-loop, and the complexity of using current population data to estimate risk for unmanned aerial vehicles, will be highlighted.
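
    Assessments of this kind build on a simple casualty-expectation product. The sketch below shows only the basic arithmetic, with placeholder numbers; real analyses model debris dispersion, lifting-body glide, pilot-in-the-loop behavior, and sheltering in far more detail.

```python
# Back-of-envelope ground-risk estimate:
# expected casualties per flight hour =
#   failure rate x population density x lethal area x exposed fraction.
# All numbers below are placeholders, not values from the paper.
failure_rate = 1e-3        # catastrophic failures per flight hour (assumed)
pop_density = 500 / 1e6    # people per square metre (500 per km^2, assumed)
lethal_area = 8.0          # m^2 affected by the falling vehicle (assumed)
sheltering = 0.5           # fraction of people effectively sheltered (assumed)

casualty_expectation = failure_rate * pop_density * lethal_area * (1 - sheltering)
print(f"expected casualties per flight hour: {casualty_expectation:.2e}")
# Compare against a per-hour acceptability target such as the 1e-6
# figure often cited in the aviation-safety literature.
```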

  10. Decision making for wildfires: A guide for applying a risk management process at the incident level

    Treesearch

    Mary A. Taber; Lisa M. Elenz; Paul G. Langowski

    2013-01-01

    This publication focuses on the thought processes and considerations surrounding a risk management process for decision making on wildfires. The publication introduces a six element risk management cycle designed to encourage sound risk-informed decision making in accordance with Federal wildland fire policy, although the process is equally applicable to non-Federal...

  11. Fire Risk Analysis for Armenian NPP Confinement

    SciTech Connect

    Poghosyan, Shahen; Malkhasyan, Albert; Bznuni, Surik; Amirjanyan, Armen

    2006-07-01

    A major fire that occurred at the Armenian NPP (ANPP) in October 1982 showed that fire-induced initiating events (IE) can make a dominant contribution to the overall risk of core damage. A Probabilistic Safety Assessment study of fire-induced initiating events for ANPP was initiated in 2002. The analysis covered compartments in which fires could result in the failure of components necessary for reactor cold shutdown. The analysis shows that the main fire risk at ANPP is driven by fire in cable tunnels 61-64, while fires in confinement compartments do not contribute significantly to the overall risk of core damage. The exception is the so-called 'confinement valves compartment' (room no. A-013/2), contributing more than 7.5% of CDF, in which a fire could result in a loss-of-coolant accident with unavailability of the primary makeup system, which leads directly to core damage. A detailed analysis of this problem, which is common to typical WWER-440/230 reactors with non-hermetic MCPs, and recommendations for its solution are presented in this paper. (authors)

  12. Global Human Settlement Analysis for Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Pesaresi, M.; Ehrlich, D.; Ferri, S.; Florczyk, A.; Freire, S.; Haag, F.; Halkia, M.; Julea, A. M.; Kemper, T.; Soille, P.

    2015-04-01

    The Global Human Settlement Layer (GHSL) is supported by the European Commission, Joint Research Centre (JRC) in the frame of its institutional research activities. The scope of GHSL is to develop, test and apply the technologies and analysis methods integrated in the JRC Global Human Settlement analysis platform for applications in support of global disaster risk reduction initiatives (DRR) and regional analysis in the frame of the European Cohesion policy. The GHSL analysis platform uses geospatial data, primarily remotely sensed imagery and population data. GHSL also cooperates with the Group on Earth Observations on SB-04-Global Urban Observation and Information, and with various international partners, including World Bank and United Nations agencies. Some preliminary results integrating global human settlement information extracted from Landsat data records of the last 40 years with population data are presented.

  13. The impact of applied behavior analysis on diverse areas of research.

    PubMed Central

    Kazdin, A E

    1975-01-01

    The impact of applied behavior analysis on various disciplines and areas of research was assessed through two major analyses. First, the relationship of applied behavior analysis to the general area of "behavior modification" was evaluated by examining the citation characteristics of journal articles in JABA and three other behavior-modification journals. Second, the penetration of applied behavior analysis into diverse areas and disciplines, including behavior modification, psychiatry, clinical psychology, education, special education, retardation, speech and hearing, counselling, and law enforcement and correction was assessed. Twenty-five journals representing diverse research areas were evaluated from 1968 to 1974 to assess the extent to which operant techniques were applied for therapeutic, rehabilitative, and educative purposes and the degree to which methodological desiderata of applied behavior analysis were met. The analyses revealed diverse publication outlets for applied behavior analysis in various disciplines. PMID:1184488

  14. Using Technology to Expand and Enhance Applied Behavioral Analysis Programs for Children with Autism in Military Families

    DTIC Science & Technology

    2015-07-01

    for children with an ASD remain bleak and are associated with a high divorce rate among parents. Interventions based on applied behavior analysis are...training individuals. This project will demonstrate how web-based technologies can increase the availability of this effective treatment. The fourth year...associated with a high divorce rate among parents and increased risk for mental health disorders among family members. The efficacy of and empirical

  15. Applying the skin sensitisation adverse outcome pathway (AOP) to quantitative risk assessment.

    PubMed

    Maxwell, Gavin; MacKay, Cameron; Cubberley, Richard; Davies, Michael; Gellatly, Nichola; Glavin, Stephen; Gouin, Todd; Jacquoilleot, Sandrine; Moore, Craig; Pendlington, Ruth; Saib, Ouarda; Sheffield, David; Stark, Richard; Summerfield, Vicki

    2014-02-01

    As documented in the recent OECD report 'the adverse outcome pathway for skin sensitisation initiated by covalent binding to proteins' (OECD, 2012), the chemical and biological events driving the induction of human skin sensitisation have been investigated for many years and are now well understood. Several non-animal test methods have been developed to predict sensitiser potential by measuring the impact of chemical sensitisers on these key events (Adler et al., 2011; Maxwell et al., 2011); however our ability to use these non-animal datasets for risk assessment decision-making (i.e. to establish a safe level of human exposure for a sensitising chemical) remains limited and a more mechanistic approach to data integration is required to address this challenge. Informed by our previous efforts to model the induction of skin sensitisation (Maxwell and MacKay, 2008) we are now developing two mathematical models ('total haptenated protein' model and 'CD8(+) T cell response' model) that will be linked to provide predictions of the human CD8(+) T cell response for a defined skin exposure to a sensitising chemical. Mathematical model development is underpinned by focussed clinical or human-relevant research activities designed to inform/challenge model predictions whilst also increasing our fundamental understanding of human skin sensitisation. With this approach, we aim to quantify the relationship between the dose of sensitiser applied to the skin and the extent of the hapten-specific T cell response that would result. Furthermore, by benchmarking our mathematical model predictions against clinical datasets (e.g. human diagnostic patch test data), instead of animal test data, we propose that this approach could represent a new paradigm for mechanistic toxicology.

  16. Risk-driven security testing using risk analysis with threat modeling approach.

    PubMed

    Palanivel, Maragathavalli; Selvadurai, Kanmani

    2014-01-01

    Security testing is a process of determining risks present in system states and protecting them from vulnerabilities. However, security testing typically does not give due importance to threat modeling and risk analysis simultaneously, which affects the confidentiality and integrity of the system. Risk analysis includes the identification, evaluation and assessment of risks. Threat modeling is an approach to identifying the threats associated with the system. Risk-driven security testing uses risk analysis results in test case identification, selection and assessment to prioritize and optimize the testing process. The threat modeling approach STRIDE is generally used to identify both technical and non-technical threats present in the system. Thus, a security testing mechanism based on risk analysis results using the STRIDE approach has been proposed for identifying high-risk states. The risk metrics considered for testing include risk impact, risk possibility and risk threshold. The risk threshold value is directly proportional to risk impact and risk possibility. Risk-driven security testing results in a reduced test suite, which in turn reduces test case selection time. Risk analysis optimizes the test case selection and execution process. For experimentation, the system models LMS, ATM, OBS, OSS and MTRS are considered. The performance of the proposed system is analyzed using the Test Suite Reduction Rate (TSRR) and FSM coverage. TSRR varies from 13.16 to 21.43%, whereas FSM coverage of up to 91.49% is achieved. The results show that the proposed method, combining risk analysis with threat modeling, identifies states with high risks and improves testing efficiency.
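
    A simplified illustration of the selection step follows: score each state by impact times possibility, keep those above a threshold, and report the Test Suite Reduction Rate (TSRR). The states, scores, and threshold are invented for the example.

```python
# Risk-driven test-case selection sketch with invented state scores.
states = {
    "login":        {"impact": 0.90, "possibility": 0.8},
    "transfer":     {"impact": 0.95, "possibility": 0.6},
    "view_balance": {"impact": 0.30, "possibility": 0.5},
    "logout":       {"impact": 0.20, "possibility": 0.2},
    "update_pin":   {"impact": 0.80, "possibility": 0.7},
}

# Risk proportional to impact x possibility, echoing the paper's metrics
risk = {s: v["impact"] * v["possibility"] for s, v in states.items()}
threshold = 0.4                                  # assumed cut-off

selected = [s for s, r in risk.items() if r >= threshold]
tsrr = 100.0 * (len(states) - len(selected)) / len(states)
print("high-risk states:", selected)
print(f"TSRR: {tsrr:.1f}%")                      # fraction of suite pruned
```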

  17. Applying data mining for the analysis of breast cancer data.

    PubMed

    Liou, Der-Ming; Chang, Wei-Pin

    2015-01-01

    Data mining, also known as Knowledge Discovery in Databases (KDD), is the process of automatically searching large volumes of data for patterns. For instance, a clinical pattern might indicate that a female patient with diabetes or hypertension is more likely to suffer a stroke within 5 years; a physician can then learn valuable knowledge from the data mining process. Here, we present a study focused on the application of artificial intelligence and data mining techniques to prediction models of breast cancer. An artificial neural network, a decision tree, logistic regression, and a genetic algorithm were used for the comparative studies, and the accuracy and positive predictive value of each algorithm were used as the evaluation indicators. 699 records acquired from breast cancer patients at the University of Wisconsin, nine predictor variables, and one outcome variable were incorporated for the data analysis, followed by tenfold cross-validation. The results revealed that the accuracy of the logistic regression model was 0.9434 (sensitivity 0.9716 and specificity 0.9482), the decision tree model 0.9434 (sensitivity 0.9615, specificity 0.9105), the neural network model 0.9502 (sensitivity 0.9628, specificity 0.9273), and the genetic algorithm model 0.9878 (sensitivity 1, specificity 0.9802). The accuracy of the genetic algorithm was significantly higher than the average predicted accuracy of 0.9612. The predicted outcome of the logistic regression model was higher than that of the neural network model, but no significant difference was observed. The average predicted accuracy of the decision tree model was 0.9435, the lowest of all four predictive models. The standard deviation of the tenfold cross-validation was rather unreliable. This study indicated that the genetic algorithm model yielded better results than the other data mining models for the analysis of the data of breast cancer patients in terms of the overall accuracy of
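
    The evaluation protocol, tenfold cross-validation over several classifiers, is easy to reproduce in outline. The sketch below uses scikit-learn's bundled breast cancer data (a related 569-record version, not the paper's 699-record set) and omits the genetic algorithm model.

```python
# Tenfold cross-validated accuracy for several classifiers.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

models = {
    "logistic regression": LogisticRegression(max_iter=5000),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "neural network": make_pipeline(
        StandardScaler(), MLPClassifier(max_iter=2000, random_state=0)
    ),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10)   # tenfold CV accuracy
    print(f"{name}: accuracy {scores.mean():.4f} +/- {scores.std():.4f}")
```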

  18. Applied Climate-Change Analysis: The Climate Wizard Tool

    PubMed Central

    Girvetz, Evan H.; Zganjar, Chris; Raber, George T.; Maurer, Edwin P.; Kareiva, Peter; Lawler, Joshua J.

    2009-01-01

    Background Although the message of "global climate change" is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty of accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. Methodology/Principal Findings To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951–2002 occurred in northern hemisphere countries (especially during January–April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50°N during February–March to 10°N during August–September. Precipitation decreases occurred most commonly in countries between 0–20°N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070–2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. Conclusions/Significance The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally-specific analyses of climate

  19. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    NASA Astrophysics Data System (ADS)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products have been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer) or introduced into user-friendly software
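
    The relationship between these metrics is usually written as the ICMSF inequality linking the initial hazard level, cumulative reductions, and cumulative increases to the FSO (standard in this literature, though not quoted above):

```latex
% ICMSF inequality relating process stages to the Food Safety Objective:
%   H_0     initial level of the hazard (e.g., log cfu/g)
%   \sum R  total reductions (e.g., processing lethality)
%   \sum I  total increases (growth, recontamination)
H_0 - \sum R + \sum I \le \mathrm{FSO}
```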

  20. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    SciTech Connect

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-22

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products have been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer) or introduced into user-friendly software

  1. Health risk analysis of atmospheric polycyclic aromatic hydrocarbons in big cities of China.

    PubMed

    Wang, Yonghua; Hu, Liangfeng; Lu, Guanghua

    2014-05-01

    A probabilistic carcinogenic risk assessment of atmospheric polycyclic aromatic hydrocarbons (PAHs) in four big cities (Beijing, Shanghai, Guangzhou, Xiamen) of China was carried out. PAH levels in these cities were collected from the published literature and converted into BaP-equivalent (BaPeq) concentrations. The health risk assessment models recommended by the US EPA were applied to quantitatively characterize the health risk posed by PAHs. Monte Carlo simulation and sensitivity analysis were applied to quantify the uncertainties of the risk assessment. The results showed that the BaPeq concentrations of all four cities were higher than the newest Chinese limit value (1 ng/m(3)). The health risk assessment indicated that atmospheric PAHs in Guangzhou and Xiamen posed little or no carcinogenic risk to local residents. However, the PAHs in Beijing and Shanghai posed a potential carcinogenic risk for adults and for lifetime exposure. Notwithstanding the uncertainties, this study provides primary information on the carcinogenic risk of atmospheric PAHs in the studied cities of China.
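
    The probabilistic step can be sketched as a Monte Carlo propagation of an exposure distribution through a unit-risk model. The distribution parameters below are placeholders, not the paper's fitted values; the unit risk used is the WHO figure for BaP, which may differ from the models the authors applied.

```python
# Monte Carlo sketch of an inhalation cancer-risk estimate:
# risk = BaPeq concentration x inhalation unit risk.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Lognormal BaPeq exposure concentration in ng/m^3 (assumed parameters)
bapeq = rng.lognormal(mean=np.log(2.0), sigma=0.8, size=n)
unit_risk = 8.7e-5              # per ng/m^3 (WHO BaP unit risk figure)

ilcr = bapeq * unit_risk        # incremental lifetime cancer risk

print(f"median risk: {np.median(ilcr):.2e}")
print(f"95th percentile: {np.percentile(ilcr, 95):.2e}")
print(f"P(risk > 1e-4): {np.mean(ilcr > 1e-4):.3f}")  # common benchmark
```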

  2. Applied and computational harmonic analysis on graphs and networks

    NASA Astrophysics Data System (ADS)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
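
    The path-graph caveat is easy to verify numerically: for an unweighted path, the Laplacian eigenvalues follow a closed cosine-based formula and the eigenvectors form the DCT-like basis. A small check, assuming networkx and numpy:

```python
# Eigenpairs of the path-graph Laplacian L = D - A, compared with the
# closed-form eigenvalues 4 sin^2(k*pi / (2n)), k = 0..n-1.
import numpy as np
import networkx as nx

n = 8
L = nx.laplacian_matrix(nx.path_graph(n)).toarray().astype(float)

eigvals, eigvecs = np.linalg.eigh(L)          # "graph frequencies" and basis

analytic = 4 * np.sin(np.arange(n) * np.pi / (2 * n)) ** 2
print("numeric :", np.round(eigvals, 4))
print("analytic:", np.round(analytic, 4))
```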

  3. Ion Beam Analysis applied to laser-generated plasmas

    NASA Astrophysics Data System (ADS)

    Cutroneo, M.; Macková, A.; Havranek, V.; Malinsky, P.; Torrisi, L.; Kormunda, M.; Barchuk, M.; Ullschmied, J.; Dudzak, R.

    2016-04-01

    This paper presents the research activity on Ion Beam Analysis methods performed at the Tandetron Laboratory (LT) of the Institute of Nuclear Physics AS CR, Rez, Czech Republic. Recently, many groups have been paying attention to implantation by laser-generated plasma. This process allows a controllable amount of energetic ions to be inserted into the surface layers of different materials, modifying the physical and chemical properties of the surface material. Different substrates are implanted with ions accelerated from plasma generated by a terawatt iodine laser, at a nominal intensity of 10^15 W/cm^2, at the PALS Research Infrastructure AS CR in the Czech Republic. This regime of laser-matter interaction generates multi-MeV proton beams and multi-charged ions that are tightly confined in time (hundreds of ps) and space (source radius of a few microns). These ion beams have a much lower transverse temperature, a much shorter duration and a much higher current than those obtainable from conventional accelerators. The implementation of proton and ion acceleration driven by ultra-short high-intensity lasers is demonstrated by adopting suitable irradiation conditions as well as tailored targets. An overview of implanted targets and their morphological and structural characterizations is presented and discussed.

  4. Applying revised gap analysis model in measuring hotel service quality.

    PubMed

    Lee, Yu-Cheng; Wang, Yu-Che; Chien, Chih-Hung; Wu, Chia-Huei; Lu, Shu-Chiung; Tsai, Sang-Bing; Dong, Weiwei

    2016-01-01

    The number of tourists coming to Taiwan has grown by 10-20 % annually since 2010, driven by an increasing number of foreign tourists, particularly after deregulation allowed the admission of tourist groups, and later individual tourists, from mainland China. The purpose of this study is to propose a revised gap model to evaluate and improve service quality in the Taiwanese hotel industry. Service quality can thus be clearly measured through gap analysis, which is more effective for offering direction in developing and improving service quality. The HOLSERV instrument was used to identify and analyze service gaps from the perceptions of internal and external customers. The sample for this study included three main categories of respondents: tourists, employees, and managers. The results show that five gaps influenced tourists' evaluations of service quality. In particular, the study revealed that Gap 1 (management perceptions vs. customer expectations) and Gap 9 (service provider perceptions of management perceptions vs. service delivery) were more critical than the others in affecting perceived service quality, making service delivery the main area of improvement. This study contributes an evaluation of the service quality of the Taiwanese hotel industry from the perspectives of customers, service providers, and managers, which is considerably valuable for hotel managers. The aim of this study was to explore all of these together in order to better understand the possible gaps in the hotel industry in Taiwan.

  5. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 14, Aeronautics and Space; Department of Transportation; Licensing; Launch Safety; Flight Safety Analysis. § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert and...

  6. Applying the Land Use Portfolio Model with Hazus to analyse risk from natural hazard events

    USGS Publications Warehouse

    Dinitz, Laura B.; Taketa, Richard A.

    2013-01-01

    This paper describes and demonstrates the integration of two geospatial decision-support systems for natural-hazard risk assessment and management. Hazus is a risk-assessment tool developed by the Federal Emergency Management Agency to identify risks and estimate the severity of risk from natural hazards. The Land Use Portfolio Model (LUPM) is a risk-management tool developed by the U.S. Geological Survey to evaluate plans or actions intended to reduce risk from natural hazards. We analysed three mitigation policies for one earthquake scenario in the San Francisco Bay area to demonstrate the added value of using Hazus and the LUPM together. The demonstration showed that Hazus loss estimates can be input to the LUPM to obtain estimates of losses avoided through mitigation, rates of return on mitigation investment, and measures of uncertainty. Together, they offer a more comprehensive approach to help with decisions for reducing risk from natural hazards.

  7. Meta-analysis of osteoporosis: fracture risks, medication and treatment.

    PubMed

    Liu, W; Yang, L-H; Kong, X-C; An, L-K; Wang, R

    2015-08-01

    Osteoporosis is a brittle-bone disease that can cause fractures, mostly in older men and women. Meta-analysis is the statistical method applied in a framework for assessing results obtained from various research studies conducted over several years. A meta-analysis of osteoporotic fracture risk with medication non-adherence has been described by many researchers to assess bone fracture risk among patients non-adherent versus adherent to therapy for osteoporosis. Osteoporosis therapy reduces the risk of fracture in clinical trials, but real-world adherence to therapy is suboptimal and can reduce the effectiveness of the intervention. Medline, Embase, and CINAHL were searched for these observational studies from 1998 to 2009, and up to 2015. The results of meta-analyses of osteoporosis research on fractures of postmenopausal women and men are presented. The use of bisphosphonate therapy for osteoporosis has been described along with other drugs. The authors, design, studies (women %), years (data), follow-up (wks), fractures (types), and compliance or persistence results from 2004 to 2009 are shown in a brief table. Meta-analysis studies from other researchers on osteoporosis and fractures, medications and treatments have also been reviewed.

  8. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    NASA Astrophysics Data System (ADS)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide. In Mediterranean areas especially, they cause high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well-distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data allow improvement of the understanding of the spatial and temporal occurrence of the process. Since historical documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt changes in the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  9. Applied Drama and the Higher Education Learning Spaces: A Reflective Analysis

    ERIC Educational Resources Information Center

    Moyo, Cletus

    2015-01-01

    This paper explores Applied Drama as a teaching approach in Higher Education learning spaces. The exploration takes a reflective analysis approach by first examining the impact that Applied Drama has had on my career as a Lecturer/Educator/Teacher working in Higher Education environments. My engagement with Applied Drama practice and theory is…

  10. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    ERIC Educational Resources Information Center

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  11. Experiments With Temporal Reasoning Applied To Analysis Of Telemetry Data

    NASA Astrophysics Data System (ADS)

    Perkins, W. A.; Austin, A.

    1987-10-01

    Many applications of expert systems to Space Station automation, such as monitoring, planning, and scheduling, will involve reasoning about attributes of objects at different times. For example, in monitoring, the system must reason about changes in signal parameters over time because causal relationships among events are important. Reasoning efficiently and concurrently about attributes with different values at different times, different time formats, and different time validity conditions requires more complex knowledge representations than are generally available in expert systems. Representation issues dealing with point times, intervals, and relative times must also be resolved. We have implemented a temporal reasoning capability in a generic expert system shell (LES) to address these issues and to increase the flexibility of the knowledge representation for a variety of applications. For its first application, we chose monitoring of telemetry data from a satellite (the Space Telescope). Our work involved just the RCE (Rotor Controlled Electronics) bearing, a component of the reaction-wheels subsystem which has attributes such as ACTUAL-TEMPERATURE of the bearing, WHEEL-SPEED, and MOTOR-CURRENT. This task consists of collecting one attribute value per sensor per cycle, checking each value to see if it is within the acceptable range, and storing each value with a time tag in the database. Processing becomes more complex when one or more readings are out of their acceptable range. The analysis to discover the cause involves examining several cycles of readings, as well as comparing the readings of different sensors over time. The temporal reasoning capability in LES allowed us to compare the most recent readings of two sensors; or to compare one current reading with a value collected some time earlier; or to collect several consecutive readings which are analyzed for trends. In addition, having time tags associated with attribute values permitted us

  12. Debris Flow Risk Management Framework and Risk Analysis in Taiwan, A Preliminary Study

    NASA Astrophysics Data System (ADS)

    Tsao, Ting-Chi; Hsu, Wen-Ko; Chiou, Lin-Bin; Cheng, Chin-Tung; Lo, Wen-Chun; Chen, Chen-Yu; Lai, Cheng-Nong; Ju, Jiun-Ping

    2010-05-01

    Taiwan is located on a seismically active mountain belt between the Philippine Sea plate and the Eurasian plate. After 1999's Chi-Chi earthquake (Mw=7.6), landslides and debris flows occurred frequently. In Aug. 2009, Typhoon Morakot struck Taiwan, and numerous landslide and debris flow events, some with tremendous fatalities, were observed. With limited resources, authorities should establish a disaster management system to cope with slope disaster risks more effectively. Since 2006, Taiwan's authority in charge of debris flow management, the Soil and Water Conservation Bureau (SWCB), has completed the basic investigation and data collection of 1,503 potential debris flow creeks around Taiwan. During 2008 and 2009, a debris flow quantitative risk analysis (QRA) framework, based on the landslide risk management framework of Australia, was proposed and conducted on 106 creeks of the 30 villages with debris flow hazard history. Information and value of several types of elements at risk (bridges, roads, buildings and crops) were gathered and integrated into a GIS layer, and the vulnerability model of each element at risk was applied. Through studying the historical hazard events of the 30 villages, numerical simulations of debris flow hazards with different magnitudes (5, 10, 25, 50, 100 and 200 year return periods) were conducted, and the economic losses and fatalities of each scenario were calculated for each creek. When taking annual exceedance probability into account, the annual total risk of each creek was calculated, and the results were displayed on a debris flow risk map. The number of fatalities and frequency were calculated, and the F-N curves of 106 creeks were provided. For the F-N curves, an individual risk to life per year of 1.0E-04 and a slope of 1, which match international standards, were considered to be an acceptable risk. Applying the results of the 106 creeks onto the F-N curve, they were divided into 3 categories: Unacceptable, ALARP (As Low As Reasonably Practicable) and
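
    As a worked illustration of the annualization step described above, the sketch below converts scenario losses at fixed return periods into an expected annual loss by integrating loss over annual exceedance probability. The loss figures are hypothetical, not values from the study.

    ```python
    import numpy as np

    # Hypothetical per-event losses (million USD) for one creek at the six
    # return periods used in the study; the loss values are invented.
    return_periods = np.array([5., 10., 25., 50., 100., 200.])
    losses = np.array([0.2, 0.8, 2.5, 5.0, 9.0, 15.0])

    aep = 1.0 / return_periods            # annual exceedance probability

    # Expected annual loss: integrate loss over exceedance probability.
    order = np.argsort(aep)               # ascending AEP (rarest scenario first)
    annual_risk = np.trapz(losses[order], aep[order])
    print(f"expected annual loss ~ {annual_risk:.2f} million USD/yr")
    ```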

  13. Comprehensive safeguards evaluation methods and societal risk analysis

    SciTech Connect

    Richardson, J.M.

    1982-03-01

    Essential capabilities of an integrated evaluation methodology for analyzing safeguards systems are discussed. Such a methodology must be conceptually meaningful, technically defensible, discriminating and consistent. A decomposition of safeguards systems by function is mentioned as a possible starting point for methodology development. The application of a societal risk equation to safeguards systems analysis is addressed. Conceptual problems with this approach are discussed. Technical difficulties in applying this equation to safeguards systems are illustrated through the use of confidence intervals, information content, hypothesis testing and ranking and selection procedures.

  14. Improving causal inferences in risk analysis.

    PubMed

    Cox, Louis Anthony Tony

    2013-10-01

    Recent headlines and scientific articles projecting significant human health benefits from changes in exposures too often depend on unvalidated subjective expert judgments and modeling assumptions, especially about the causal interpretation of statistical associations. Some of these assessments are demonstrably biased toward false positives and inflated effects estimates. More objective, data-driven methods of causal analysis are available to risk analysts. These can help to reduce bias and increase the credibility and realism of health effects risk assessments and causal claims. For example, quasi-experimental designs and analysis allow alternative (noncausal) explanations for associations to be tested, and refuted if appropriate. Panel data studies examine empirical relations between changes in hypothesized causes and effects. Intervention and change-point analyses identify effects (e.g., significant changes in health effects time series) and estimate their sizes. Granger causality tests, conditional independence tests, and counterfactual causality models test whether a hypothesized cause helps to predict its presumed effects, and quantify exposure-specific contributions to response rates in differently exposed groups, even in the presence of confounders. Causal graph models let causal mechanistic hypotheses be tested and refined using biomarker data. These methods can potentially revolutionize the study of exposure-induced health effects, helping to overcome pervasive false-positive biases and move the health risk assessment scientific community toward more accurate assessments of the impacts of exposures and interventions on public health.
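
    One of the methods named above, the Granger test, asks whether the history of a hypothesized cause improves prediction of its presumed effect beyond the effect's own history. The sketch below is a minimal, generic implementation of that F-test on synthetic data; it is illustrative only and not the article's analysis.

    ```python
    import numpy as np
    from scipy import stats

    def granger_f_test(y, x, lags=2):
        """F-test: do past values of x improve prediction of y beyond y's own past?"""
        y, x = np.asarray(y, float), np.asarray(x, float)
        n = len(y) - lags
        Y = y[lags:]
        lag = lambda s, k: s[lags - k:len(s) - k]
        # Restricted design: lags of y only; unrestricted adds lags of x.
        Xr = np.column_stack([np.ones(n)] + [lag(y, k) for k in range(1, lags + 1)])
        Xu = np.column_stack([Xr] + [lag(x, k) for k in range(1, lags + 1)])
        rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
        df1, df2 = lags, n - Xu.shape[1]
        F = (rss(Xr) - rss(Xu)) / df1 / (rss(Xu) / df2)
        return F, stats.f.sf(F, df1, df2)

    # Synthetic example where x genuinely leads y by one step.
    rng = np.random.default_rng(0)
    x = rng.normal(size=300)
    y = np.zeros(300)
    for t in range(1, 300):
        y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal(scale=0.5)
    print(granger_f_test(y, x))   # large F, tiny p: x "Granger-causes" y
    ```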

  15. Approach to uncertainty in risk analysis

    SciTech Connect

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  16. A risk analysis of pulmonary complications following major trauma.

    PubMed

    Hoyt, D B; Simons, R K; Winchell, R J; Cushman, J; Hollingsworth-Fridlund, P; Holbrook, T; Fortlage, D

    1993-10-01

    Varying institutional definitions and degrees of surveillance limit awareness of the true incidence of posttraumatic pulmonary complications. Prospective review with standardized definitions of 25 categories of pulmonary complications was applied to a university level I trauma service over 3 years to establish the true incidence. Potential injury-related predictors of individual complications were determined using multiple logistic regression analysis, and adjusted odds ratios were calculated, thereby controlling for the effect of other covariates. Significance was attributed to p < 0.05. Of 3289 patients meeting MTOS criteria, pulmonary complications occurred in 368 (11.2%). Pulmonary complications accounted for one third of all disease complications. Significant associations with pneumonia included age, the presence of shock on admission, significant head injury, and surgery to the head and chest. Significant risk for atelectasis occurred in patients with a blunt injury mechanism, ISS > 16, shock on admission, and severe head injury. Risks for development of respiratory failure included age > 55 years, the mechanism of "pedestrian struck," and the presence of significant head injury. Risk factors for ARDS included surgery to the head and a Trauma Score < 13 on arrival. Significant predictors of pulmonary embolism included ISS > 16, shock on admission, and extremity and pelvis injuries. This kind of analysis establishes the true incidence of pulmonary complications and focuses attention on (1) groups at high risk for developing complications, (2) groups for which current therapeutic modalities are still ineffective, and (3) the need to refocus on prospective research rather than ineffective processes of care.
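
    A compact sketch of the adjusted-odds-ratio computation described above, run on simulated data; the registry field names and effect sizes are invented, not the study's.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Simulated registry extract; variable names and effects are illustrative.
    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({
        "age_over_55":        rng.integers(0, 2, n),
        "shock_on_admission": rng.integers(0, 2, n),
        "severe_head_injury": rng.integers(0, 2, n),
    })
    logit = (-3 + 0.9 * df["age_over_55"] + 1.1 * df["shock_on_admission"]
             + 0.7 * df["severe_head_injury"])
    df["pneumonia"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    # Multiple logistic regression; exponentiated coefficients are the
    # adjusted odds ratios, each controlling for the other covariates.
    X = sm.add_constant(df[["age_over_55", "shock_on_admission", "severe_head_injury"]])
    fit = sm.Logit(df["pneumonia"], X).fit(disp=0)
    print(np.exp(fit.params))      # adjusted odds ratios
    print(np.exp(fit.conf_int()))  # 95% confidence intervals on the OR scale
    ```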

  17. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    PubMed

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases, acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online.
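
    The additive (average-p) combination the authors build on can be sketched in a few lines: under the null hypothesis, the mean of n independent uniform p-values is approximately normal with mean 0.5 and variance 1/(12n). The sketch shows that generic step only, not the full bi-level framework, and is written in Python although the authors distribute R scripts.

    ```python
    import numpy as np
    from scipy.stats import norm

    def additive_combine(pvals):
        """Additive (average-p) combination: under H0 the mean of n independent
        U(0,1) p-values is approximately N(0.5, 1/(12n)) by the CLT."""
        p = np.asarray(pvals, float)
        z = (p.mean() - 0.5) / np.sqrt(1.0 / (12 * len(p)))
        return norm.cdf(z)            # small average p-value -> small combined p

    print(additive_combine([0.04, 0.20, 0.01, 0.08]))   # ~0.002
    ```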

  18. A novel bi-level meta-analysis approach: applied to biological pathway analysis

    PubMed Central

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-01-01

    Motivation: The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. Results: We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases, acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. Availability and implementation: The R scripts are available on demand from the authors. Contact: sorin@wayne.edu Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:26471455

  19. Beyond Time Out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2010-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  20. Beyond Time out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2012-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  1. Anticipating risk for human subjects participating in clinical research: application of Failure Mode and Effects Analysis.

    PubMed

    Cody, Robert J

    2006-03-01

    Failure Mode and Effects Analysis (FMEA) is a method applied in various industries to anticipate and mitigate risk. This methodology can be more systematically applied to the protection of human subjects in research. The purpose of FMEA is simple: prevent problems before they occur. By applying FMEA process analysis to the elements of a specific research protocol, the failure severity, occurrence, and detection rates can be estimated for calculation of a "risk priority number" (RPN). Methods can then be identified to reduce the RPN to levels where the risk/benefit ratio favors human subject benefit, to a greater magnitude than existed in the pre-analysis risk profile. At the very least, the approach provides a checklist of issues that can be individualized for specific research protocols or human subject populations.
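
    The RPN arithmetic described above is simple enough to show concretely. A minimal sketch with hypothetical failure modes and made-up 1-10 scores (none of these come from the article):

    ```python
    def rpn(severity: int, occurrence: int, detection: int) -> int:
        """Risk Priority Number: each factor scored 1-10, 10 being worst
        (most severe, most frequent, hardest to detect)."""
        return severity * occurrence * detection

    # Hypothetical failure modes for a research protocol; scores are invented.
    failure_modes = {
        "consent form misunderstood":  rpn(7, 4, 3),   # 84
        "dose miscalculation":         rpn(9, 2, 5),   # 90
        "missed adverse-event report": rpn(8, 3, 6),   # 144
    }
    for mode, score in sorted(failure_modes.items(), key=lambda kv: -kv[1]):
        print(f"{score:4d}  {mode}")    # mitigate the highest RPN first
    ```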

  2. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for the evaluation of the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: What is the probability of radionuclide atmospheric transport and impact on different neighbouring regions and countries in case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups taking into account social-geophysical factors and probabilities, and using demographic databases based on GIS analysis.
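
    Step (iv), the probability fields, reduces to counting how often trajectories reach each grid cell. A toy sketch of that counting step (coordinates, spreads, and grid are invented):

    ```python
    import numpy as np

    # Invented endpoints (lon, lat) of 10,000 forward trajectories from one
    # site; the probability field is the fraction of trajectories per cell.
    rng = np.random.default_rng(2)
    lon = 30 + rng.normal(5, 8, size=10_000)
    lat = 60 + rng.normal(2, 4, size=10_000)

    counts, lon_edges, lat_edges = np.histogram2d(
        lon, lat, bins=(36, 24), range=[[0, 72], [48, 72]])
    prob_field = counts / len(lon)     # impact probability per grid cell
    print(prob_field.max())            # most-impacted cell
    ```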

  3. Improving Patient Prostate Cancer Risk Assessment: Moving From Static, Globally-Applied to Dynamic, Practice-Specific Cancer Risk Calculators

    PubMed Central

    Strobl, Andreas N.; Vickers, Andrew J.; Van Calster, Ben; Steyerberg, Ewout; Leach, Robin J.; Thompson, Ian M.; Ankerst, Donna P.

    2015-01-01

    Clinical risk calculators are now widely available but have generally been implemented in a static and one-size-fits-all fashion. The objective of this study was to challenge these notions and show via a case study concerning risk-based screening for prostate cancer how calculators can be dynamically and locally tailored to improve on-site patient accuracy. Yearly data from five international prostate biopsy cohorts (3 in the US, 1 in Austria, 1 in England) were used to compare 6 methods for annual risk prediction: static use of the online US-developed Prostate Cancer Prevention Trial Risk Calculator (PCPTRC); recalibration of the PCPTRC; revision of the PCPTRC; building a new model each year using logistic regression, Bayesian prior-to-posterior updating, or random forests. All methods performed similarly with respect to discrimination, except for random forests, which were worse. All methods except for random forests greatly improved calibration over the static PCPTRC in all cohorts except for Austria, where the PCPTRC had the best calibration followed closely by recalibration. The case study shows that a simple annual recalibration of a general online risk tool for prostate cancer can improve its accuracy with respect to the local patient practice at hand. PMID:25989018
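
    The simple annual recalibration the study recommends can be sketched as a logistic regression of local outcomes on the static calculator's log-odds, refitting only the intercept and slope. The data below are simulated; nothing here reproduces the PCPTRC itself.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Simulated yearly batch: lp is the log-odds produced by the static risk
    # calculator for each local biopsy, y the observed cancer outcome.
    rng = np.random.default_rng(3)
    lp = rng.normal(-1.0, 1.2, size=800)
    y = (rng.random(800) < 1 / (1 + np.exp(-(0.4 + 0.8 * lp)))).astype(int)

    # Recalibration: refit only intercept and slope on the original log-odds,
    # leaving the risk factors and their relative weights untouched.
    a, b = sm.Logit(y, sm.add_constant(lp)).fit(disp=0).params
    recalibrated_risk = 1 / (1 + np.exp(-(a + b * lp)))
    print(f"intercept {a:.2f}, slope {b:.2f}")   # ~0.4 and ~0.8 here
    ```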

  4. Applying a Generic Juvenile Risk Assessment Instrument to a Local Context: Some Practical and Theoretical Lessons

    ERIC Educational Resources Information Center

    Miller, Joel; Lin, Jeffrey

    2007-01-01

    This article examines issues raised by the application of a generic actuarial juvenile risk instrument (the Model Risk Assessment Instrument) to New York City, a context different from the one in which it was developed. It describes practical challenges arising from the constraints of locally available data and local sensibilities and highlights…

  5. RADON EXPOSURE ASSESSMENT AND DOSIMETRY APPLIED TO EPIDEMIOLOGY AND RISK ESTIMATION

    EPA Science Inventory

    Epidemiological studies of underground miners provide the primary basis for radon risk estimates for indoor exposures as well as mine exposures. A major source of uncertainty in these risk estimates is the uncertainty in radon progeny exposure estimates for the miners. In addit...

  6. Modeling Opponents in Adversarial Risk Analysis.

    PubMed

    Rios Insua, David; Banks, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decision makers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms; for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how, as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decisions by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents.
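
    The Bayesian model averaging step can be illustrated with a toy opponent who chooses among three actions. The three rationality models and their action distributions below are invented for illustration; they are not the article's models.

    ```python
    import numpy as np

    # Three invented rationality models for the opponent, each a distribution
    # over the same three observable actions.
    models = {
        "random":    np.array([1/3, 1/3, 1/3]),
        "nash-like": np.array([0.70, 0.20, 0.10]),
        "level-1":   np.array([0.10, 0.30, 0.60]),
    }
    prior = {m: 1/3 for m in models}
    observed = [0, 0, 1, 0]            # indices of the opponent's past actions

    # Posterior over models: P(m | data) proportional to P(m) * prod P(a | m).
    post = {m: prior[m] * np.prod([models[m][a] for a in observed]) for m in models}
    total = sum(post.values())
    post = {m: v / total for m, v in post.items()}

    # Model-averaged forecast of the opponent's next action.
    forecast = sum(post[m] * models[m] for m in models)
    print(post, forecast.round(3))     # weight shifts toward "nash-like"
    ```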

  7. Applying a forensic actuarial assessment (the Violence Risk Appraisal Guide) to nonforensic patients.

    PubMed

    Harris, Grant T; Rice, Marnie E; Camilleri, Joseph A

    2004-09-01

    The actuarial Violence Risk Appraisal Guide (VRAG) was developed for male offenders where it has shown excellent replicability in many new forensic samples using officially recorded outcomes. Clinicians also make decisions, however, about the risk of interpersonal violence posed by nonforensic psychiatric patients of both sexes. Could an actuarial risk assessment developed for male forensic populations be used for a broader clientele? We modified the VRAG to permit evaluation using data from the MacArthur Violence Risk Assessment Study that included nonforensic male and female patients and primarily self-reported violence. The modified VRAG yielded a large effect size in the prediction of dichotomous postdischarge severe violence over 20 and 50 weeks. Accuracy of VRAG predictions was unrelated to sex. The results provide evidence about the robustness of comprehensive actuarial risk assessments and the generality of the personal factors that underlie violent behavior.

  8. Applying Costs, Risks and Values Evaluation (CRAVE) methodology to Engineering Support Request (ESR) prioritization

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla N.

    1994-01-01

    Given a limited budget, the problem of prioritization among Engineering Support Requests (ESRs) with varied sizes, shapes, and colors is a difficult one. At the Kennedy Space Center (KSC), the recently developed 4-Matrix (4-M) method represents a step in the right direction as it attempts to combine the traditional criteria of technical merits with the new concern for cost-effectiveness. However, the 4-M method was not adequately successful in the actual prioritization of ESRs for fiscal year 1995 (FY95). This research identifies a number of design issues that should help us to develop better methods. It emphasizes that, given the variety and diversity of ESRs, one should not expect a single method to help in the assessment of all ESRs. One conclusion is that a methodology such as Costs, Risks, and Values Evaluation (CRAVE) should be adopted. It also is clear that the development of methods such as 4-M requires input not only from engineers with technical expertise in ESRs but also from personnel with an adequate background in the theory and practice of cost-effectiveness analysis. At KSC, ESR prioritization is one part of the Ground Support Working Teams (GSWT) Integration Process. It was discovered that the more important barriers to the incorporation of cost-effectiveness considerations in ESR prioritization lie in this process. The culture of integration, and the corresponding structure of review by a committee of peers, is not conducive to the analysis and confrontation necessary in the assessment and prioritization of ESRs. Without assistance from appropriately trained analysts charged with the responsibility to analyze and be confrontational about each ESR, the GSWT steering committee will continue to make its decisions based on incomplete understanding, inconsistent numbers, and at times, colored facts. The current organizational separation of the prioritization and the funding processes is also identified as an important barrier to the

  9. Risk Analysis Related to Quality Management Principles

    NASA Astrophysics Data System (ADS)

    Vykydal, David; Halfarová, Petra; Nenadál, Jaroslav; Plura, Jiří; Hekelová, Edita

    2012-12-01

    Efficient and effective implementation of quality management principles demands a responsible approach from top managers. A study of the current state of affairs in Czech organizations reveals many shortcomings in this field that translate into varied managerial risks. The article identifies and analyses some of them and gives short guidance on appropriate treatment. The text of the article reflects the authors' experience as well as knowledge obtained from the systematic analysis of industrial companies' environments.

  10. Application of Risk Analysis: Response from a Systems Division,

    DTIC Science & Technology

    A review of theoretical literature reveals that most technical aspects of risk analysis have become a reasonably well-defined process with many... risk analysis in order to enhance its application. Also needed are better tools to enhance use of both subjective judgment and group decision processes... hope that it would lead to increased application of risk analysis in the acquisition process.

  11. Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids (Final Report)

    EPA Science Inventory

    Millions of tons of treated sewage sludges or "biosolids" are applied annually to f...

  12. Big Data Usage Patterns in the Health Care Domain: A Use Case Driven Approach Applied to the Assessment of Vaccination Benefits and Risks

    PubMed Central

    Liyanage, H.; Liaw, S-T.; Kuziemsky, C.; Mold, F.; Krause, P.; Fleming, D.; Jones, S.

    2014-01-01

    Background: Generally, benefits and risks of vaccines can be determined from studies carried out as part of regulatory compliance, followed by surveillance of routine data; however, there are some rarer and more long-term events that require new methods. Big data generated by increasingly affordable personalised computing and by pervasive computing devices is rapidly growing, and low-cost, high-volume cloud computing makes the processing of these data inexpensive. Objective: To describe how big data and related analytical methods might be applied to assess the benefits and risks of vaccines. Method: We reviewed the literature on the use of big data to improve health, applied to generic vaccine use cases that illustrate benefits and risks of vaccination. We defined a use case as the interaction between a user and an information system to achieve a goal. We used flu vaccination and pre-school childhood immunisation as exemplars. Results: We reviewed three big data use cases relevant to assessing vaccine benefits and risks: (i) big data processing using crowd-sourcing, distributed big data processing, and predictive analytics; (ii) data integration from heterogeneous big data sources, e.g. the increasing range of devices in the "internet of things"; and (iii) real-time monitoring for the direct monitoring of epidemics as well as vaccine effects via social media and other data sources. Conclusions: Big data raises new ethical dilemmas, though its analysis methods can bring complementary real-time capabilities for monitoring epidemics and assessing vaccine benefit-risk balance. PMID:25123718

  13. Applying Full Spectrum Analysis to a Raman Spectroscopic Assessment of Fracture Toughness of Human Cortical Bone.

    PubMed

    Makowski, Alexander J; Granke, Mathilde; Ayala, Oscar D; Uppuganti, Sasidhar; Mahadevan-Jansen, Anita; Nyman, Jeffry S

    2017-10-01

    A decline in the inherent quality of bone tissue is a contributor to the age-related increase in fracture risk. Although this is well known, the important biochemical factors of bone quality have yet to be identified using Raman spectroscopy (RS), a nondestructive, inelastic light-scattering technique. To identify potential RS predictors of fracture risk, we applied principal component analysis (PCA) to 558 Raman spectra (370-1720 cm⁻¹) of human cortical bone acquired from 62 female and male donors (nine spectra each) spanning adulthood (age range = 21-101 years). Spectra were analyzed prior to R-curve, nonlinear fracture mechanics that delineate crack initiation (Kinit) from crack growth toughness (Kgrow). The traditional ν1 phosphate peak per amide I peak (mineral-to-matrix ratio) weakly correlated with Kinit (r = 0.341, p = 0.0067) and overall crack growth toughness (J-int: r = 0.331, p = 0.0086). Sub-peak ratios of the amide I band that are related to the secondary structure of type 1 collagen did not correlate with the fracture toughness properties. In the full spectrum analysis, one principal component (PC5) correlated with all of the mechanical properties (Kinit: r = -0.467, Kgrow: r = -0.375, and J-int: r = -0.428; p < 0.0067). More importantly, when known predictors of fracture toughness, namely age and/or volumetric bone mineral density (vBMD), were included in general linear models as covariates, several PCs helped explain 45.0% (PC5) to 48.5% (PC7), 31.4% (PC6), and 25.8% (PC7) of the variance in Kinit, Kgrow, and J-int, respectively. Deriving spectral features from full spectrum analysis may improve the ability of RS, a clinically viable technology, to assess fracture risk.
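
    A schematic of the full-spectrum workflow, PCA scores entered into a linear model alongside age and vBMD, on random stand-in data. The array shapes mirror the abstract, but every value is simulated; this is not the study's pipeline.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from sklearn.decomposition import PCA

    # Random stand-ins: one averaged spectrum per donor, toughness outcome
    # Kinit, covariates age and vBMD. All values are simulated.
    rng = np.random.default_rng(4)
    spectra = rng.normal(size=(62, 1351))      # 370-1720 cm^-1, 1 cm^-1 steps
    age = rng.uniform(21, 101, size=62)
    vbmd = rng.normal(1000, 100, size=62)
    k_init = rng.normal(4.0, 0.5, size=62)

    # Full-spectrum PCA, then a linear model of toughness on one PC score
    # with age and vBMD entered as covariates.
    scores = PCA(n_components=10).fit_transform(spectra)
    X = sm.add_constant(np.column_stack([age, vbmd, scores[:, 4]]))   # PC5
    fit = sm.OLS(k_init, X).fit()
    print(fit.params, fit.pvalues)
    ```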

  14. IT-OSRA: applying ensemble simulations to estimate the oil spill risk associated to operational and accidental oil spills

    NASA Astrophysics Data System (ADS)

    Sepp Neves, Antonio Augusto; Pinardi, Nadia; Martins, Flavio

    2016-08-01

    Oil Spill Risk Assessments (OSRAs) are widely employed to support decision making regarding oil spill risks. This article adapts the ISO-compliant OSRA framework developed by Sepp Neves et al. (J Environ Manag 159:158-168, 2015) to estimate risks in a complex scenario where uncertainties exist about the meteo-oceanographic conditions and about where and how a spill could happen, and where the risk computation methodology (ensemble oil spill modeling) is not yet well established. The improved method was applied to the Algarve coast, Portugal. Over 50,000 simulations were performed in 2 ensemble experiments to estimate the risks due to operational and accidental spill scenarios associated with maritime traffic. The level of risk was found to be important for both types of scenarios, with significant seasonal variations due to the variability of currents and waves. Higher-frequency variability in the meteo-oceanographic variables was also found to contribute to the level of risk. The ensemble results show that the distribution of oil concentrations found on the coast is not Gaussian, opening up new fields of research on how to deal with oil spill risks and related uncertainties.
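
    Because the ensemble's coastal concentrations are non-Gaussian, percentiles and exceedance probabilities summarize risk better than a mean and standard deviation. A toy sketch of that summary step (the distribution parameters and threshold are invented):

    ```python
    import numpy as np

    # Invented ensemble: oil reaching one coastal segment in each of 50,000 runs.
    rng = np.random.default_rng(5)
    conc = rng.lognormal(mean=1.0, sigma=1.5, size=50_000)

    threshold = 50.0                          # illustrative impact threshold
    print("P(exceedance):", (conc > threshold).mean())
    print("median, 95th percentile:", np.percentile(conc, [50, 95]))
    ```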

  15. School attendance, health-risk behaviors, and self-esteem in adolescents applying for working papers.

    PubMed Central

    Suss, A. L.; Tinkelman, B. K.; Freeman, K.; Friedman, S. B.

    1996-01-01

    Since health-risk behaviors are often encountered in clusters among adolescents, it was hypothesized that adolescents with poor school attendance would exhibit more health-risk behaviors (e.g., substance use, violence) than those who attend school regularly. This study assessed the relationship between poor school attendance and health-risk behaviors, and described health-risk behaviors and self-esteem among adolescents seeking employment. In this cross-sectional study, school attendance (poor vs. regular attendance) was related to health-risk behaviors by asking 122 subjects seen at a New York City Working Papers Clinic to complete both a 72-item questionnaire about their health-risk behaviors and the 58-item Coopersmith Self-Esteem School Form Inventory. Chi-square and Fisher's Exact Tests were performed. The poor and regular attenders of school differed significantly in only 5 out of 44 items pertaining to health-risk behaviors. Self-esteem measures for the two groups did not differ from one another or from national norms. In this sample, depression "in general" (global) and "at home," but not "at school," was associated significantly with suicidal thoughts/attempts and serious past life events (e.g., family conflict, sexual abuse). There were no significant associations between depression or self-esteem and illicit substance or alcohol use. We found few associations between poor school attendance and health-risk behaviors in this sample of employment-seeking adolescents. The poor and regular attenders of school were similar in most aspects of their health-risk behaviors and self-esteem. PMID:8982520

  16. Applying the Analytic Hierarchy Process to Oil Sands Environmental Compliance Risk Management

    NASA Astrophysics Data System (ADS)

    Roux, Izak Johannes, III

    Oil companies in Alberta, Canada, invested $32 billion in new oil sands projects in 2013. Despite the size of this investment, there is a demonstrable deficiency in the uniformity and understanding of environmental legislation requirements, which manifests as increased project compliance risk. This descriptive study developed 2 prioritized lists of environmental regulatory compliance risks and mitigation strategies and used multi-criteria decision theory for its theoretical framework. Information from compiled lists of environmental compliance risks and mitigation strategies was used to generate a specialized pairwise survey, which was piloted by 5 subject matter experts (SMEs). The survey was validated by a sample of 16 SMEs, after which the Analytic Hierarchy Process (AHP) was used to rank a total of 33 compliance risks and 12 mitigation strategy criteria. A key finding was that the AHP is a suitable tool for ranking compliance risks and mitigation strategies. Several working hypotheses were also tested regarding how SMEs prioritized one compliance risk or mitigation strategy over another. The AHP showed that regulatory compliance, company reputation, environmental compliance, and economics ranked the highest, and that a multi-criteria mitigation strategy for environmental compliance ranked the highest among strategies. The study results will inform Alberta oil sands industry leaders about the ranking and utility of specific compliance risks and mitigation strategies, enabling them to focus on actions that will generate legislative and public trust. Oil sands leaders implementing a risk management program using the risks and mitigation strategies identified in this study will contribute to environmental conservation, economic growth, and positive social change.
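
    The core AHP computation, deriving priority weights from a pairwise comparison matrix and checking its consistency, fits in a few lines. The 3x3 matrix below is a made-up example on Saaty's 1-9 scale, not data from the study.

    ```python
    import numpy as np

    # Invented 3x3 pairwise comparison matrix on Saaty's 1-9 scale:
    # A[i, j] is how much more important risk i is than risk j.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                               # priority weights

    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)       # consistency index
    cr = ci / 0.58                             # 0.58 = Saaty's random index, n = 3
    print("weights:", w.round(3), " consistency ratio:", round(cr, 3))
    # A consistency ratio below ~0.1 is conventionally considered acceptable.
    ```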

  17. Inclusive Elementary Classroom Teacher Knowledge of and Attitudes toward Applied Behavior Analysis and Autism Spectrum Disorder and Their Use of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    McCormick, Jennifer A.

    2011-01-01

    The purpose of this study was to examine inclusive elementary teacher knowledge and attitude toward Autism Spectrum Disorder (ASD) and applied behavior analysis (ABA) and their use of ABA. Furthermore, this study examined if knowledge and attitude predicted use of ABA. A survey was developed and administered through a web-based program. Of the…

  18. [Future built-up area zoning by applying the methodology for assessing the population health risk].

    PubMed

    Bobkova, T E

    2009-01-01

    Application of the methodology for assessing the population health risk yielded proposals on the functional zoning of the reorganized area of a plastics works. An area has been allocated for possible house-building.

  19. Applying the emergency risk management process to tackle the crisis of antibiotic resistance

    PubMed Central

    Dominey-Howes, Dale; Bajorek, Beata; Michael, Carolyn A.; Betteridge, Brittany; Iredell, Jonathan; Labbate, Maurizio

    2015-01-01

    We advocate that antibiotic resistance be reframed as a disaster risk management problem. Antibiotic-resistant infections represent a risk to life as significant as other commonly occurring natural disasters (e.g., earthquakes). Despite efforts by global health authorities, antibiotic resistance continues to escalate. Therefore, new approaches and expertise are needed to manage the issue. In this perspective we: (1) make a call for the emergency management community to recognize the antibiotic resistance risk and join in addressing this problem; (2) suggest using the risk management process to help tackle antibiotic resistance; (3) show why this approach has value and why it is different to existing approaches; and (4) identify public perception of antibiotic resistance as an important issue that warrants exploration. PMID:26388864

  20. Applying the emergency risk management process to tackle the crisis of antibiotic resistance.

    PubMed

    Dominey-Howes, Dale; Bajorek, Beata; Michael, Carolyn A; Betteridge, Brittany; Iredell, Jonathan; Labbate, Maurizio

    2015-01-01

    We advocate that antibiotic resistance be reframed as a disaster risk management problem. Antibiotic-resistant infections represent a risk to life as significant as other commonly occurring natural disasters (e.g., earthquakes). Despite efforts by global health authorities, antibiotic resistance continues to escalate. Therefore, new approaches and expertise are needed to manage the issue. In this perspective we: (1) make a call for the emergency management community to recognize the antibiotic resistance risk and join in addressing this problem; (2) suggest using the risk management process to help tackle antibiotic resistance; (3) show why this approach has value and why it is different to existing approaches; and (4) identify public perception of antibiotic resistance as an important issue that warrants exploration.

  1. Applying Risk Society Theory to findings of a scoping review on caregiver safety.

    PubMed

    Macdonald, Marilyn; Lang, Ariella

    2014-03-01

    Chronic illness represents a growing concern in the western world, and individuals living with chronic illness are primarily managed at home by family caregivers. A scoping review of the home-care literature (2004-2009; updated with review articles from 2010 to January 2013) on the topic of the caregiver revealed that this group experiences the following safety-related concerns: caregivers are conscripted to the role, experience economic hardship, risk being abused as well as abusing, and may well become patients themselves. The methodology and methods used in the scoping review are presented, as well as a brief overview of the findings. The concepts of risk and safety are defined. Risk Society Theory is introduced and used as a lens to view the findings and to contribute to an understanding of the construction of risk in contemporary health care.

  2. INDICATORS OF RISK: AN ANALYSIS APPROACH FOR IMPROVED RIVER MANAGEMENT

    EPA Science Inventory

    A risk index is an approach to measuring the level of risk to the plants and/or animals (biota) in a certain area using water and habitat quality information. A new technique for developing risk indices was applied to data collected from Mid-Atlantic streams of the U.S. during 1...

  4. The Veterans Affairs Cardiac Risk Score: Recalibrating the Atherosclerotic Cardiovascular Disease Score for Applied Use.

    PubMed

    Sussman, Jeremy B; Wiitala, Wyndy L; Zawistowski, Matthew; Hofer, Timothy P; Bentley, Douglas; Hayward, Rodney A

    2017-09-01

    Accurately estimating cardiovascular risk is fundamental to good decision making in cardiovascular disease (CVD) prevention, but risk scores developed in one population often perform poorly in dissimilar populations. We sought to examine whether a large integrated health system can use its electronic health data to better predict individual patients' risk of developing CVD. We created a cohort using all patients ages 45-80 who used Department of Veterans Affairs (VA) ambulatory care services in 2006 with no history of CVD, heart failure, or loop diuretic use. Our outcome variable was new-onset CVD in 2007-2011. We then developed a series of recalibrated scores, including a fully refit "VA Risk Score-CVD (VARS-CVD)." We tested the different scores using standard measures of prediction quality. For the 1,512,092 patients in the study, the atherosclerotic cardiovascular disease (ASCVD) risk score had similar discrimination as the VARS-CVD (c-statistic of 0.66 in men and 0.73 in women), but the ASCVD model had poor calibration, predicting 63% more events than observed. Calibration was excellent in the fully recalibrated VARS-CVD tool, but simpler techniques tested proved less reliable. We found that local electronic health record data can be used to estimate CVD risk better than an established risk score based on research populations. Recalibration improved estimates dramatically, and the type of recalibration was important. Such tools can also easily be integrated into a health system's electronic health record and can be more readily updated.
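
    The two headline metrics above, discrimination (c-statistic) and calibration (expected-to-observed event ratio), can be computed directly from predicted risks and outcomes. A toy sketch that also reproduces an overprediction of roughly 63%; all numbers are simulated, not VA data.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    # Simulated predicted 5-year CVD risks (systematically too high) and outcomes.
    rng = np.random.default_rng(7)
    pred = np.clip(rng.beta(2, 10, size=5000) * 1.6, 0, 1)
    obs = (rng.random(5000) < pred / 1.63).astype(int)

    print("c-statistic:", round(roc_auc_score(obs, pred), 3))   # discrimination
    print("E/O ratio:", round(pred.sum() / obs.sum(), 2))       # ~1.6: overprediction
    ```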

  5. Quantitative microbial risk assessment applied to irrigation of salad crops with waste stabilization pond effluents.

    PubMed

    Pavione, D M S; Bastos, R K X; Bevilacqua, P D

    2013-01-01

    A quantitative microbial risk assessment model for estimating infection risks arising from consuming crops eaten raw that have been irrigated with effluents from stabilization ponds was constructed. A log-normal probability distribution function was fitted to a large database from a comprehensive monitoring of an experimental pond system to account for variability in Escherichia coli concentration in irrigation water. Crop contamination levels were estimated using predictive models derived from field experiments involving the irrigation of several crops with different effluent qualities. Data on daily intake of salad crops were obtained from a national survey in Brazil. Ten-thousand-trial Monte Carlo simulations were used to estimate human health risks associated with the use of wastewater for irrigating low- and high-growing crops. The use of effluents containing 10³-10⁴ E. coli per 100 ml resulted in median rotavirus infection risks of approximately 10⁻³ and 10⁻⁴ pppy when irrigating, respectively, low- and high-growing crops; the corresponding 95th percentile risk estimates were around 10⁻² in both scenarios. Sensitivity analyses revealed that variations in effluent quality, in the assumed ratios of pathogens to E. coli, and in the reduction of pathogens between harvest and consumption had great impact upon risk estimates.
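
    A minimal Monte Carlo sketch of this kind of QMRA chain. The lognormal parameters and every exposure factor below are illustrative placeholders, not the study's fitted values; the beta-Poisson parameters for rotavirus are the ones commonly cited in the QMRA literature.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    trials = 10_000

    # E. coli in irrigation water (per 100 ml), log-normal; the location and
    # scale here are illustrative, not the study's fitted parameters.
    ecoli = rng.lognormal(mean=np.log(5e3), sigma=1.0, size=trials)

    # Illustrative exposure chain: pathogen:E. coli ratio (1e-5), ml of water
    # retained per gram of crop (0.1), grams eaten per day (10), fraction
    # surviving between harvest and consumption (0.1).
    dose = (ecoli / 100.0) * 1e-5 * 0.1 * 10 * 0.1

    # Approximate beta-Poisson dose-response for rotavirus.
    alpha, n50 = 0.253, 6.17
    p_day = 1 - (1 + (dose / n50) * (2 ** (1 / alpha) - 1)) ** (-alpha)
    p_year = 1 - (1 - p_day) ** 365     # per-person-per-year infection risk

    print("median:", np.median(p_year), " 95th pct:", np.percentile(p_year, 95))
    ```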

  6. The acquired preparedness risk model applied to smoking in 5th grade children.

    PubMed

    Combs, Jessica L; Spillane, Nichea S; Caudill, Leann; Stark, Brittany; Smith, Gregory T

    2012-03-01

    The very early onset of smoking predicts numerous health problems. The authors conducted the first test of one risk model for elementary school age smoking, known as the acquired preparedness (AP) model of risk, in a cross-sectional sample of 309 5th grade children. The model posits that (a) impulsivity-related personality traits contribute to risk for a variety of risky, maladaptive behaviors; (b) smoking expectancies confer risk only for smoking; and (c) the personality traits contribute to the formation of high risk expectancies for reinforcement from smoking, which in turn increases the likelihood of early onset smoking. The model was supported: the high-risk personality traits distinguished children engaging in any risky, maladaptive behavior from other children, and the smoking expectancies differentiated smokers from all other children. The relationship between personality tendencies to act rashly when experiencing intense positive or negative emotions and smoker status was partially mediated by expectancies for reinforcement from smoking. This model should be investigated longitudinally.
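
    The partial mediation claim, traits to expectancies to smoking, is typically checked with a product-of-coefficients (a*b) estimate; in practice one would also bootstrap a confidence interval for a*b. A minimal sketch on simulated data with invented effect sizes; this is not the study's analysis.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Simulated data: X = urgency-like trait, M = smoking expectancies,
    # Y = smoker status. Effect sizes are invented.
    rng = np.random.default_rng(8)
    n = 309
    x = rng.normal(size=n)
    m = 0.5 * x + rng.normal(size=n)
    y = (rng.random(n) < 1 / (1 + np.exp(-(-2 + 0.6 * m + 0.3 * x)))).astype(int)

    # Path a: trait -> expectancies; path b: expectancies -> smoking, given trait.
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]
    b = sm.Logit(y, sm.add_constant(np.column_stack([m, x]))).fit(disp=0).params[1]
    print("indirect effect a*b =", round(a * b, 3))   # > 0 suggests mediation
    ```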

  7. Radiation Leukemogenesis: Applying Basic Science of Epidemiological Estimates of Low Dose Risks and Dose-Rate Effects

    SciTech Connect

    Hoel, D. G.

    1998-11-01

    The next stage of work has been to examine more closely the A-bomb leukemia data, which provide the underpinnings of the risk estimation of CML in the above-mentioned manuscript. The paper by Hoel and Li (Health Physics 75:241-50) shows how the linear-quadratic model has basic non-linearities in the low dose region for the leukemias, including CML. Pierce et al. (Radiation Research 123:275-84) have developed distributions for the uncertainty in the estimated exposures of the A-bomb cohort. Kellerer et al. (Radiation and Environmental Biophysics 36:73-83) have further considered possible errors in the estimated neutron values, allowed RBE to change with dose, and hypothesized that the tumor response due to gamma may not be linear. We have incorporated his neutron model and have constructed new A-bomb doses based on his model adjustments. The Hoel and Li dose-response analysis has also been applied using the Kellerer neutron dose adjustments for the leukemias. Finally, Pierce's dose uncertainties, Kellerer's neutron adjustments, and the RBE varying with dose as suggested by Rossi and Zaider are combined and used for the leukemia dose-response analysis. First, the results of Hoel and Li showing a significantly improved fit of the linear-quadratic dose response by the inclusion of a threshold (i.e., low-dose nonlinearity) persisted. This work has been completed for both solid tumors and leukemia, for both mortality and incidence data. The results are given in the manuscript described below, which has been submitted to Health Physics.
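
    The threshold-augmented linear-quadratic fit referenced above can be sketched as a nonlinear least-squares problem: excess risk is zero below a threshold tau and alpha*x + beta*x^2 above it. The dose-response points below are invented for illustration, not A-bomb cohort data.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def lq_threshold(d, tau, alpha, beta):
        """Linear-quadratic excess risk with a low-dose threshold tau:
        zero below tau, alpha*x + beta*x**2 above it, where x = d - tau."""
        x = np.clip(d - tau, 0.0, None)
        return alpha * x + beta * x ** 2

    # Invented dose (Sv) / excess-relative-risk points for illustration only.
    dose = np.array([0.01, 0.05, 0.1, 0.2, 0.5, 1.0, 2.0])
    err = np.array([0.00, 0.01, 0.04, 0.10, 0.35, 0.90, 2.60])

    popt, _ = curve_fit(lq_threshold, dose, err, p0=[0.05, 0.5, 0.4],
                        bounds=([0, 0, 0], [0.5, 10, 10]))
    print(dict(zip(["tau", "alpha", "beta"], popt.round(3))))
    ```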

  8. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    PubMed

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009.
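
    The nonparametric Kaplan-Meier estimator the paper adopts has a compact form: at each distance where a fault occurs, survival is multiplied by (1 - events/at-risk), and fault-free missions enter as censored observations. A minimal sketch with invented mission records, not Autosub3 data:

    ```python
    import numpy as np

    def kaplan_meier(distance, failed):
        """Kaplan-Meier survival over mission distance. `failed` marks missions
        ending in a loss-relevant fault; safe missions are right-censored."""
        distance = np.asarray(distance, float)
        failed = np.asarray(failed, bool)
        surv, curve = 1.0, []
        for d in np.unique(distance[failed]):
            at_risk = np.sum(distance >= d)          # missions reaching distance d
            events = np.sum((distance == d) & failed)
            surv *= 1.0 - events / at_risk
            curve.append((d, surv))
        return curve

    missions = [120, 200, 200, 350, 400, 500, 650]   # km travelled (invented)
    faults   = [False, True, False, False, True, False, False]
    print(kaplan_meier(missions, faults))            # [(200, 0.833), (400, 0.556)]
    ```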

  9. An Emerging New Risk Analysis Science: Foundations and Implications.

    PubMed

    Aven, Terje

    2017-09-07

    To solve real-life problems, such as those related to technology, health, security, or climate change, and make suitable decisions, risk is nearly always a main issue. Different types of sciences often support the work, for example, statistics, natural sciences, and social sciences. Risk analysis approaches and methods are also commonly used, but risk analysis is not broadly accepted as a science in itself. A key problem is the lack of explanatory power and large uncertainties when assessing risk. This article presents an emerging new risk analysis science based on novel ideas and theories on risk analysis developed in recent years by the risk analysis community. It builds on a fundamental change in thinking, from the search for accurate predictions and risk estimates to knowledge generation related to concepts, theories, frameworks, approaches, principles, methods, and models to understand, assess, characterize, communicate, and (in a broad sense) manage risk. Examples are used to illustrate the importance of this distinct/separate risk analysis science for solving risk problems, supporting science in general and other disciplines in particular.

  10. Applying Transactional Analysis and Personality Assessment to Improve Patient Counseling and Communication Skills

    PubMed Central

    Lawrence, Lesa

    2007-01-01

    Objective: To teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling to improve communication. Design: A lecture series for a required pharmacy communications class was developed to teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling. Students were asked to apply these techniques and to report their experiences. A personality self-assessment was also conducted. Assessment: After attending the lecture series, students were able to apply the techniques and demonstrated an understanding of the psychological factors that may affect patient communication, an appreciation for the diversity created by different personality types, the ability to engage patients based on adult-to-adult interaction cues, and the ability to adapt the interactive patient counseling model to different personality traits. Conclusion: Students gained a greater awareness of transactional analysis and personality assessment by applying these concepts. This understanding will help students communicate more effectively with patients. PMID:17786269

  11. Risk assessment framework of fate and transport models applied to hazardous waste sites

    SciTech Connect

    Hwang, S.T.

    1993-06-01

    Risk assessment is an increasingly important part of the decision-making process in the cleanup of hazardous waste sites. Despite guidelines from regulatory agencies and considerable research efforts to reduce uncertainties in risk assessments, there are still many issues unanswered. This paper presents new research results pertaining to fate and transport models, which will be useful in estimating exposure concentrations and will help reduce uncertainties in risk assessment. These developments include approaches for (1) estimating the emissions and concentration levels of volatile pollutants during the use of contaminated water, (2) estimating dermal absorption of organic chemicals from the soil matrix, and (3) estimating steady-state, near-field contaminant concentrations in the aquifer within a waste boundary.

  12. Applying predictive analytics to develop an intelligent risk detection application for healthcare contexts.

    PubMed

    Moghimi, Fatemeh Hoda; Cheung, Michael; Wickramasinghe, Nilmini

    2013-01-01

    Healthcare is an information-rich industry where successful outcomes require the processing of multi-spectral data and sound decision making. The exponential growth of data and big data issues, coupled with rapidly increasing service demands in healthcare contexts today, requires a robust framework enabled by IT (information technology) solutions as well as real-time service handling in order to ensure superior decision making and successful healthcare outcomes. Such a context is appropriate for the application of real-time intelligent risk detection decision support systems using predictive analytic techniques such as data mining. To illustrate the power and potential of data science technologies in healthcare decision-making scenarios, the use of an intelligent risk detection (IRD) model is proffered for the context of Congenital Heart Disease (CHD) in children, an area which requires complex high-risk decisions that need to be made expeditiously and accurately in order to ensure successful healthcare outcomes.

  13. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on Fuzzy Mathematics and Grey Relational Theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method for a certain medical device (a C-arm X-ray machine) is described.
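
    Grey Relational Theory enters this kind of improved FMEA by scoring how close each failure mode sits to a worst-case reference sequence. A generic sketch of the grey relational grade computation; the scores, weights, and the 0.5 distinguishing coefficient are illustrative defaults, not the article's data.

    ```python
    import numpy as np

    # Hypothetical fuzzy-scored failure modes: rows = failure modes, columns =
    # risk factors (severity, occurrence, detectability), normalized to [0, 1].
    scores = np.array([[0.8, 0.4, 0.6],
                       [0.5, 0.9, 0.3],
                       [0.6, 0.6, 0.7]])

    reference = scores.max(axis=0)        # worst-case reference sequence
    delta = np.abs(scores - reference)
    zeta = 0.5                            # distinguishing coefficient (usual default)
    gamma = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

    weights = np.array([0.5, 0.3, 0.2])   # illustrative factor weights
    grade = gamma @ weights               # grey relational grade per failure mode
    print(grade.round(3))                 # higher grade = closer to worst case
    ```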

  14. Analysis and classification of the tools for assessing the risks associated with industrial machines.

    PubMed

    Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro

    2007-01-01

    To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines or with other sectors such as the military, and the nuclear and aeronautics industries, etc., were collected. These documents were in the format of published books or papers, standards, technical guides and company procedures collected throughout industry. From the collected documents, 112 documents were selected for analysis; 108 methods applied or potentially applicable for assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of the analysis of the methods and tools.

  15. Applying Risk Science and Stakeholder Engagement to Overcome Environmental Barriers to Marine and Hydrokinetic Energy Projects

    SciTech Connect

    Copping, Andrea E.; Anderson, Richard M.; Van Cleve, Frances B.

    2010-09-20

    The production of electricity from the moving waters of the ocean has the potential to be a viable addition to the portfolio of renewable energy sources worldwide. The marine and hydrokinetic (MHK) industry faces many hurdles, including technology development, challenges of offshore deployments, and financing; however, the barrier most commonly identified by industry, regulators, and stakeholders is the uncertainty surrounding potential environmental effects of devices placed in the water and the permitting processes associated with real or potential impacts. Regulatory processes are not well positioned to judge the severity of harm due to turbines or wave generators. Risks from MHK devices to endangered or protected animals in coastal waters and rivers, as well as the habitats that support them, are poorly understood. This uncertainty raises concerns about catastrophic interactions between spinning turbine blades or slack mooring lines and marine mammals, birds and fish. In order to accelerate the deployment of tidal and wave devices, there is a need to sort through the extensive list of potential interactions that may cause harm to marine organisms and ecosystems, to set priorities for regulatory triggers, and to direct future research. Identifying the risk of MHK technology components on specific marine organisms and ecosystem components can separate perceived from real risk-relevant interactions. Scientists from Pacific Northwest National Laboratory (PNNL) are developing an Environmental Risk Evaluation System (ERES) to assess environmental effects associated with MHK technologies and projects through a systematic analytical process, with specific input from key stakeholder groups. The array of stakeholders interested in the development of MHK is broad, segmenting into those whose involvement is essential for the success of the MHK project, those that are influential, and those that are interested. PNNL and their partners have engaged these groups, gaining

  16. Research in progress in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem-solving capabilities in science and engineering, particularly in aeronautics and space.

  17. Applied behavior analysis: understanding and changing behavior in the community-a representative review.

    PubMed

    Luyben, Paul D

    2009-01-01

    Applied behavior analysis, a psychological discipline, has been characterized as the science of behavior change (Chance, 2006). Research in applied behavior analysis has been published for approximately 40 years since the initial publication of the Journal of Applied Behavior Analysis in 1968. The field now encompasses a wide range of human behavior. Although much of the published research centers on problem behaviors that occur in schools and among people with disabilities, a substantial body of knowledge has emerged in community settings. This article provides a review of the behavioral community research published in the Journal of Applied Behavior Analysis as representative of this work, including research in the areas of home and family, health, safety, community involvement and the environment, recreation and sports, crime and delinquency, and organizations. In the interest of space, research in schools and with people with disabilities has been excluded from this review.

  18. Elusive Critical Elements of Transformative Risk Assessment Practice and Interpretation: Is Alternatives Analysis the Next Step?

    PubMed

    Francis, Royce A

    2015-11-01

    This article argues that "game-changing" approaches to risk analysis must focus on "democratizing" risk analysis in the same way that information technologies have democratized access to, and production of, knowledge. This argument is motivated by the author's reading of Goble and Bier's analysis, "Risk Assessment Can Be a Game-Changing Information Technology-But Too Often It Isn't" (Risk Analysis, 2013; 33: 1942-1951), in which living risk assessments are shown to be "game changing" in probabilistic risk analysis. In this author's opinion, Goble and Bier's article focuses on living risk assessment's potential for transforming risk analysis from the perspective of risk professionals-yet, the game-changing nature of information technologies has typically achieved a much broader reach. Specifically, information technologies change who has access to, and who can produce, information. From this perspective, the author argues that risk assessment is not a game-changing technology in the same way as the printing press or the Internet because transformative information technologies reduce the cost of production of, and access to, privileged knowledge bases. The author argues that risk analysis does not reduce these costs. The author applies Goble and Bier's metaphor to the chemical risk analysis context, and in doing so proposes key features that transformative risk analysis technology should possess. The author also discusses the challenges and opportunities facing risk analysis in this context. These key features include: clarity in information structure and problem representation, economical information dissemination, increased transparency to nonspecialists, democratized manufacture and transmission of knowledge, and democratic ownership, control, and interpretation of knowledge. The chemical safety decision-making context illustrates the impact of changing the way information is produced and accessed in the risk context. Ultimately, the author concludes that although

  19. A Bayesian approach to landscape ecological risk assessment applied to the upper Grande Ronde watershed, Oregon

    Treesearch

    Kimberley K. Ayre; Wayne G. Landis

    2012-01-01

    We present a Bayesian network model based on the ecological risk assessment framework to evaluate potential impacts to habitats and resources resulting from wildfire, grazing, forest management activities, and insect outbreaks in a forested landscape in northeastern Oregon. The Bayesian network structure consisted of three tiers of nodes: landscape disturbances,...
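
    The record is truncated before it details the node tiers; purely to make the tiered structure concrete, here is a toy enumeration over an invented network with two disturbance nodes, one habitat node, and one risk node. Every probability below is made up for illustration and bears no relation to the upper Grande Ronde model.

        # Toy three-tier Bayesian network: disturbance -> habitat -> risk.
        from itertools import product

        p_fire = {True: 0.2, False: 0.8}        # wildfire occurs?
        p_grazing = {True: 0.5, False: 0.5}     # heavy grazing occurs?

        # P(habitat degraded | fire, grazing) -- illustrative CPT
        p_degraded = {(True, True): 0.9, (True, False): 0.6,
                      (False, True): 0.4, (False, False): 0.1}

        p_risk_high = {True: 0.7, False: 0.15}  # P(high risk | degraded?)

        # Marginalize by full enumeration over the parent states.
        p_high = 0.0
        for fire, graze in product([True, False], repeat=2):
            p_d = p_degraded[(fire, graze)]
            for degraded, p in ((True, p_d), (False, 1 - p_d)):
                p_high += p_fire[fire] * p_grazing[graze] * p * p_risk_high[degraded]

        print(f"P(high risk) = {p_high:.3f}")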

  20. Invitational Theory and Practice Applied to Resiliency Development in At-Risk Youth

    ERIC Educational Resources Information Center

    Lee, R. Scott

    2012-01-01

    Resilience development is a growing field of study within the scholarly literature regarding social emotional achievement of at-risk students. Developing resiliency is based on the assumption that positive, pro-social, and/or strength-based values inherent in children and youth should be actively and intentionally developed. The core values of…

  1. At-Risk Students and Virtual Enterprise: Tourism and Hospitality Simulations in Applied and Academic Learning.

    ERIC Educational Resources Information Center

    Borgese, Anthony

    This paper discusses Virtual Enterprise (VE), a technology-driven business simulation program in which students conceive, create, and operate enterprises that utilize Web-based and other technologies to trade products and services around the world. The study examined the effects of VE on a learning community of at-risk students, defined as those…

  2. Applying Social Norms Theory within Affiliation Groups: Promising Interventions for High-Risk Drinking

    ERIC Educational Resources Information Center

    Bruce, Susan; Keller, Adrienne E.

    2007-01-01

    On college campuses across the country, high-risk drinking and the associated negative consequences have become a national concern. As colleges strive to find appropriate and effective approaches to deal with this issue, social norms theory provides a coherent framework for interventions that are relevant and positive. Small Group Social Norms…

  3. INTERPRETATION OF SPLP RESULTS FOR ASSESSING RISK TO GROUNDWATER FROM LAND-APPLIED GRANULAR WASTE

    EPA Science Inventory

    Scientists and engineers often rely on results from the synthetic precipitation leaching procedure (SPLP) to assess the risk of groundwater contamination posed by the land application of granular solid wastes. The concentrations of pollutants in SPLP leachate can be measured and ...

  4. Human health risk assessment of triclosan in land-applied biosolids.

    PubMed

    Verslycke, Tim; Mayfield, David B; Tabony, Jade A; Capdevielle, Marie; Slezak, Brian

    2016-09-01

    Triclosan (5-chloro-2-[2,4-dichlorophenoxy]-phenol) is an antimicrobial agent found in a variety of pharmaceutical and personal care products. Numerous studies have examined the occurrence and environmental fate of triclosan in wastewater, biosolids, biosolids-amended soils, and plants and organisms exposed to biosolid-amended soils. Triclosan has a propensity to adhere to organic carbon in biosolids and biosolid-amended soils. Land application of biosolids containing triclosan has the potential to contribute to multiple direct and indirect human health exposure pathways. To estimate exposures and human health risks from biosolid-borne triclosan, a risk assessment was conducted in general accordance with the methodology incorporated into the US Environmental Protection Agency's Part 503 biosolids rule. Human health exposures to biosolid-borne triclosan were estimated on the basis of published empirical data or modeled using upper-end environmental partitioning estimates. Similarly, a range of published triclosan human health toxicity values was evaluated. Margins of safety were estimated for 10 direct and indirect exposure pathways, both individually and combined. The present risk assessment found large margins of safety (>1000 to >100 000) for potential exposures to all pathways, even under the most conservative exposure and toxicity assumptions considered. The human health exposures and risks from biosolid-borne triclosan are concluded to be de minimis. Environ Toxicol Chem 2016;35:2358-2367. © 2016 SETAC.
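
    The margins of safety quoted above are ratios of a toxicity reference value to an exposure estimate, with combined pathways using the summed dose. The arithmetic, with invented numbers standing in for the assessment's actual values:

        # Margin-of-safety arithmetic of the kind reported above; every number
        # below is invented for illustration, not taken from the assessment.
        reference_dose = 0.3            # mg/kg-day, hypothetical toxicity value
        pathway_dose = {                # mg/kg-day, hypothetical upper-end exposures
            "soil ingestion (child)": 2.0e-5,
            "groundwater ingestion":  5.0e-6,
            "dietary plant uptake":   1.0e-5,
        }
        for pathway, dose in pathway_dose.items():
            print(f"{pathway}: MOS = {reference_dose / dose:,.0f}")
        print(f"all pathways combined: "
              f"MOS = {reference_dose / sum(pathway_dose.values()):,.0f}")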

  5. APPLYING A COGNITIVE–BEHAVIORAL MODEL OF HIV RISK TO YOUTHS IN PSYCHIATRIC CARE

    PubMed Central

    Donenberg, Geri R.; Schwartz, Rebecca Moss; Emerson, Erin; Wilson, Helen W.; Bryant, Fred B.; Coleman, Gloria

    2005-01-01

    This study examined the utility of cognitive and behavioral constructs (AIDS information, motivation, and behavioral skills) in explaining sexual risk taking among 172 ethnically diverse urban youths, aged 12 to 20, in outpatient psychiatric care. Structural equation modeling revealed only moderate support for the model, explaining low to moderate levels of variance in global sexual risk taking. The amount of explained variance improved when age was included as a predictor in the model. Findings shed light on the contribution of AIDS information, motivation, and behavioral skills to risky sexual behavior among teens receiving outpatient psychiatric care. Results suggest that cognitive and behavioral factors alone may not explain sexual risk taking among teens whose cognitive and emotional deficits (e.g., impaired judgment, poor reality testing, affect dysregulation) interfere with HIV-preventive behavior. The most powerful explanatory model will likely include a combination of cognitive, behavioral, developmental, social (e.g., family), and personal (e.g., psychopathology) risk mechanisms. PMID:16006207

  7. Applying the Triad method in a risk assessment of a former surface treatment and metal industry site.

    PubMed

    Ribé, Veronica; Aulenius, Elisabet; Nehrenheim, Emma; Martell, Ulrika; Odlare, Monica

    2012-03-15

    With a greater focus on soil protection in the E.U., the need for ecological risk assessment tools for cost-effective characterization of site contamination is increasing. One of the challenges in assessing the risk of soil contaminants is to accurately account for changes in the mobility of contaminants over time as a result of ageing. Improved tools for measuring the bioavailable and mobile fraction of contaminants are therefore highly desirable. In this study the Triad method was used to perform a risk characterization of a former surface treatment and metal industry site in Eskilstuna, Sweden. The risk assessment confirmed the environmental risk of the most heavily contaminated sample and showed that the toxic effect was most likely caused by high metal concentrations. The assessment of the two soil samples with low to moderate metal contamination levels was more complex, as there was greater deviation between the results from the three lines of evidence (LoE): chemistry, (eco)toxicology, and ecology. For the less contaminated of these two samples, weighting the results from the ecotoxicological LoE would be recommended in order to accurately determine the risk of metal contamination at the sampling site, as the toxic effect detected in the Microtox® test and Ostracodtoxkit™ test was more likely due to oil contamination. The soil sample with higher total metal concentrations requires further ecotoxicological testing, as the integrated risk value indicated an environmental risk from metal contamination. The applied methodology, the Triad method, is considered appropriate for conducting improved environmental risk assessments in order to achieve sustainable remediation processes. Copyright © 2011 Elsevier B.V. All rights reserved.
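
    The integration step of the Triad approach, in which scaled results from the three lines of evidence are combined and their disagreement flagged, can be sketched as below; the risk values, equal weights, and deviation threshold are illustrative assumptions, not values from the study.

        # Sketch of Triad-style integration of three lines of evidence (LoE).
        # Each LoE yields a scaled risk value in [0, 1]; numbers are invented.
        import statistics

        loe = {"chemistry": 0.62, "ecotoxicology": 0.35, "ecology": 0.18}
        weights = {"chemistry": 1.0, "ecotoxicology": 1.0, "ecology": 1.0}

        total_w = sum(weights.values())
        integrated = sum(loe[k] * weights[k] for k in loe) / total_w
        deviation = statistics.stdev(loe.values())

        print(f"integrated risk = {integrated:.2f}")
        if deviation > 0.2:   # arbitrary threshold for "LoE disagree"
            print(f"high deviation ({deviation:.2f}): refine before concluding")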

  8. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Excerpt from 49 CFR Part 260 (Financial Assistance), § 260.17, Credit risk premium analysis (Title 49, Transportation, Vol. 4, revised as of 2011-10-01). Recoverable fragments of the section text: "(a) When Federal appropriations are not..."; evaluation factors including "(D) Capital expenditures; and (E) Operating efficiency"; and "(ii) Financial risk, based on Applicant's..."

  9. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Hanson, J. M.; Beard, B. B.

    2010-01-01

    This Technical Publication (TP) addresses a number of topics related to the application of Monte Carlo simulation to launch vehicle design and requirements analysis. Although the focus is on a launch vehicle application, the methods may be applied to other complex systems as well. The TP is organized so that all the important topics are covered in the main text, with detailed derivations in the appendices. The TP first introduces Monte Carlo simulation and the major topics to be discussed, including the input distributions for Monte Carlo runs, testing the simulation, how many runs are necessary to verify requirements, what to do when results are desired for events that happen only rarely, and postprocessing, which covers analyzing any failed runs, examples of useful output products, and statistical information for generating desired results from the output data. Topics in the appendices include tables for requirements verification, derivation of the number of runs required and generation of output probabilistic data with consumer risk included, derivation of launch vehicle models that include possible variations of assembled vehicles, minimization of a consumable to achieve a two-dimensional statistical result, recontact probability during staging, ensuring duplicated Monte Carlo random variations, and importance sampling.
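
    The TP's own derivations live in its appendices; as a generic taste of the "how many runs" question, the standard zero-failure binomial bound below shows the flavor. This is a textbook result stated under the assumption that every run succeeds, not necessarily the derivation used in the TP.

        # Zero-failure binomial bound for "how many Monte Carlo runs": if all
        # n runs succeed, P(success) >= p can be claimed with confidence
        # 1 - alpha provided p**n <= alpha.
        import math

        def runs_required(p, alpha):
            """Smallest n such that p**n <= alpha when every run succeeds."""
            return math.ceil(math.log(alpha) / math.log(p))

        print(runs_required(0.997, 0.05))  # 998 runs: 99.7% reliability, 95% confidence
        print(runs_required(0.99, 0.10))   # 230 runs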

  10. Applying a sociolinguistic model to the analysis of informed consent documents.

    PubMed

    Granero-Molina, José; Fernández-Sola, Cayetano; Aguilera-Manrique, Gabriel

    2009-11-01

    Information on the risks and benefits of surgical procedures is essential for patients in order to give their informed consent. Some disciplines, such as sociolinguistics, offer insights that are helpful for patient-professional communication in both written and oral consent. Communication difficulties become more acute when patients make decisions through an informed consent document, because they may sign it with a lack of understanding and information, and consequently feel deprived of their freedom to make their choice about different treatments or surgery. This article discusses findings from a documentary analysis using the sociolinguistic SPEAKING model, which was applied to the general and specific informed consent documents required for laparoscopic surgery of the bile duct at Torrecárdenas Hospital, Almería, Spain. The objective was to identify flaws in how information was provided, together with its readability, its voluntary basis, and patients' consent. The results suggest potential linguistic communication difficulties: different languages being used, cultural clashes, asymmetry of communication between professionals and patients, assignment of rights on the part of patients, and overprotection of professionals and institutions.

  11. Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids (Final Report)

    EPA Science Inventory

    Millions of tons of treated sewage sludges or “biosolids” are applied annually to farms, forests, rangelands, mine lands and other types of land in the United States. Biosolids are defined by the U.S. Environmental Protection Agency (EPA) as “the primarily organic solid product ...

  13. Concentration of Risk Model (CORM) Verification and Analysis

    DTIC Science & Technology

    2014-06-15

    TRAC-M-TR-14-023, 15 June 2014: "Concentration of Risk Model (CORM) Verification and Analysis," Edward M. Masotti and Sam Buttrey, TRADOC Analysis Center - Monterey, 700 Dyer Road, Monterey. Recoverable abstract fragment: "...Mental Health and using data from a repository at the University of Michigan, had attempted to identify soldiers at higher-than-average risk of suicide..."

  14. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    SciTech Connect

    Kolotilina, L.; Nikishin, A.; Yeremin, A.

    1994-12-31

    The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial-class finite element analysis of structures. The construction and application of high quality preconditioners constitute a high percentage of the total solution time. Parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are implicit preconditioners and explicit preconditioners. Implicit preconditioners (e.g., incomplete factorizations of several types) are generally of high quality but require the solution of lower and upper triangular systems of equations in each iteration, which is difficult to parallelize without deteriorating the convergence rate. Explicit preconditioners (e.g., polynomial or Jacobi-like preconditioners) require only sparse matrix-vector multiplications and can be parallelized, but their preconditioning quality is less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that by construction possess a large resource of parallelism without increasing the serial complexity.
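
    For flavor, here is a small serial sketch of the textbook FSAI construction the abstract builds on, not the authors' implementation: each row of the approximate inverse factor comes from an independent small dense solve (the source of the parallelism), and the result preconditions conjugate gradients. The test problem and all parameters are assumptions for illustration.

        # Minimal FSAI sketch: build lower-triangular G with the sparsity of
        # tril(A) such that G A G^T ~ I, then precondition CG with M^{-1} = G^T G.
        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        def fsai(A):
            n = A.shape[0]
            lower = sp.tril(A).tocsr()
            rows, cols, vals = [], [], []
            for i in range(n):
                J = lower.indices[lower.indptr[i]:lower.indptr[i + 1]]  # j <= i
                Aj = A[J][:, J].toarray()           # small local dense block
                e = np.zeros(len(J)); e[-1] = 1.0   # i is the last (sorted) index
                y = np.linalg.solve(Aj, e)          # independent per-row solve
                g = y / np.sqrt(y[-1])              # normalize: (G A G^T)_ii = 1
                rows += [i] * len(J); cols += list(J); vals += list(g)
            return sp.csr_matrix((vals, (rows, cols)), shape=(n, n))

        # Tiny SPD test problem: 1D Laplacian.
        n = 200
        A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
        G = fsai(A)
        M = spla.LinearOperator((n, n), matvec=lambda x: G.T @ (G @ x))
        x, info = spla.cg(A, np.ones(n), M=M)
        print("CG converged" if info == 0 else f"CG info = {info}")

    Because every row of G is computed independently, the construction parallelizes naturally, which is the property the abstract emphasizes.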

  15. Developing scenarios to assess future landslide risks: a model-based approach applied to mountainous regions

    NASA Astrophysics Data System (ADS)

    Vacquie, Laure; Houet, Thomas

    2016-04-01

    In the last century, European mountain landscapes have experienced significant transformations. Natural and anthropogenic changes, climate change, touristic and industrial development, socio-economic interactions, and their implications in terms of land use and land cover changes (LUCC) have directly influenced the spatial organization and vulnerability of mountain landscapes. This study is conducted as part of the SAMCO project funded by the French National Research Agency (ANR). It aims at developing a methodological approach, combining various tools, modelling platforms, and methods, to identify regions vulnerable to landslide hazards while accounting for future LUCC. It presents an integrated approach combining participative scenarios and a LUCC simulation model to assess the combined effects of LUCC and climate change on landslide risks in the Cauterets valley (French Pyrenees) up to 2100. Through vulnerability and risk mapping, the objective is to gather information to support landscape planning and to implement land use strategies with local stakeholders for risk management. Four contrasting scenarios are developed, exhibiting contrasting trajectories of socio-economic development. Prospective scenarios are based on national and international socio-economic contexts, relying on existing assessment reports. The methodological approach integrates knowledge from local stakeholders to refine each scenario during its construction and to reinforce its plausibility and relevance by accounting for local specificities, e.g. logging and pastoral activities, touristic development, urban planning, etc. A process-based model, the Forecasting Scenarios for Mountains (ForeSceM) model, developed on the Dinamica Ego modelling platform, is used to spatially allocate future LUCC for each prospective scenario. Concurrently, a spatial decision support tool, the SYLVACCESS model, is used to identify accessible areas for forestry in scenario projecting logging

  16. Overcoming barriers to integrating economic analysis into risk assessment.

    PubMed

    Hoffmann, Sandra

    2011-09-01

    Regulatory risk analysis is designed to provide decisionmakers with a clearer understanding of how policies are likely to affect risk. The systems that produce risk are biological, physical, social, and economic. As a result, risk analysis is an inherently interdisciplinary task. Yet in practice, risk analysis has been interdisciplinary in only limited ways. Risk analysis could provide more accurate assessments of risk if economics and the other social sciences were better integrated into risk assessment itself. This essay examines how discussions about risk analysis policy have influenced the roles of various disciplines in risk analysis. It explores ways in which integrated bio/physical-economic modeling could contribute to more accurate assessments of risk. It reviews examples of the kind of integrated economic-bio/physical modeling that could be used to enhance risk assessment. The essay ends with a discussion of institutional barriers to greater integration of economic modeling into risk assessment and provides suggestions on how these might be overcome.

  17. Environmental Risk Assessment of antimicrobials applied in veterinary medicine-A field study and laboratory approach.

    PubMed

    Slana, Marko; Dolenc, Marija Sollner

    2013-01-01

    The fate and environmental risks of antimicrobial compounds from different groups of veterinary medicinal products (VMPs) have been compared. The aim was to demonstrate a correlation between the physical and chemical properties of active compounds and their metabolism in target animals, as well as their fate in the environment. In addition, the importance of manure management techniques and agricultural practice, and their influence on the fate of active compounds, is discussed. The selected active compounds are shown to be susceptible to at least one environmental factor (sunlight, water, bacterial or fungal degradation) to which they are exposed during their life cycle and which contributes to their degradation. Degradation under a range of environmental factors also has to be considered as authentic information complementing that observed under the limited conditions of laboratory studies and in Environmental Risk Assessment calculations. Copyright © 2012 Elsevier B.V. All rights reserved.
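
    The Environmental Risk Assessment calculations mentioned here are not spelled out in the record; in standard guideline schemes for veterinary medicines (e.g., VICH/EMA), they reduce to risk quotients of predicted environmental concentration over predicted no-effect concentration, with degradation during manure storage lowering the PEC. A sketch with invented numbers:

        # Risk-quotient arithmetic: RQ = PEC / PNEC. All values are invented.
        import math

        pec_soil = 0.042      # mg/kg, hypothetical PEC after manure spreading
        pnec_soil = 0.20      # mg/kg, hypothetical PNEC from ecotox endpoints

        rq = pec_soil / pnec_soil
        verdict = "further testing needed" if rq >= 1 else "acceptable"
        print(f"RQ = {rq:.2f} -> {verdict}")

        # Degradation before spreading lowers the PEC: first-order decay in
        # manure storage with half-life t_half (days), stored for t days.
        t_half, t = 30.0, 90.0
        pec_stored = pec_soil * math.exp(-math.log(2) * t / t_half)
        print(f"PEC after {t:.0f} d storage = {pec_stored:.4f} mg/kg")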

  18. Locating and applying sociological theories of risk-taking to develop public health interventions for adolescents.

    PubMed

    Pound, Pandora; Campbell, Rona

    2015-01-02

    Sociological theories seldom inform public health interventions at the community level. The reasons for this are unclear but may include difficulties in finding, understanding or operationalising theories. We conducted a study to explore the feasibility of locating sociological theories within a specific field of public health, adolescent risk-taking, and to consider their potential for practical application. We identified a range of sociological theories. These explained risk-taking: (i) as being due to lack of social integration; (ii) as a consequence of isolation from mainstream society; (iii) as a rite of passage; (iv) as a response to social constraints; (v) as resistance; (vi) as an aspect of adolescent development; (vii) by the theory of the 'habitus'; (viii) by situated rationality and social action theories; and (ix) as social practice. We consider these theories in terms of their potential to inform public health interventions for young people.

  20. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    SciTech Connect

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, "Nuclear Safety Management," Subpart B, "Safety Basis Requirements." Consistent with DOE-STD-3009-94, Change Notice 2, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses" (STD-3009), and DOE-STD-3011-2002, "Guidance for Preparation of Basis for Interim Operation (BIO) Documents" (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, "Integration of Environment, Safety, and Health into Facility Disposition Activities" (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites are also subject to the hazard analysis methodologies of this standard.