An Integrated Approach to Risk Assessment for Concurrent Design
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Voss, Luke; Feather, Martin; Cornford, Steve
2005-01-01
This paper describes an approach to risk assessment and analysis suited to the early-phase, concurrent design of a space mission. The approach integrates an agile, multi-user risk collection tool, a more in-depth risk analysis tool, and repositories of risk information. A JPL-developed tool named RAP is used for collecting expert opinions about risk from designers involved in the concurrent design of a space mission. Another in-house risk assessment tool, named DDP, is used for the analysis.
Risk analysis of computer system designs
NASA Technical Reports Server (NTRS)
Vallone, A.
1981-01-01
Adverse events during implementation can affect the final capabilities, schedule, and cost of a computer system even when the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of such events and to request design revisions or contingency plans in time, before committing to a decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to express the analysis results. The procedure is consistent with system design evaluation and enables a meaningful comparison among alternative designs.
An Example of Risk Informed Design
NASA Technical Reports Server (NTRS)
Banke, Rick; Grant, Warren; Wilson, Paul
2014-01-01
NASA Engineering requested a Probabilistic Risk Assessment (PRA) to compare the difference in the risk of Loss of Crew (LOC) and Loss of Mission (LOM) between different designs of a fluid assembly. They were concerned that the configuration favored by the design team was more susceptible to leakage than a second proposed design, but realized that a quantitative analysis comparing the risks of the two designs might strengthen their argument. The analysis showed that while the second design did improve the probability of LOC, it did not help from a probability-of-LOM perspective. This drove the analysis team to propose a minor design change that would drive the probability of LOM down considerably. The analysis also demonstrated that there was another major risk driver that was not immediately obvious from a typical engineering study of the design and was therefore unexpected; none of the proposed alternatives addressed this risk. This type of trade study demonstrates the importance of performing a PRA in order to completely understand a system's design. It allows managers to treat risk as another commodity (e.g., mass, cost, schedule, fault tolerance) that can be traded early in the design of a new system.
A method for scenario-based risk assessment for robust aerospace systems
NASA Astrophysics Data System (ADS)
Thomas, Victoria Katherine
In years past, aircraft conceptual design centered on creating a feasible aircraft that could be built and could fly the required missions. More recently, aircraft viability has entered conceptual design as well, so that the product's potential to be profitable can also be examined early in the design process. While examining an aerospace system's feasibility and viability early in the design process is extremely important, it is also important to examine system risk. In traditional aerospace systems risk analysis, risk is examined from the perspectives of performance, schedule, and cost. Recently, safety and reliability analyses have been brought forward in the design process to be examined during late conceptual and early preliminary design. While these analyses work as designed, existing risk analysis methods and techniques are not designed to examine an aerospace system's external operating environment and the risks present there. A new method has been developed here to examine, early in concept design, the risk associated with not meeting assumptions about the system's external operating environment. The risks are examined in five categories: employment, culture, government and politics, economics, and technology. The risks are examined over a long time period, up to the system's entire life cycle. The method consists of eight steps over three focus areas. The first focus area is Problem Setup, in which the problem is defined and understood to the best of the decision maker's ability. There are four steps in this area, in order: Establish the Need, Scenario Development, Identify Solution Alternatives, and Uncertainty and Risk Identification, with significant iteration among steps two through four. Focus area two is Modeling and Simulation. In this area the solution alternatives and risks are modeled, and a numerical value for risk is calculated. A risk mitigation model is also created.
The four steps involved in completing the modeling and simulation are Alternative Solution Modeling, Uncertainty Quantification, Risk Assessment, and Risk Mitigation. Focus area three is Decision Support. In this area a decision support interface is created that allows for game playing between solution alternatives and risk mitigation, and a multi-attribute decision making process is implemented to aid decision making. A demonstration problem inspired by Airbus' mid-1980s decision to break into the wide-body, long-range market was developed to illustrate the use of this method. The results showed that the method captures types of risk that previous analysis methods do not, particularly at the early stages of aircraft design. It was also shown that the method can be used to help create a system that is robust to external environmental factors. Adding an external-environment risk analysis in the early stages of conceptual design adds another dimension to the analysis of feasibility and viability. The ability to take risk into account during the early stages of the design process allows for the elimination of alternatives that are potentially feasible and viable but too risky. Using a scenario-based analysis instead of a traditional probabilistic analysis enabled uncertainty to be effectively bounded and examined over a variety of potential futures instead of only a single future. There is also potential for a product to be groomed for a specific future that one believes is likely to happen, or to be steered during design as the future unfolds.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szilard, Ronaldo Henriques
A Risk Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core and fuel design and safety analysis, including postulated Loss-of-Coolant Accident (LOCA) analysis. This toolkit, under an integrated evaluation model framework, is named the LOCA Toolkit for the US (LOTUS). The demonstration includes coupled analysis of core design, fuel design, thermal hydraulics, and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results.
An Analysis of Risk and Function Information in Early Stage Design
NASA Technical Reports Server (NTRS)
Barrientos, Francesca; Tumer, Irem; Grantham, Katie; VanWie, Michael; Stone, Robert
2005-01-01
The concept of function offers a high potential for thinking and reasoning about designs as well as providing a common thread for relating together other design information. This paper focuses specifically on the relation between function and risk by examining how this information is addressed by a design team conducting early stage design for space missions. Risk information is decomposed into a set of key attributes which are then used to scrutinize the risk information using three approaches from the pragmatics sub-field of linguistics: i) Gricean analysis, ii) Relevance Theory, and iii) Functional Analysis. Results of this linguistics-based approach descriptively account for the context of designer communication with respect to function and risk, and offer prescriptive guidelines for improving designer communication.
NASA Astrophysics Data System (ADS)
Qi, Wei
2017-11-01
Cost-benefit analysis is commonly used for engineering planning and design problems in practice. However, previous cost-benefit based design flood estimation relies on a stationarity assumption. This study develops a non-stationary cost-benefit based design flood estimation approach. The approach integrates a non-stationary probability distribution function into cost-benefit analysis, so that the influence of non-stationarity on expected total cost (including flood damage and construction costs) and on design flood estimation can be quantified. To facilitate design flood selection, a 'Risk-Cost' analysis approach is developed, which reveals the nexus of extreme flood risk, expected total cost, and design life periods. Two basins, with 54 years and 104 years of flood data respectively, are used to illustrate the application. The developed approach effectively reveals changes in expected total cost and extreme floods over different design life periods. In addition, trade-offs are found between extreme flood risk and expected total cost, which reflect increases in cost to mitigate risk. Compared with stationary approaches, which generate only one expected total cost curve and therefore only one design flood estimate, the proposed approach generates design flood estimation intervals, and the 'Risk-Cost' approach selects a design flood value from these intervals based on the trade-offs between extreme flood risk and expected total cost. This study provides a new approach towards a better understanding of the influence of non-stationarity on expected total cost and design floods, and could benefit cost-benefit based non-stationary design flood estimation across the world.
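The trade-off the abstract describes, construction cost rising with the design flood while expected damage falls, can be sketched numerically. The toy script below (a Gumbel flood model with a drifting location parameter, plus invented damage and construction costs; none of it is taken from the paper) selects the design flood that minimizes expected total cost over a 50-year design life, with and without non-stationarity:

```python
import math

def gumbel_exceed(q, loc, scale):
    """Annual probability that the maximum flood exceeds discharge q."""
    return 1.0 - math.exp(-math.exp(-(q - loc) / scale))

def expected_total_cost(q_design, years, trend=0.2, loc0=100.0, scale=20.0,
                        unit_build_cost=0.05, damage=500.0):
    """Construction cost plus expected flood damage over the design life.
    Non-stationarity enters through a location parameter drifting by
    `trend` per year; trend=0 recovers the stationary case."""
    exp_damage = sum(damage * gumbel_exceed(q_design, loc0 + trend * t, scale)
                     for t in range(years))
    return unit_build_cost * q_design + exp_damage

# Pick the cost-minimizing design flood from a set of candidates.
candidates = range(100, 401, 10)
best = min(candidates, key=lambda q: expected_total_cost(q, 50))
stationary_best = min(candidates, key=lambda q: expected_total_cost(q, 50, trend=0.0))
print(best, stationary_best)
```

In this toy setting the non-stationary optimum is at least as large as the stationary one, mirroring the paper's point that ignoring non-stationarity can under-size the design flood.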
USDA-ARS?s Scientific Manuscript database
Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...
Conceptual Launch Vehicle and Spacecraft Design for Risk Assessment
NASA Technical Reports Server (NTRS)
Motiwala, Samira A.; Mathias, Donovan L.; Mattenberger, Christopher J.
2014-01-01
One of the most challenging aspects of developing human space launch and exploration systems is minimizing and mitigating the many potential risk factors to ensure the safest possible design while also meeting the required cost, weight, and performance criteria. In order to accomplish this, effective risk analyses and trade studies are needed to identify key risk drivers, dependencies, and sensitivities as the design evolves. The Engineering Risk Assessment (ERA) team at NASA Ames Research Center (ARC) develops advanced risk analysis approaches, models, and tools to provide such meaningful risk and reliability data throughout vehicle development. The goal of the project presented in this memorandum is to design a generic launch vehicle and spacecraft architecture that can be used to develop and demonstrate these new risk analysis techniques without relying on other proprietary or sensitive vehicle designs. To accomplish this, initial spacecraft and launch vehicle (LV) designs were established using historical sizing relationships for a mission delivering four crewmembers and equipment to the International Space Station (ISS). Mass-estimating relationships (MERs) were used to size the crew capsule and launch vehicle, and a combination of optimization techniques and iterative design processes were employed to determine a possible two-stage-to-orbit (TSTO) launch trajectory into a 350-kilometer orbit. Primary subsystems were also designed for the crewed capsule architecture, based on a 24-hour on-orbit mission with a 7-day contingency. Safety analysis was also performed to identify major risks to crew survivability and assess the system's overall reliability. These procedures and analyses confirm that the architecture's basic design and performance are reasonable for use in risk trade studies.
While the vehicle designs presented are not intended to represent a viable architecture, they will provide a valuable initial platform for developing and demonstrating innovative risk assessment capabilities.
Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W
2016-04-01
Qualitative risk assessment methods are often used as the first step in determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid in evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated, and allocates resources in a manner that manages risk to an acceptable level.
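The quantitative step, estimating a probability of failure across the design space by Monte Carlo sampling, can be illustrated with a minimal sketch. The process model, parameter names, distributions, and specification limit below are invented for illustration; they are not the paper's formulation:

```python
import random

random.seed(1)

def response(temp, speed):
    """Hypothetical linear process model linking two critical process
    parameters to a quality attribute (coefficients are invented)."""
    return 60.0 + 0.8 * (temp - 30.0) + 0.05 * (speed - 100.0)

def prob_of_failure(n=100_000, spec_limit=75.0,
                    temp_mu=50.0, temp_sd=2.0, speed_mu=120.0, speed_sd=10.0):
    """Monte Carlo estimate of P(response < spec_limit) when the set
    points are disturbed by random variation."""
    failures = sum(
        1 for _ in range(n)
        if response(random.gauss(temp_mu, temp_sd),
                    random.gauss(speed_mu, speed_sd)) < spec_limit
    )
    return failures / n

p_fail = prob_of_failure()
print(f"estimated probability of failure: {p_fail:.3f}")
```

Sweeping the set points (`temp_mu`, `speed_mu`) over a grid and contouring `p_fail` is one simple way to draw a probability-based design space boundary of the kind the abstract describes.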
Teaching Risk Analysis in an Aircraft Gas Turbine Engine Design Capstone Course
2016-01-01
...development costs, engine production costs, and scheduling (Byerley A. R., 2013) as well as the linkage between turbine inlet temperature, blade cooling... ...analysis SE majors have studied and how this is linked to the specific issues they must face in aircraft gas turbine engine design. Aeronautical and
Applying machine learning to pattern analysis for automated in-design layout optimization
NASA Astrophysics Data System (ADS)
Cain, Jason P.; Fakhry, Moutaz; Pathak, Piyush; Sweis, Jason; Gennari, Frank; Lai, Ya-Chieh
2018-04-01
Building on previous work on cataloging unique topological patterns in an integrated circuit physical design, a new process is defined in which a risk scoring methodology is used to rank patterns by manufacturing risk. Patterns with high risk are then mapped to functionally equivalent patterns with lower risk, and the higher-risk patterns are replaced in the design with their lower-risk equivalents. Pattern selection and replacement are fully automated and suitable for full-chip designs. Results from 14nm product designs show that the approach can identify and replace risk patterns with a quantifiable positive impact on the post-replacement risk score distribution.
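In miniature, the replacement flow reduces to scoring patterns and swapping any high-risk pattern for a functionally equivalent lower-risk one. A toy sketch (pattern names, scores, and the equivalence table are all invented):

```python
# Toy version of the flow: score patterns, then swap any pattern whose
# risk exceeds a threshold for a functionally equivalent lower-risk one.
risk_score = {"P1": 0.92, "P2": 0.15, "P3": 0.78, "P4": 0.10}
equivalents = {"P1": "P4", "P3": "P2"}  # high-risk -> lower-risk equivalent

def optimize(layout, threshold=0.5):
    """Replace high-risk patterns with lower-risk equivalents when a
    replacement exists and actually lowers the score."""
    out = []
    for p in layout:
        q = equivalents.get(p)
        if risk_score[p] > threshold and q and risk_score[q] < risk_score[p]:
            out.append(q)
        else:
            out.append(p)
    return out

layout = ["P1", "P2", "P3"]
print(optimize(layout))  # → ['P4', 'P2', 'P2']
```

The production flow described in the abstract additionally learns the risk scores from manufacturing data and verifies functional equivalence of the replacements; both are abstracted away here.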
Towards Risk Based Design for NASA's Missions
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; Barrientos, Francesca; Meshkat, Leila
2004-01-01
This paper describes the concept of Risk Based Design in the context of NASA's low-volume, high-cost missions. The concept of accounting for risk in the design lifecycle has been discussed and proposed under several research topics, including reliability, risk analysis, optimization, uncertainty, decision-based design, and robust design. This work aims to identify and develop methods that enable and automate a means to characterize and optimize risk, and to use risk as a tradeable resource for making robust and reliable decisions, in the uncertain and ambiguous stage of early conceptual design. The paper first presents a survey of related topics explored in the design research community as they relate to risk based design. Then, a summary of the topics from the NASA-led Risk Colloquium is presented, followed by current efforts within NASA to account for risk in early design. Finally, a list of "risk elements", identified for early-phase conceptual design at NASA, is presented. The purpose is to lay the foundation and develop a roadmap for future work and collaborations on research to eliminate and mitigate these risk elements in early-phase design.
Towards a systems approach to risk considerations for concurrent design
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Oberto, Robert E.
2004-01-01
This paper describes the new process used by the Project Design Center at NASA's Jet Propulsion Laboratory for the identification, assessment, and communication of risk elements throughout the lifecycle of a mission design. The process includes a software tool, 'RAP', that collects and communicates risk information between the various designers and a 'risk expert' who mediates the process. Establishing this process is a step towards the systematic consideration of risk in design decision making. Using it, we are better able to keep track of the risks associated with design decisions, and it helps us develop better risk profiles for the studies under consideration. We aim to refine and expand the current process to enable more thorough risk analysis capabilities in the future.
Developing safety performance functions incorporating reliability-based risk measures.
Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek
2011-11-01
Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design, in which reliability analysis is used to account for the uncertainty in the design parameters and to provide a risk measure of the implications of deviating from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure, the probability of non-compliance (P(nc)), in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measure. Models incorporating P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F), and property damage only (PDO) collisions.
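The limit state g = supply − demand underlying P(nc) can be illustrated with a crude Monte Carlo sketch. The paper uses FORM; plain sampling is used here only to show the idea, and every distribution below is invented:

```python
import random

random.seed(7)

def demand_ssd(v_kmh, t_pr, decel):
    """Stopping sight distance (m): perception-reaction travel plus
    braking distance, for a speed given in km/h."""
    v = v_kmh / 3.6
    return v * t_pr + v * v / (2.0 * decel)

def prob_noncompliance(supply_mu=120.0, supply_sd=10.0, n=100_000):
    """Crude Monte Carlo estimate of P(nc) = P(g < 0) for the limit
    state g = available sight distance - demanded sight distance."""
    count = 0
    for _ in range(n):
        demand = demand_ssd(v_kmh=random.gauss(90.0, 5.0),   # operating speed
                            t_pr=random.gauss(1.5, 0.3),     # reaction time, s
                            decel=random.gauss(3.4, 0.3))    # deceleration, m/s^2
        if random.gauss(supply_mu, supply_sd) < demand:
            count += 1
    return count / n

p_nc = prob_noncompliance()
print(f"P(nc) ~ {p_nc:.3f}")
```

FORM replaces this brute-force sampling with a search for the most probable failure point and a reliability index, but the limit state being evaluated is the same.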
Stingray Failure Mode, Effects and Criticality Analysis: WEC Risk Registers
Ken Rhinefrank
2016-07-25
Analysis method to systematically identify all potential failure modes and their effects on the Stingray WEC system. The analysis is incorporated early in the development cycle so that mitigation of the identified failure modes can be achieved cost-effectively and efficiently. The FMECA can begin once there is enough detail about the functions and failure modes of a given system and its interfaces with other systems. The FMECA occurs coincident with the design process and is an iterative process that allows design changes to overcome deficiencies identified in the analysis. Risk registers for the major subsystems were completed according to the methodology described in the "Failure Mode Effects and Criticality Analysis Risk Reduction Program Plan.pdf" document below, in compliance with the DOE Risk Management Framework developed by NREL.
Multi-Mission System Analysis for Planetary Entry (M-SAPE) Version 1
NASA Technical Reports Server (NTRS)
Samareh, Jamshid; Glaab, Louis; Winski, Richard G.; Maddock, Robert W.; Emmett, Anjie L.; Munk, Michelle M.; Agrawal, Parul; Sepka, Steve; Aliaga, Jose; Zarchi, Kerry;
2014-01-01
This report describes an integrated system for Multi-mission System Analysis for Planetary Entry (M-SAPE). The system in its current form is capable of performing system analysis and design for an Earth entry vehicle suitable for sample return missions. The system includes geometry, mass sizing, impact analysis, structural analysis, flight mechanics, TPS, and a web portal for user access. The report includes details of the M-SAPE modules and provides sample results. The current M-SAPE vehicle design concept is based on the Mars sample return (MSR) Earth entry vehicle design, which is driven by minimizing the risk associated with sample containment (no parachute, and passive aerodynamic stability). Because M-SAPE exploits a common design concept, any sample return mission, particularly MSR, will benefit from significant risk and development cost reductions. The design provides a platform by which technologies and design elements can be evaluated rapidly prior to any costly investment commitment.
Althuis, Michelle D; Weed, Douglas L; Frankenfeld, Cara L
2014-07-23
Assessment of design heterogeneity conducted prior to meta-analysis is infrequently reported; it is often presented post hoc to explain statistical heterogeneity. However, design heterogeneity determines the mix of included studies and how they are analyzed in a meta-analysis, which in turn can importantly influence the results. The goal of this work is to introduce ways to improve the assessment and reporting of design heterogeneity prior to statistical summarization of epidemiologic studies. In this paper, we use an assessment of sugar-sweetened beverages (SSB) and type 2 diabetes (T2D) as an example to show how a technique called 'evidence mapping' can be used to organize studies and evaluate design heterogeneity prior to meta-analysis. Employing a systematic and reproducible approach, we evaluated the following elements across 11 selected cohort studies: variation in definitions of SSB, T2D, and co-variables; design features and population characteristics associated with specific definitions of SSB; and diversity in modeling strategies. Evidence mapping strategies effectively organized complex data and clearly depicted design heterogeneity. For example, across 11 studies of SSB and T2D, 7 measured diet only once (with 7 to 16 years of disease follow-up), 5 included primarily low SSB consumers, and 3 defined the study variable (SSB) as consumption of either sugar or artificially-sweetened beverages. This exercise also identified diversity in analysis strategies, such as adjustment for 11 to 17 co-variables and a large degree of fluctuation in SSB-T2D risk estimates depending on variables selected for multivariable models (2 to 95% change in the risk estimate from the age-adjusted model). Meta-analysis seeks to understand heterogeneity in addition to computing a summary risk estimate.
This strategy effectively documents design heterogeneity, thus improving the practice of meta-analysis by aiding in: 1) protocol and analysis planning, 2) transparent reporting of differences in study designs, and 3) interpretation of pooled estimates. We recommend expanding the practice of meta-analysis reporting to include a table that summarizes design heterogeneity. This would provide readers with more evidence to interpret the summary risk estimates.
An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.
2002-01-01
Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.
Geotechnical risk analysis user's guide
DOT National Transportation Integrated Search
1987-03-01
All geotechnical predictions involve uncertainties. Traditionally, these are accounted for by conservative factors of safety. Risk-based design, on the other hand, attempts to quantify uncertainties and to adjust design conservatism accordingly. Such m...
Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land
2006-01-01
We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. The methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess the combined impacts of And-Or trees of disabling influences, and can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identifying cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase the coverage of hazard and risk analysis and can indicate risk control and protection strategies.
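The core of the path-finding step, tracing influence paths from a hazard source to a vulnerable target over subsystem connectivity, can be sketched with a small graph search. The subsystem names and edges below are invented for illustration:

```python
from collections import deque

# Toy subsystem interaction graph: directed edges are influence paths.
edges = {
    "power_bus": ["avionics", "thermal_ctrl"],
    "thermal_ctrl": ["avionics"],
    "avionics": ["comm"],
    "comm": [],
}

def propagation_paths(source, target):
    """Breadth-first enumeration of all simple influence paths from a
    hazard source to a vulnerable target across subsystem boundaries."""
    paths, queue = [], deque([[source]])
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == target:
            paths.append(path)
            continue
        for nxt in edges.get(node, []):
            if nxt not in path:          # avoid revisiting (cycles)
                queue.append(path + [nxt])
    return paths

print(propagation_paths("power_bus", "comm"))
```

Annotating each node with hazard and vulnerability ratings, and each path with a cumulative severity measure, would extend this toward the And-Or impact assessment the abstract describes.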
Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.
Xin, Cao; Chongshi, Gu
2016-01-01
Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. A stability failure risk ratio described jointly by probability and possibility is deficient both in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. The stability of a gravity dam is viewed as a hybrid event, considering both the fuzziness and the randomness of the failure criterion, design parameters, and measured data. A credibility distribution function is constructed as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to the risk calculation of both the dam foundation and double sliding surfaces is provided. The results show that the present method can feasibly be applied to the analysis of stability failure risk for gravity dams. The risk assessment obtained reflects the influence of both sorts of uncertainty and is suitable as an index value.
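A minimal sketch of the fuzzy-random hybrid idea: treat resistance as a triangular fuzzy variable, the load as a random variable, and average the credibility of failure, Cr{R < s} = (Pos + Nec)/2, over sampled loads. All numbers are invented, and the paper's actual sliding-stability limit states are far more elaborate than this single comparison:

```python
import random

random.seed(3)

def cr_less(s, a, b, c):
    """Credibility Cr{R < s} for a triangular fuzzy resistance R = (a, b, c):
    the average of possibility (sup of membership below s) and necessity
    (1 - sup of membership at or above s)."""
    if s <= a:
        pos = 0.0
    elif s <= b:
        pos = (s - a) / (b - a)
    else:
        pos = 1.0
    if s <= b:
        nec = 0.0
    elif s < c:
        nec = 1.0 - (c - s) / (c - b)
    else:
        nec = 1.0
    return 0.5 * (pos + nec)

def failure_risk(n=100_000, load_mu=1.0, load_sd=0.3, fuzzy_r=(1.2, 1.6, 2.0)):
    """Hybrid risk index: mean over random loads S of Cr{R < S}."""
    return sum(cr_less(random.gauss(load_mu, load_sd), *fuzzy_r)
               for _ in range(n)) / n

risk = failure_risk()
print(f"stability failure risk ~ {risk:.3f}")
```

The resulting index blends randomness (the load distribution) with fuzziness (the resistance membership function), which is the property the abstract claims for the credibility-based risk measure.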
Towards Improved Considerations of Risk in Seismic Design (Plinius Medal Lecture)
NASA Astrophysics Data System (ADS)
Sullivan, T. J.
2012-04-01
The aftermath of recent earthquakes is a reminder that seismic risk is a very relevant issue for our communities. Implicit in the seismic design standards currently in place around the world is the assumption that minimum acceptable levels of seismic risk will be ensured through design in accordance with the codes. All the same, none of the design standards specify what the minimum acceptable level of seismic risk actually is. Instead, a series of deterministic limit states are set which engineers then demonstrate are satisfied for their structure, typically through the use of elastic dynamic analyses adjusted to account for non-linear response using a set of empirical correction factors. Since the early nineties the seismic engineering community has come to recognise numerous fundamental shortcomings of such seismic design procedures in modern codes. Deficiencies include the use of elastic dynamic analysis for the prediction of inelastic force distributions, the assignment of uniform behaviour factors for structural typologies irrespective of the structural proportions and expected deformation demands, and the assumption that the hysteretic properties of a structure do not affect the seismic displacement demands, amongst other things. In light of this, a number of possibilities have emerged for improved control of risk through seismic design, with several innovative displacement-based seismic design methods now well developed. For a given seismic design intensity, such methods provide a more rational means of controlling the response of a structure to satisfy performance limit states. While the development of such methodologies does mark a significant step forward for the control of seismic risk, they do not, on their own, identify the seismic risk of a newly designed structure. In the U.S. a rather elaborate performance-based earthquake engineering (PBEE) framework is under development, with the aim of providing seismic loss estimates for new buildings.
The PBEE framework consists of the following four main analysis stages: (i) probabilistic seismic hazard analysis to give the mean occurrence rate of earthquake events having an intensity greater than a threshold value, (ii) structural analysis to estimate the global structural response, given a certain value of seismic intensity, (iii) damage analysis, in which fragility functions are used to express the probability that a building component exceeds a damage state, as a function of the global structural response, (iv) loss analysis, in which the overall performance is assessed based on the damage state of all components. This final step gives estimates of the mean annual frequency with which various repair cost levels (or other decision variables) are exceeded. The realisation of this framework does suggest that risk-based seismic design is now possible. However, comparing current code approaches with the proposed PBEE framework, it becomes apparent that mainstream consulting engineers would have to go through a massive learning curve in order to apply the new procedures in practice. With this in mind, it is proposed that simplified loss-based seismic design procedures are a logical means of helping the engineering profession transition from what are largely deterministic seismic design procedures in current codes, to more rational risk-based seismic design methodologies. Examples are provided to illustrate the likely benefits of adopting loss-based seismic design approaches in practice.
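The four stages above chain together numerically: a hazard curve, a demand model, a fragility function, and a loss model combine into a mean annual loss rate. The toy calculation below is a hedged sketch of that chain only; the hazard rates, the drift model, the fragility parameters and the loss ratio are all invented placeholders, not values from the PBEE literature.

```python
import math

# (i) Hazard analysis: mean annual rate of exceeding each intensity level.
im_levels = [0.1, 0.2, 0.4, 0.8]   # peak ground acceleration, g (assumed)
nu = [0.05, 0.01, 0.002, 0.0004]   # exceedance rate per year (assumed)

# (ii) Structural analysis: drift demand given intensity (assumed linear).
def drift(im):
    return 0.02 * im / 0.4          # peak interstorey drift ratio

# (iii) Damage analysis: lognormal fragility, P(damage state reached | drift).
def p_damage(d, median=0.015, beta=0.4):
    return 0.5 * (1.0 + math.erf(math.log(d / median) / (beta * math.sqrt(2.0))))

# (iv) Loss analysis: expected repair-cost fraction given damage (assumed 0.3),
# accumulated over the hazard curve into a mean annual loss rate.
loss_given_damage = 0.3
mean_annual_loss = 0.0
for i, im in enumerate(im_levels):
    # rate of events falling into this intensity bin
    d_nu = nu[i] - (nu[i + 1] if i + 1 < len(nu) else 0.0)
    mean_annual_loss += d_nu * p_damage(drift(im)) * loss_given_damage
```

In a full PBEE loss analysis each stage is a probability distribution rather than a point estimate, but the binned accumulation over the hazard curve has this same shape.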
Reliability and Probabilistic Risk Assessment - How They Play Together
NASA Technical Reports Server (NTRS)
Safie, Fayssal; Stutts, Richard; Huang, Zhaofeng
2015-01-01
Since the Space Shuttle Challenger accident in 1986, NASA has extensively used probabilistic analysis methods to assess, understand, and communicate the risk of space launch vehicles. Probabilistic Risk Assessment (PRA), used in the nuclear industry, is one of the probabilistic analysis methods NASA utilizes to assess Loss of Mission (LOM) and Loss of Crew (LOC) risk for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability distributions to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: 1) what can go wrong that would lead to loss or degraded performance (i.e., scenarios involving undesired consequences of interest), 2) how likely is it (probabilities), and 3) what is the severity of the degradation (consequences). Since the Challenger accident, PRA has been used in supporting decisions regarding safety upgrades for launch vehicles. Another area that was given a lot of emphasis at NASA after the Challenger accident is reliability engineering. Reliability engineering has been a critical design function at NASA since the early Apollo days. However, after the Challenger accident, quantitative reliability analysis and reliability predictions were given more scrutiny because of their importance in understanding failure mechanisms and quantifying the probability of failure, which are key elements in resolving technical issues, performing design trades, and implementing design improvements. Although PRA and reliability are both probabilistic in nature and, in some cases, use the same tools, they are two different activities. Specifically, reliability engineering is a broad design discipline that deals with loss of function and helps engineers understand failure mechanisms and improve component and system design.
PRA is a system scenario based risk assessment process intended to assess the risk scenarios that could lead to a major/top undesirable system event, and to identify those scenarios that are high-risk drivers. PRA output is critical to support risk informed decisions concerning system design. This paper describes the PRA process and the reliability engineering discipline in detail. It discusses their differences and similarities and how they work together as complementary analyses to support the design and risk assessment processes. Lessons learned, applications, and case studies in both areas are also discussed in the paper to demonstrate and explain these differences and similarities.
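The scenario logic PRA builds from fault trees and event trees reduces, in its simplest form, to multiplying an initiating-event frequency through branch probabilities and ranking the resulting end states. The sketch below illustrates only that arithmetic; the initiating-event frequency, branch probabilities and scenario names are invented for illustration and do not come from any NASA PRA.

```python
# Event-tree arithmetic for two toy scenarios following one initiating event.
init_freq = 1e-2                # initiating events per mission (assumed)
p_shutdown_works = 0.999        # branch: engine shuts down safely (assumed)
p_abort_works = 0.99            # branch: abort system saves the crew (assumed)

# Scenario 1: shutdown fails, abort succeeds -> Loss of Mission (LOM).
# Scenario 2: shutdown fails AND abort fails  -> Loss of Crew (LOC).
scenarios = {
    "LOM": init_freq * (1 - p_shutdown_works) * p_abort_works,
    "LOC": init_freq * (1 - p_shutdown_works) * (1 - p_abort_works),
}

# Ranking scenario frequencies identifies the high-risk drivers.
top_driver = max(scenarios, key=scenarios.get)
```

Real PRAs replace each point probability with a distribution and propagate uncertainty, but the scenario bookkeeping is this product-and-rank structure.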
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructures, chemical plants and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis to be performed. A new method for a probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely based on the information that is necessary to perform any meaningful seismic hazard analysis. The method is based on the probabilistic risk analysis approach common for applications in nuclear technology, developed originally by Kaplan & Garrick (1981). It consists of (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class, based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantage of the method in comparison with traditional PSHA consists in (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals.
The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.
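Steps (1), (2) and (4) of the method can be caricatured in a few lines: magnitude classes, class occurrence frequencies from a recurrence relation, and a conditional exceedance (vulnerability) probability per class, combined over a structure lifetime. The sketch below assumes a Gutenberg-Richter recurrence with made-up constants and made-up vulnerability values; it is not the paper's actual model.

```python
import math

a, b = 4.0, 1.0                              # assumed Gutenberg-Richter constants
def annual_rate(m_lo, m_hi):
    # Annual rate of events with magnitude in [m_lo, m_hi).
    return 10 ** (a - b * m_lo) - 10 ** (a - b * m_hi)

# Step (1): size classes by magnitude; step (4): assumed conditional
# probability that an event in each class exceeds the design parameter.
classes = [(5.0, 6.0), (6.0, 7.0), (7.0, 8.0)]
p_exceed = [0.01, 0.2, 0.8]

# Step (2) + (4): total annual rate of design-parameter exceedance,
# then the failure probability over an assumed residual lifetime.
lifetime = 40.0                              # years (assumed)
rate_damage = sum(annual_rate(lo, hi) * p
                  for (lo, hi), p in zip(classes, p_exceed))
p_failure_lifetime = 1.0 - math.exp(-rate_damage * lifetime)
```

Making the lifetime explicit, as in the last line, is what lets the method express risk goals per structure rather than per year.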
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES
2017-06-01
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES BY AMANDA DONNELLY A THESIS...work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and...collaboration, and risk assessment. My analysis dissects and compares three potential design methodologies including net assessment, scenarios and
Advancing effects analysis for integrated, large-scale wildfire risk assessment
Matthew P. Thompson; David E. Calkin; Julie W. Gilbertson-Day; Alan A. Ager
2011-01-01
In this article, we describe the design and development of a quantitative, geospatial risk assessment tool intended to facilitate monitoring trends in wildfire risk over time and to provide information useful in prioritizing fuels treatments and mitigation measures. The research effort is designed to develop, from a strategic view, a first approximation of how both...
NASA Astrophysics Data System (ADS)
Bearden, David A.; Duclos, Donald P.; Barrera, Mark J.; Mosher, Todd J.; Lao, Norman Y.
1997-12-01
Emerging technologies and micro-instrumentation are changing the way remote sensing spacecraft missions are developed and implemented. Government agencies responsible for procuring space systems are increasingly requesting analyses to estimate cost, performance and design impacts of advanced technology insertion for both state-of-the-art systems and systems to be built 5 to 10 years in the future. Numerous spacecraft technology development programs are being sponsored by Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) agencies with the goal of enhancing spacecraft performance, reducing mass, and reducing cost. However, it is often the case that technology studies, in the interest of maximizing subsystem-level performance and/or mass reduction, do not anticipate synergistic system-level effects. Furthermore, even though technical risks are often identified as one of the largest cost drivers for space systems, many cost/design processes and models ignore effects of cost risk in the interest of quick estimates. To address these issues, the Aerospace Corporation developed a concept analysis methodology and associated software tools. These tools, collectively referred to as the concept analysis and design evaluation toolkit (CADET), facilitate system architecture studies and space system conceptual designs focusing on design heritage, technology selection, and associated effects on cost, risk and performance at the system and subsystem level. CADET allows: (1) quick response to technical design and cost questions; (2) assessment of the cost and performance impacts of existing and new designs/technologies; and (3) estimation of cost uncertainties and risks. These capabilities aid mission designers in determining the configuration of remote sensing missions that meet essential requirements in a cost-effective manner.
This paper discusses the development of CADET modules and their application to several remote sensing satellite mission concepts.
Rotor systems research aircraft predesign study. Volume 3: Predesign report
NASA Technical Reports Server (NTRS)
Schmidt, S. A.; Linden, A. W.
1972-01-01
The features of two aircraft designs were selected to be included in the single RSRA configuration. A study was conducted for further preliminary design and a more detailed analysis of development plans and costs. An analysis was also made of foreseeable technical problems and risks, identification of parallel research which would reduce risks and/or add to the basic capability of the aircraft, and a draft aircraft specification.
Comprehensive risk analysis for structure type selection.
DOT National Transportation Integrated Search
2010-04-01
Optimization of bridge selection and design traditionally has been sought in terms of the finished structure. This study presents a more comprehensive risk-based analysis that includes user costs and accidents during the construction phase. Costs f...
NASA Astrophysics Data System (ADS)
Guo, Aijun; Chang, Jianxia; Wang, Yimin; Huang, Qiang; Zhou, Shuai
2018-05-01
Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on regional flood control systems. This work advances traditional flood risk analysis by proposing a univariate and copula-based bivariate hydrological risk framework which incorporates both flood control and sediment transport. In developing the framework, the conditional probabilities of different flood events under various extreme precipitation scenarios are estimated by exploiting the copula-based model. Moreover, a Monte Carlo-based algorithm is designed to quantify the sampling uncertainty associated with univariate and bivariate hydrological risk analyses. Two catchments located on the Loess plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted as UCX and UCH, respectively). The univariate and bivariate return periods, risk and reliability in the context of uncertainty for the purposes of flood control and sediment transport are assessed for the study regions. The results indicate that sedimentation triggers higher risks of damaging the safety of local flood control systems compared with the event that the annual maximum flood (AMF) exceeds the design flood of downstream hydraulic structures in the UCX and UCH. Moreover, there is considerable sampling uncertainty affecting the univariate and bivariate hydrologic risk evaluation, which greatly challenges measures of future flood mitigation. In addition, results also confirm that the developed framework can estimate conditional probabilities associated with different flood events under various extreme precipitation scenarios aiming for flood control and sediment transport. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.
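A bivariate return period of the kind used in such frameworks can be sketched with a one-parameter copula. The example below uses a Gumbel-Hougaard copula for the joint "OR" return period (either variable exceeds its threshold); the marginal non-exceedance probabilities and the dependence parameter theta are illustrative assumptions, not values fitted to the Loess plateau data.

```python
import math

def gumbel_copula(u, v, theta=2.0):
    # Gumbel-Hougaard copula C(u, v); theta >= 1 controls dependence.
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

u = 0.99   # non-exceedance prob. of the flood-peak threshold (assumed)
v = 0.95   # non-exceedance prob. of the sediment-transport threshold (assumed)
mu = 1.0   # mean interarrival time of annual maxima, years

# Joint "OR" return period: either variable exceeds its threshold.
t_or = mu / (1.0 - gumbel_copula(u, v))
# Univariate return periods, for comparison.
t_u, t_v = mu / (1.0 - u), mu / (1.0 - v)
```

The "OR" return period is never longer than either marginal one, which is why a flood control system can be at higher risk jointly than either hazard suggests alone.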
Design and validation of a questionnaire for measuring perceived risk of skin cancer.
Morales-Sánchez, M A; Peralta-Pedrero, M L; Domínguez-Gómez, M A
2014-04-01
A perceived risk of cancer encourages preventive behavior while the lack of such a perception is a barrier to risk reduction. There are no instruments in Spanish to measure this perceived risk and thus quantify response to interventions for preventing this disease at a population level. The aim of this study was to design and validate a self-administered questionnaire for measuring the perceived risk of skin cancer. A self-administered questionnaire with a visual Likert-type scale was designed based on the results of the analysis of the content of a survey performed in 100 patients in the Dr. Ladislao de la Pascua Skin Clinic, Distrito Federal México, Mexico. Subsequently, the questionnaire was administered to a sample of 359 adult patients who attended the clinic for the first time. As no gold standard exists for measuring the perceived risk of skin cancer, the construct was validated through factor analysis. The final questionnaire had 18 items. The internal consistency measured with Cronbach α was 0.824 overall. In the factor analysis, 4 factors (denoted as affective, behavioral, severity, and susceptibility) and an indicator of risk accounted for 65.133% of the variance. The psychometric properties of the scale were appropriate for measuring the perception of risk in adult patients (aged 18 years or more) who attended the dermatology clinic. Copyright © 2013 Elsevier España, S.L. and AEDV. All rights reserved.
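The internal-consistency statistic reported above, Cronbach's alpha, is a short computation over an item-by-respondent score matrix. The sketch below uses a made-up 4-item, 5-respondent matrix purely for illustration; the questionnaire itself has 18 items and its alpha of 0.824 comes from the study's data, not from this example.

```python
def cronbach_alpha(items):
    # items: one list of scores per item (rows = items, columns = respondents)
    k = len(items)
    n = len(items[0])

    def var(xs):
        # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    sum_item_var = sum(var(item) for item in items)
    return k / (k - 1) * (1 - sum_item_var / var(totals))

scores = [            # invented Likert scores for illustration
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 5],
    [3, 4, 2, 4, 5],
]
alpha = cronbach_alpha(scores)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the benchmark the study's 0.824 clears.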
Aircraft Conceptual Design and Risk Analysis Using Physics-Based Noise Prediction
NASA Technical Reports Server (NTRS)
Olson, Erik D.; Mavris, Dimitri N.
2006-01-01
An approach was developed which allows for design studies of commercial aircraft using physics-based noise analysis methods while retaining the ability to perform the rapid trade-off and risk analysis studies needed at the conceptual design stage. A prototype integrated analysis process was created for computing the total aircraft effective perceived noise level (EPNL) at the Federal Aviation Regulations Part 36 certification measurement locations using physics-based methods for fan rotor-stator interaction tones and jet mixing noise. The methodology was then used in combination with design of experiments to create response surface equations (RSEs) for the engine and aircraft performance metrics, geometric constraints and take-off and landing noise levels. In addition, Monte Carlo analysis was used to assess the expected variability of the metrics under the influence of uncertainty, and to determine how the variability is affected by the choice of engine cycle. Finally, the RSEs were used to conduct a series of proof-of-concept conceptual-level design studies demonstrating the utility of the approach. The study found that a key advantage to using physics-based analysis during conceptual design lies in the ability to assess the benefits of new technologies as a function of the design to which they are applied. The greatest difficulty in implementing physics-based analysis proved to be the generation of design geometry at a sufficient level of detail for high-fidelity analysis.
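The RSE-plus-Monte-Carlo step pairs naturally with a short sketch: once the expensive physics-based analysis has been fitted by a response surface, sampling the surface under input uncertainty is cheap. The quadratic "response surface" below and the two design variables (bypass ratio, fan pressure ratio) with their uncertainty ranges are invented stand-ins, not the study's fitted RSEs.

```python
import random

def rse_epnl(bpr, fpr):
    # Assumed quadratic response surface for a cumulative noise metric (dB),
    # standing in for the fitted physics-based RSE.
    return 270.0 - 3.0 * bpr + 0.8 * fpr + 0.2 * bpr * bpr

random.seed(0)  # reproducible sampling
samples = []
for _ in range(5000):
    bpr = random.gauss(8.0, 0.3)    # bypass ratio, uncertain (assumed)
    fpr = random.gauss(1.6, 0.05)   # fan pressure ratio, uncertain (assumed)
    samples.append(rse_epnl(bpr, fpr))

# Variability of the metric under input uncertainty.
mean = sum(samples) / len(samples)
std = (sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)) ** 0.5
```

Re-running this loop for each candidate engine cycle is how the study compares how uncertainty propagates differently through different designs.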
Public Risk Assessment Program
NASA Technical Reports Server (NTRS)
Mendeck, Gavin
2010-01-01
The Public Entry Risk Assessment (PERA) program addresses risk to the public from shuttle or other spacecraft re-entry trajectories. Managing public risk to acceptable levels is a major component of safe spacecraft operation. PERA is given scenario inputs of vehicle trajectory, probability of failure along that trajectory, the resulting debris characteristics, and field size and distribution, and returns risk metrics that quantify the individual and collective risk posed by that scenario. Due to the large volume of data required to perform such a risk analysis, PERA was designed to streamline the analysis process by using innovative mathematical analysis of the risk assessment equations. Real-time analysis in the event of a shuttle contingency operation, such as damage to the Orbiter, is possible because PERA allows for a change to the probability of failure models, therefore providing a much quicker estimation of public risk. PERA also provides the ability to generate movie files showing how the entry risk changes as the entry develops. PERA was designed to streamline the computation of the enormous amounts of data needed for this type of risk assessment by using an average distribution of debris on the ground, rather than pinpointing the impact point of every piece of debris. This has reduced the amount of computational time significantly without reducing the accuracy of the results. PERA was written in MATLAB; a compiled version can run from a DOS or UNIX prompt.
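The averaged-debris-distribution idea described above can be illustrated with the standard casualty-expectation product: expected fragments times casualty area times population density, rather than per-fragment impact points. Every number below, including the acceptability threshold, is an invented placeholder, not a PERA input or a NASA requirement.

```python
# Collective-risk sketch using an averaged debris distribution.
n_fragments = 200.0          # expected surviving debris fragments (assumed)
casualty_area_m2 = 1.5       # mean casualty area per fragment, m^2 (assumed)
pop_density_per_km2 = 25.0   # mean population density under track (assumed)

# Expected casualties E_c = N * A_c * rho, with m^2 converted to km^2.
e_c = n_fragments * (casualty_area_m2 / 1e6) * pop_density_per_km2

# Compare against an acceptability threshold (value assumed here).
acceptable = e_c < 1e-4
```

Averaging the debris over a footprint turns a per-fragment impact calculation into a single product like this, which is the computational saving the abstract describes.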
Integrated Hybrid System Architecture for Risk Analysis
NASA Technical Reports Server (NTRS)
Moynihan, Gary P.; Fonseca, Daniel J.; Ray, Paul S.
2010-01-01
A conceptual design has been announced of an expert-system computer program, and the development of a prototype of the program, intended for use as a project-management tool. The program integrates schedule and risk data for the purpose of determining the schedule implications of safety risks and, conversely, the effects of schedule changes on safety. It is noted that the design has been delivered to a NASA client and that it is planned to disclose the design in a conference presentation.
Earthquakes and building design: a primer for the laboratory animal professional.
Vogelweid, Catherine M; Hill, James B; Shea, Robert A; Johnson, Daniel B
2005-01-01
Earthquakes can occur in most regions of the United States, so it might be necessary to reinforce vulnerable animal facilities to better protect research animals during these unpredictable events. A risk analysis should include an evaluation of the seismic hazard risk at the proposed building site balanced against the estimated consequences of losses. Risk analysis can help in better justifying and recommending to building owners the costs of incorporating additional seismic reinforcements. The planning team needs to specify the level of post-earthquake building function that is desired in the facility, and then design the facility to that level.
Managing Risk to Ensure a Successful Cassini/Huygens Saturn Orbit Insertion (SOI)
NASA Technical Reports Server (NTRS)
Witkowski, Mona M.; Huh, Shin M.; Burt, John B.; Webster, Julie L.
2004-01-01
I. Design: a) S/C designed to be largely single fault tolerant; b) Operate in flight-demonstrated envelope, with margin; and c) Strict compliance with requirements & flight rules. II. Test: a) Baseline, fault & stress testing using flight system testbeds (H/W & S/W); b) In-flight checkout & demos to remove first-time events. III. Failure Analysis: a) Critical-event-driven fault tree analysis; b) Risk mitigation & development of contingencies. IV. Residual Risks: a) Accepted pre-launch waivers to Single Point Failures; b) Unavoidable risks (e.g. natural disaster). V. Mission Assurance: a) Strict process for characterization of variances (ISAs, PFRs & Waivers); b) Full-time Mission Assurance Manager reports to Program Manager: 1) Independent assessment of compliance with institutional standards; 2) Oversight & risk assessment of ISAs, PFRs & Waivers etc.; and 3) Risk Management Process facilitator.
Seismic Hazard Analysis — Quo vadis?
NASA Astrophysics Data System (ADS)
Klügel, Jens-Uwe
2008-05-01
The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications such as the design of critical and general (non-critical) civil infrastructures, technical and financial risk analysis. A set of criteria is developed for and applied to an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large-scale studies. These deficiencies result in the lack of ability of a correct treatment of dependencies between physical parameters and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario-earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, current methods are based on the probabilistic approach with its unsolved deficiencies.
Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and, in general, robust design basis for applications such as the design of critical infrastructures, especially with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in the safety factors. These factors are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to overly conservative results, especially if applied to generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.
Risk-based design of process plants with regard to domino effects and land use planning.
Khakzad, Nima; Reniers, Genserik
2015-12-15
Land use planning (LUP) as an effective and crucial safety measure has widely been employed by safety experts and decision makers to mitigate off-site risks posed by major accidents. Accordingly, the concept of LUP in chemical plants has traditionally been considered from two perspectives: (i) land developments around existing chemical plants considering potential off-site risks posed by major accidents and (ii) development of existing chemical plants considering nearby land developments and the level of additional off-site risks the land developments would be exposed to. However, the attempts made to design chemical plants with regard to LUP requirements have been few, most of which have neglected the role of domino effects in risk analysis of major accidents. To overcome the limitations of previous work, first, we developed a Bayesian network methodology to calculate both on-site and off-site risks of major accidents while taking domino effects into account. Second, we combined the results of risk analysis with Analytic Hierarchical Process to design an optimal layout for which the levels of on-site and off-site risks would be minimum. Copyright © 2015 Elsevier B.V. All rights reserved.
Introduction to Political Risk Analysis. Learning Packages in the Policy Sciences, PS-24.
ERIC Educational Resources Information Center
Coplin, William D.; O'Leary, Michael K.
This package introduces college students to the kind of analysis that multinational corporations undertake to assess risks to their business operations due to political and economic conditions. Designed to be completed in 3 weeks, the four exercises enable students to (1) identify the major sources of political risk; (2) determine what social,…
Early Design Choices: Capture, Model, Integrate, Analyze, Simulate
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2004-01-01
I. Designs are constructed incrementally to meet requirements and solve problems: a) Requirements types: objectives, scenarios, constraints, ilities, etc. b) Problem/issue types: risk/safety, cost/difficulty, interaction, conflict, etc. II. Capture requirements, problems and solutions: a) Collect design and analysis products and make them accessible for integration and analysis; b) Link changes in design requirements, problems and solutions; and c) Harvest design data for design models and choice structures. III. System designs are constructed by multiple groups designing interacting subsystems: a) Diverse problems, choice criteria, analysis methods and point solutions. IV. Support integration and global analysis of repercussions: a) System implications of point solutions; b) Broad analysis of interactions beyond totals of mass, cost, etc.
Lofthouse, Rachael; Golding, Laura; Totsika, Vasiliki; Hastings, Richard; Lindsay, William
2017-12-01
Risk assessments assist professionals in the identification and management of the risk of aggression. The present study aimed to systematically review evidence on the efficacy of assessments for managing the risk of physical aggression in adults with intellectual disabilities (ID). A literature search was conducted using the databases PsycINFO, EMBASE, MEDLINE, Web of Science, and Google Scholar. Electronic and hand searches identified 14 studies that met the inclusion criteria. Standardised mean difference effect sizes (area under the curve, AUC) were calculated for each study. Random-effects subgroup analysis was used to compare different types of risk measures (actuarial, structured professional judgment, and dynamic), and prospective vs. catch-up longitudinal study designs. Overall, evidence of predictive validity was found for risk measures with ID populations: AUC = 0.724, 95% CI [0.681, 0.768]. There was no variation in the performance of different types of risk measures, or of different study designs. Risk assessment measures predict the likelihood of aggression in ID populations and are comparable to those in mainstream populations. Further meta-analysis will be necessary when risk measures are more established in this population. Copyright © 2017 Elsevier Ltd. All rights reserved.
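Random-effects pooling of per-study effect sizes, as used in this meta-analysis, can be sketched with the standard DerSimonian-Laird estimator. The AUC values and within-study variances below are invented for illustration; they are not the review's data, and the review pooled 14 studies, not 5.

```python
# DerSimonian-Laird random-effects pooling of per-study effect sizes.
aucs = [0.70, 0.75, 0.68, 0.74, 0.72]            # invented per-study AUCs
variances = [0.002, 0.003, 0.0025, 0.002, 0.0015]  # invented within-study vars

w = [1.0 / v for v in variances]                 # fixed-effect weights
mean_fe = sum(wi * a for wi, a in zip(w, aucs)) / sum(w)

# Cochran's Q and the between-study variance tau^2 (floored at zero).
q = sum(wi * (a - mean_fe) ** 2 for wi, a in zip(w, aucs))
df = len(aucs) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights fold tau^2 into each study's variance.
w_re = [1.0 / (v + tau2) for v in variances]
pooled = sum(wi * a for wi, a in zip(w_re, aucs)) / sum(w_re)
```

When the studies are homogeneous (Q below its degrees of freedom), tau-squared is floored at zero and the random-effects estimate coincides with the fixed-effect one, as it does with these invented numbers.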
Game Theory and Risk-Based Levee System Design
NASA Astrophysics Data System (ADS)
Hui, R.; Lund, J. R.; Madani, K.
2014-12-01
Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, landowners on each riverbank would tend to independently optimize their levees with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the landowners on each riverbank develop their design strategies using risk-based economic optimization. For each landowner, the annual expected total cost includes the expected annual damage cost and the annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision-making modes, the cost of decision-making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.
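The single-owner optimization each player performs can be sketched as minimizing annualized construction cost plus expected annual damage over levee height. The damage value, unit cost and overtopping-probability curve below are invented assumptions, not the study's hydrology; the game-theoretic part would then couple two such problems through a shared flood stage.

```python
import math

damage_if_flooded = 1e8     # $ damage when the levee is overtopped (assumed)
cost_per_m = 2e5            # $/year annualized construction cost per metre (assumed)

def p_overtop(h):
    # Assumed annual overtopping probability as a function of height h (m).
    return 0.5 * math.exp(-h / 1.5)

def total_cost(h):
    # Annual expected total cost = annualized construction + expected damage.
    return cost_per_m * h + p_overtop(h) * damage_if_flooded

# Grid search over candidate heights, 0.1 m to 10 m.
heights = [i * 0.1 for i in range(1, 101)]
best = min(heights, key=total_cost)
```

At the optimum the marginal construction cost balances the marginal reduction in expected damage, which is the standard first-order condition of risk-based levee design.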
Hydrologic Extremes and Risk Assessment under Non-stationarity
NASA Astrophysics Data System (ADS)
Mondal, A.
2015-12-01
In the context of hydrologic design, robust assessment and communication of risk are crucial to ensure a sustainable water future. Traditional methods for defining return period, risk or reliability assume a stationary regime, which may no longer be valid because of natural or man-made changes. Reformulations have been suggested in the recent literature to account for non-stationarity in the definition of hydrologic risk as time evolves. This study presents a comparative analysis of design levels under non-stationarity based on time-varying annual exceedance probabilities, the waiting time of a hazardous event, the number of hazardous events, and the probability of failure. A case study application is shown for peak streamflow in the flood-prone delta area of the Krishna River in India, where an increasing trend in annual maximum flows is observed owing to persistent silting. Considerable disagreement is found between the design magnitudes of flood obtained by the different definitions of hydrologic risk. Such risk is also found to be highly sensitive to the assumed design life period and to projections of the trend in that period or beyond. Additionally, some critical points on the assumption of a deterministic non-stationary model for an observed natural process are discussed. The findings highlight the necessity of a unifying framework for the assessment and communication of hydrologic risk under transient hydro-climatic conditions. The concepts can also be extended to other applications such as regional hydrologic frequency analysis or the development of precipitation intensity-duration-frequency relationships for infrastructure design.
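One of the non-stationary risk definitions discussed above generalizes the familiar stationary formula: with a time-varying annual exceedance probability p_t, the failure probability over an n-year design life is R = 1 - prod(1 - p_t), which collapses to 1 - (1 - p)^n when p_t is constant. The initial probability and the linear trend below are assumptions chosen only to show the effect.

```python
def failure_prob(p_series):
    # R = 1 - prod(1 - p_t) over the design life.
    r = 1.0
    for p in p_series:
        r *= (1.0 - p)
    return 1.0 - r

n = 50                                   # design life, years (assumed)
p0 = 0.01                                # initial annual exceedance prob. (assumed)
# Assumed linear upward trend in hazard, e.g. from persistent silting.
trend = [min(1.0, p0 * (1 + 0.02 * t)) for t in range(n)]

risk_stationary = failure_prob([p0] * n)     # classic 1 - (1 - p)^n
risk_nonstationary = failure_prob(trend)
```

Even this mild trend raises the design-life failure probability noticeably above the stationary value, which is the sensitivity to design life and trend projection the study reports.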
Reliability and safety, and the risk of construction damage in mining areas
NASA Astrophysics Data System (ADS)
Skrzypczak, Izabela; Kogut, Janusz P.; Kokoszka, Wanda; Oleniacz, Grzegorz
2018-04-01
This article concerns the reliability and safety of building structures in mining areas, with a particular emphasis on the quantitative risk analysis of buildings. The issues of threat assessment and risk estimation, in the design of facilities in mining exploitation areas, are presented here, indicating the difficulties and ambiguities associated with their quantification and quantitative analysis. This article presents the concept of quantitative risk assessment of the impact of mining exploitation, in accordance with ISO 13824 [1]. The risk analysis is illustrated through an example of a construction located within an area affected by mining exploitation.
Dynamic Blowout Risk Analysis Using Loss Functions.
Abimbola, Majeed; Khan, Faisal
2018-02-01
Most risk analysis approaches are static, failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability models to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.
Cyber-Informed Engineering: The Need for a New Risk Analysis and Design Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Joseph Daniel; Anderson, Robert Stephen
Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary's capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator's knowledge. These threats can bypass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective, one in which neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.
Ganga, G M D; Esposto, K F; Braatz, D
2012-01-01
The occupational exposure limits of different risk factors for development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are relatively unknown phenomena. Industrial ergonomists' role becomes further complicated because the potential risk factors that may contribute towards the onset of LBDs interact in a complex manner, which makes it difficult to discriminate in detail among the jobs that place workers at high or low risk of LBDs. The purpose of this paper was to develop a comparative study between predictions based on the neural network-based model proposed by Zurada, Karwowski & Marras (1997) and a linear discriminant analysis model, for making predictions about industrial jobs according to their potential risk of low back disorders due to workplace design. The results obtained through applying the discriminant analysis-based model proved that it is as effective as the neural network-based model. Moreover, the discriminant analysis-based model proved to be more advantageous regarding cost and time savings for future data gathering.
NASA Technical Reports Server (NTRS)
Tompkins, F. G.
1984-01-01
Guidance is presented to NASA Computer Security Officials for determining the acceptability or unacceptability of ADP security risks based on the technical, operational and economic feasibility of potential safeguards. The risk management process is reviewed as a specialized application of the systems approach to problem solving and information systems analysis and design. Reporting the results of the risk reduction analysis to management is considered. Report formats for the risk reduction study are provided.
SADA: Ecological Risk Based Decision Support System for Selective Remediation
Spatial Analysis and Decision Assistance (SADA) is freeware that implements terrestrial ecological risk assessment and yields a selective remediation design using its integral geographical information system, based on ecological and risk assessment inputs. Selective remediation ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prooijen, Monique van; Breen, Stephen
Purpose: Our treatment for choroidal melanoma utilizes the GTC frame. The patient looks at a small LED to stabilize target position. The LED is attached to a metal arm attached to the GTC frame. A camera on the arm allows therapists to monitor patient compliance. To move to mask-based immobilization, we need a new LED/camera attachment mechanism. We used a Hazard-Risk Analysis (HRA) to guide the design of the new tool. Method: A pre-clinical model was built with input from therapy and machine shop personnel. It consisted of an aluminum frame placed in aluminum guide posts attached to the couch top. Further development was guided by the Department of Defense Standard Practice - System Safety hazard risk analysis technique. Results: An Orfit mask was selected because it allowed access to indexes on the couch top which assist with setup reproducibility. The first HRA table was created considering mechanical failure modes of the device. Discussions with operators and manufacturers identified other failure modes and solutions. The HRA directed the design towards a safe clinical device. Conclusion: A new immobilization tool has been designed using hazard-risk analysis, resulting in an easier-to-use and safer tool compared to the initial design. The remaining risks are all low-probability events and not dissimilar from those currently faced with the GTC setup. Given the gains in ease of use for therapists and patients, as well as the lower costs for the hospital, we will implement this new tool.
The Littoral Combat Ship: Is the US Navy Assuming Too Much Risk?
2006-06-16
…whereas the General Dynamics design will focus on an all-aluminum trimaran hull. Additional study of the pros and cons based on these two designs and… …also important to understand that not all risk is bad. The SWOT analysis, which looks at the strengths, weaknesses, opportunities, and threats…
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2014 CFR
2014-01-01
... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must...) Design inadequacies; or (vi) Procedural deficiencies. (2) Determine the likelihood of occurrence and... include one or more of the following: (i) Designing for minimum risk, (ii) Incorporating safety devices...
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must...) Design inadequacies; or (vi) Procedural deficiencies. (2) Determine the likelihood of occurrence and... include one or more of the following: (i) Designing for minimum risk, (ii) Incorporating safety devices...
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must...) Design inadequacies; or (vi) Procedural deficiencies. (2) Determine the likelihood of occurrence and... include one or more of the following: (i) Designing for minimum risk, (ii) Incorporating safety devices...
Irestig, Magnus; Timpka, Toomas
2010-02-01
We set out to examine design conflict resolution tactics used in development of large information systems for health services and to outline the design consequences of these tactics. Discourse analysis methods were applied to data collected from meetings conducted during the development of a web-based system in a public health context. We found that low-risk tactics were characterized by design issues being managed within the formal mandate and competences of the design group. In comparison, high-risk tactics were associated with irresponsible compromises, i.e. decisions being passed on to others or to later phases of the design process. The consequence of this collective disregard of issues such as responsibility and legitimacy is that the system design will be impossible to implement in actual health service contexts. The results imply that downstream responsibility issues have to be continuously dealt with in system development in health services.
WEC Design Response Toolbox v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coe, Ryan; Michelen, Carlos; Eckert-Gallup, Aubrey
2016-03-30
The WEC Design Response Toolbox (WDRT) is a numerical toolbox for design-response analysis of wave energy converters (WECs). The WDRT was developed during a series of efforts to better understand WEC survival design. The WDRT has been designed as a tool for researchers and developers, enabling the straightforward application of statistical and engineering methods. The toolbox includes methods for short-term extreme response, environmental characterization, long-term extreme response and risk analysis, fatigue, and design wave composition.
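One building block of such a toolbox is short-term extreme-response estimation. The sketch below fits a generalized extreme value (GEV) distribution to synthetic per-sea-state response maxima with SciPy; it illustrates the statistical method only and does not use the WDRT's actual API, and the data are made up:

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for simulated WEC response maxima (m) across many
# realizations of one sea state. A Gumbel parent makes a clean GEV fit.
rng = np.random.default_rng(0)
block_maxima = rng.gumbel(loc=2.0, scale=0.4, size=500)

# Fit a GEV distribution to the maxima and read off a design response
# level with a 1% probability of exceedance in that sea state.
shape, loc, scale = stats.genextreme.fit(block_maxima)
x_99 = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)
print(f"99th-percentile extreme response ≈ {x_99:.2f} m")
```

Environmental characterization and long-term extremes then layer sea-state statistics on top of per-sea-state fits like this one.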
NASA Astrophysics Data System (ADS)
Martino, P.
1980-12-01
A general methodology is presented for conducting an analysis of the various aspects of the hazards associated with the storage and transportation of liquefied natural gas (LNG) which should be considered during the planning stages of a typical LNG ship terminal. The procedure includes the performance of a hazards and system analysis of the proposed site, a probability analysis of accident scenarios and safety impacts, an analysis of the consequences of credible accidents such as tanker accidents, spills and fires, the assessment of risks and the design and evaluation of risk mitigation measures.
Paolantonacci, Philippe; Appourchaux, Philippe; Claudel, Béatrice; Ollivier, Monique; Dennett, Richard; Siret, Laurent
2018-01-01
Polyvalent human normal immunoglobulins for intravenous use (IVIg), indicated for rare and often severe diseases, are complex plasma-derived protein preparations. A quality by design approach has been used to develop the Laboratoire Français du Fractionnement et des Biotechnologies new-generation IVIg, targeting a high level of purity to generate an enhanced safety profile while maintaining a high level of efficacy. A modular approach of quality by design was implemented consisting of five consecutive steps to cover all the stages from the product design to the final product control strategy. A well-defined target product profile was translated into 27 product quality attributes that formed the basis of the process design. In parallel, a product risk analysis was conducted and identified 19 critical quality attributes among the product quality attributes. Process risk analysis was carried out to establish the links between process parameters and critical quality attributes. Twelve critical steps were identified, and for each of these steps a risk mitigation plan was established. Among the different process risk mitigation exercises, five process robustness studies were conducted at qualified small scale with a design of experiment approach. For each process step, critical process parameters were identified and, for each critical process parameter, proven acceptable ranges were established.
The quality risk management and risk mitigation outputs, including verification of proven acceptable ranges, were used to design the process verification exercise at industrial scale. Finally, the control strategy was established using a mix, or hybrid, of the traditional approach plus elements of the quality by design enhanced approach, as illustrated, to more robustly assign material and process controls and in order to securely meet product specifications. The advantages of this quality by design approach were improved process knowledge for industrial design and process validation and a clear justification of the process and product specifications as a basis for control strategy and future comparability exercises. © PDA, Inc. 2018.
Structural Analysis Made 'NESSUSary'
NASA Technical Reports Server (NTRS)
2005-01-01
Everywhere you look, chances are something that was designed and tested by a computer will be in plain view. Computers are now utilized to design and test just about everything imaginable, from automobiles and airplanes to bridges and boats, and elevators and escalators to streets and skyscrapers. Computer-design engineering first emerged in the 1970s, in the automobile and aerospace industries. Since computers were in their infancy, however, architects and engineers of the time were limited to producing only designs similar to hand-drafted drawings. (At the end of the 1970s, a typical computer-aided design system was a 16-bit minicomputer with a price tag of $125,000.) Eventually, computers became more affordable and related software became more sophisticated, offering designers the "bells and whistles" to go beyond the limits of basic drafting and rendering, and venture into more skillful applications. One of the major advancements was the ability to test the objects being designed for the probability of failure. This advancement was especially important for the aerospace industry, where complicated and expensive structures are designed. The ability to perform reliability and risk assessment without using extensive hardware testing is critical to design and certification. In 1984, NASA initiated the Probabilistic Structural Analysis Methods (PSAM) project at Glenn Research Center to develop analysis methods and computer programs for the probabilistic structural analysis of select engine components for current Space Shuttle and future space propulsion systems. NASA envisioned that these methods and computational tools would play a critical role in establishing increased system performance and durability, and assist in structural system qualification and certification. Not only was the PSAM project beneficial to aerospace, it paved the way for a commercial risk-probability tool that is evaluating risks in diverse, down-to-Earth applications.
Highly-Complex Environmentally-Realistic Mixtures: Challenges and Advances
The difficulties involved in the design, conduct, analysis, and interpretation of defined-mixtures experiments, and the use of the resulting data in risk assessment, are now well-known to the toxicology, risk assessment, and risk management communities. The arena of highly-complex environment...
Beregovykh, V V; Spitskiy, O R
2014-01-01
A risk-based approach is used to examine the impact of different factors on the quality of medicinal products in technology transfer. A general diagram is offered for risk analysis execution in technology transfer from pharmaceutical development to production. When transferring technology to full-scale commercial production, it is necessary to investigate and simulate the production process beforehand under the new, real conditions. The manufacturing process is the core factor for risk analysis, having the most impact on the quality attributes of a medicinal product. Further important factors are linked to the materials and products to be handled and to manufacturing environmental conditions such as premises, equipment and personnel. The use of the risk-based approach in the design of a multipurpose production facility for medicinal products is shown, where the quantitative risk analysis tool RAMM (Risk Analysis and Mitigation Matrix) was applied.
NASA Technical Reports Server (NTRS)
Zabinsky, J. M.; Burnham, R. W.; Flora, C. C.; Gotlieb, P.; Grande, D. L.; Gunnarson, D. W.; Howard, W. M.; Hunt, D.; Jakubowski, G. W.; Johnson, P. E.
1975-01-01
An assessment of risk, in terms of delivery delays, cost overrun, and performance achievement, associated with the V/STOL technology airplane is presented. The risk is discussed in terms of weight, structure, aerodynamics, propulsion, mechanical drive, and flight controls. The analysis ensures that risks associated with the design and development of the airplane will be eliminated in the course of the program and a useful technology airplane that meets the predicted cost, schedule, and performance can be produced.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-24
... methodological issues that arise in the use of meta-analyses to evaluate safety risks, followed by a discussion... design, conduct and use of meta-analysis. Although many external stakeholders conduct meta-analyses, FDA... meeting. FDA expects that this meeting will build upon prior stakeholder feedback on the design, conduct...
Ibrahim, Shewkar E; Sayed, Tarek; Ismail, Karim
2012-11-01
Several earlier studies have noted the shortcomings with existing geometric design guides which provide deterministic standards. In these standards the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from the standards. To mitigate these shortcomings, probabilistic geometric design has been advocated where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a mechanism for risk measurement to evaluate the safety impact of deviations from design standards. This paper applies reliability analysis for optimizing the safety of highway cross-sections. The paper presents an original methodology to select a suitable combination of cross-section elements with restricted sight distance to result in reduced collisions and consistent risk levels. The purpose of this optimization method is to provide designers with a proactive approach to the design of cross-section elements in order to (i) minimize the risk associated with restricted sight distance, (ii) balance the risk across the two carriageways of the highway, and (iii) reduce the expected collision frequency. A case study involving nine cross-sections that are parts of two major highway developments in British Columbia, Canada, was presented. The results showed that an additional reduction in collisions can be realized by incorporating the reliability component, P(nc) (denoting the probability of non-compliance), in the optimization process. The proposed approach results in reduced and consistent risk levels for both travel directions in addition to further collision reductions. Copyright © 2012 Elsevier Ltd. All rights reserved.
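The reliability component P(nc) in such an analysis is typically the probability that the sight distance a cross-section supplies falls short of the stopping sight distance drivers demand. A Monte Carlo sketch under assumed distributions (the parameter values are illustrative, not the paper's calibrated inputs):

```python
import numpy as np

# P(nc) = P(supply < demand): available sight distance vs required
# stopping sight distance (SSD). Distributions are assumptions.
rng = np.random.default_rng(42)
n = 200_000

speed = rng.normal(90, 8, n)                                  # operating speed, km/h
t_pr = rng.lognormal(mean=np.log(1.5), sigma=0.25, size=n)    # perception-reaction time, s
decel = rng.normal(3.4, 0.4, n)                               # deceleration, m/s^2

v = speed / 3.6                                               # m/s
required_ssd = v * t_pr + v**2 / (2 * decel)                  # demand, m
available_sd = rng.normal(160, 10, n)                         # supply the cross-section provides, m

p_nc = np.mean(available_sd < required_ssd)
print(f"P(nc) ≈ {p_nc:.3f}")
```

Balancing P(nc) across the two carriageways, as the paper proposes, amounts to adjusting cross-section elements until both directions show similar values of this probability.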
Development of a Probabilistic Tsunami Hazard Analysis in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka
2006-07-01
It is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, as is done in seismic design, because even once we set the design-basis tsunami height, the actual tsunami height may still exceed it owing to uncertainties regarding tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The result of the PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are the reasonable input data for structural and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
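The logic-tree aggregation described above can be sketched as weighted mean and fractile curves over the branch hazard curves; the toy exponential branch models and weights below are illustrative, not the Japanese hazard model:

```python
import numpy as np

# Each logic-tree branch yields an annual exceedance-probability curve
# over tsunami height; branch weights express epistemic uncertainty.
heights = np.linspace(0.5, 15.0, 30)                 # tsunami height, m

branch_scales = [1.0, 1.5, 2.0, 3.0]                 # epistemic alternatives (toy)
weights = np.array([0.2, 0.3, 0.3, 0.2])             # branch weights, sum to 1
curves = np.array([np.exp(-heights / a) for a in branch_scales])  # (branches, heights)

mean_curve = weights @ curves                        # weighted mean hazard curve

def weighted_fractile(values, w, q):
    """Smallest branch value whose cumulative weight reaches q."""
    idx = np.argsort(values)
    return values[idx][np.searchsorted(np.cumsum(w[idx]), q)]

p16 = np.array([weighted_fractile(curves[:, j], weights, 0.16) for j in range(len(heights))])
p84 = np.array([weighted_fractile(curves[:, j], weights, 0.84) for j in range(len(heights))])
print(f"mean annual exceedance probability at 5 m: {np.interp(5.0, heights, mean_curve):.3f}")
```

The spread between the 16- and 84-percentile curves is the displayed epistemic uncertainty; the mean curve is what feeds the downstream fragility and system analysis.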
Geothermal FIT Design: International Experience and U.S. Considerations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rickerson, W.; Gifford, J.; Grace, R.
2012-08-01
Developing power plants is a risky endeavor, whether for conventional or renewable generation. Feed-in tariff (FIT) policies can be designed to address some of these risks, and their design can be tailored to geothermal electric plant development. Geothermal projects face risks similar to other generation project development, including finding buyers for power, ensuring adequate transmission capacity, competing to supply electricity and/or renewable energy certificates (RECs), securing reliable revenue streams, navigating the legal issues related to project development, and reacting to changes in existing regulations or incentives. Although FITs have not been created specifically for geothermal in the United States to date, a variety of FIT design options could reduce geothermal power plant development risks and are explored here. This analysis focuses on the design of FIT incentive policies for geothermal electric projects and how FITs can be used to reduce risks (excluding drilling unproductive exploratory wells).
Improving risk management: from lame excuses to principled practice.
Paté-Cornell, Elisabeth; Cox, Louis Anthony
2014-07-01
The three classic pillars of risk analysis are risk assessment (how big is the risk and how sure can we be?), risk management (what shall we do about it?), and risk communication (what shall we say about it, to whom, when, and how?). We propose two complements as important parts of these three bases: risk attribution (who or what addressable conditions actually caused an accident or loss?) and learning from experience about risk reduction (what works, and how well?). Failures in complex systems usually evoke blame, often with insufficient attention to root causes of failure, including some aspects of the situation, design decisions, or social norms and culture. Focusing on blame, however, can inhibit effective learning, instead eliciting excuses to deflect attention and perceived culpability. Productive understanding of what went wrong, and how to do better, thus requires moving past recrimination and excuses. This article identifies common blame-shifting "lame excuses" for poor risk management. These generally contribute little to effective improvements and may leave real risks and preventable causes unaddressed. We propose principles from risk and decision sciences and organizational design to improve results. These start with organizational leadership. More specifically, they include: deliberate testing and learning-especially from near-misses and accident precursors; careful causal analysis of accidents; risk quantification; candid expression of uncertainties about costs and benefits of risk-reduction options; optimization of tradeoffs between gathering additional information and immediate action; promotion of safety culture; and mindful allocation of people, responsibilities, and resources to reduce risks. We propose that these principles provide sound foundations for improving successful risk management. © 2014 Society for Risk Analysis.
Meta-analysis on shift work and risks of specific obesity types.
Sun, M; Feng, W; Wang, F; Li, P; Li, Z; Li, M; Tse, G; Vlaanderen, J; Vermeulen, R; Tse, L A
2018-01-01
This systematic review and meta-analysis evaluated the associations between shift work patterns and risks of specific types of obesity. PubMed was searched until March 2017 for observational studies that examined the relationships between shift work patterns and obesity. Odds ratio for obesity was extracted using a fixed-effects or random-effects model. Subgroup meta-analyses were carried out for study design, specific obesity types and characteristics of shift work pattern. A total of 28 studies were included in this meta-analysis. The overall odds ratio of night shift work was 1.23 (95% confidence interval = 1.17-1.29) for risk of obesity/overweight. Cross-sectional studies showed a higher risk of 1.26 than those with the cohort design (risk ratio = 1.10). Shift workers had a higher frequency of developing abdominal obesity (odds ratio = 1.35) than other obesity types. Permanent night workers demonstrated a 29% higher risk than rotating shift workers (odds ratio 1.43 vs. 1.14). This meta-analysis confirmed the risks of night shift work for the development of overweight and obesity with a potential gradient association suggested, especially for abdominal obesity. Modification of working schedules is recommended, particularly for prolonged permanent night work. More accurate and detailed measurements on shift work patterns should be conducted in future research. © 2017 World Obesity Federation.
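A fixed-effects pooling of the kind used here combines study log odds ratios with inverse-variance weights. The three studies below are invented for illustration; the review's actual pooled estimate (OR 1.23, 95% CI 1.17-1.29) came from 28 studies:

```python
import math

# Inverse-variance fixed-effects meta-analysis on the log-OR scale.
studies = [  # (odds ratio, 95% CI lower, 95% CI upper) - hypothetical
    (1.30, 1.10, 1.54),
    (1.15, 0.98, 1.35),
    (1.25, 1.12, 1.40),
]

log_or, inv_var = [], []
for or_, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE recovered from CI width
    log_or.append(math.log(or_))
    inv_var.append(1.0 / se**2)                        # inverse-variance weight

pooled = sum(w * y for w, y in zip(inv_var, log_or)) / sum(inv_var)
se_pooled = math.sqrt(1.0 / sum(inv_var))
ci = (math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled))
print(f"pooled OR = {math.exp(pooled):.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

A random-effects model would additionally widen the CI by a between-study variance term when the studies are heterogeneous, as is likely across shift-work patterns.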
Risk assessment as standard work in design.
Morrill, Patricia W
2013-01-01
This case study article examines a formal risk assessment as part of the decision making process for design solutions in high-risk areas. An overview of the Failure Modes and Effects Analysis (FMEA) tool, with examples of its application in hospital building projects, demonstrates the benefit of those structured conversations. This article illustrates how two hospitals used FMEA when integrating operational processes with building projects: (1) an adjacency decision for an Intensive Care Unit (ICU); and (2) a distance concern for the handling of specimens from Surgery to Lab. Both case studies involved interviews that exposed facility solution concerns. Just-in-time studies using the FMEA followed the same risk assessment process with the same workshop facilitator involving structured conversations in analyzing risks. In both cases, participants uncovered key areas of risk enabling them to take the necessary next steps. While the focus of this article is not the actual design solution, it is apparent that the risk assessment brought clarity to the situations, resulting in prompt decision making about facility solutions. Hospitals are inherently risky environments; therefore, use of the formal risk assessment process, FMEA, is an opportunity for design professionals to apply more rigor to design decision making when facility solutions impact operations in high-risk areas. Keywords: case study, decision making, hospital, infection control, strategy, work environment.
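The scoring core of an FMEA worksheet like the ones used in these workshops is the Risk Priority Number. The failure modes and 1-10 ratings below are hypothetical examples, not the hospitals' actual worksheets:

```python
# FMEA scoring: Risk Priority Number = Severity x Occurrence x Detection,
# each rated 1-10. Failure modes below are invented for illustration.
failure_modes = [
    # (description, severity, occurrence, detection)
    ("Specimen mislabeled in transit from Surgery to Lab", 9, 4, 6),
    ("ICU adjacency forces cross-traffic through sterile corridor", 7, 5, 4),
    ("Tube system down, specimen hand-carried and delayed", 8, 3, 5),
]

# Rank failure modes by RPN so the workshop tackles the worst first.
ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN {s * o * d:3d}  {desc}")
```

The structured conversation the article describes is exactly the group exercise of agreeing on these three ratings for each mode before design decisions are locked in.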
An Introduction to Risk with a Focus on Design Diversity in the Stockpile
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noone, Bailey C
2012-08-13
The maintenance and security of nuclear weapons in the stockpile involve decisions based on risk analysis and quantitative measures of risk. Risk is a factor in all decisions, and a particularly important factor in decisions of large scale. One example of high-risk decision making discussed here is the risk involved in design diversity within the nuclear weapons stockpile. Risk is defined as the 'possibility of loss or injury' and the 'degree of probability of such loss' (Kaplan and Garrick 12). To introduce the risk involved with maintaining the weapons stockpile, we draw a parallel to the design and maintenance of Southwest Airlines' fleet of Boeing 737 planes. The clear cost savings in the maintenance of a uniform fleet are what historically drove Southwest to operate only Boeing 737s: less money and fewer resources are needed for maintenance, training, and materials. Naturally, risk accompanies those benefits. A defect in a part of one plane indicates a potential defect in that same part in all the planes of the fleet. As a result, safety, business, and credibility are at risk. How much variety or diversity does the fleet need to mitigate that risk? With that question in mind, a balance is needed to accommodate the different risks and benefits of the situation. In a similar way, risk is analyzed for the design and maintenance of nuclear weapons in the stockpile. In conclusion, risk must be as low as possible when it comes to the nuclear weapons stockpile. Design and care to keep the stockpile healthy involve all aspects of risk management. Design diversity is a method that helps to mitigate risk and to balance options in stockpile stewardship.
Engineering risk reduction in satellite programs
NASA Technical Reports Server (NTRS)
Dean, E. S., Jr.
1979-01-01
Methods developed in planning and executing system safety engineering programs for Lockheed satellite integration contracts are presented. These procedures establish the applicable safety design criteria, document design compliance and assess the residual risks where non-compliant design is proposed, and provide for hazard analysis of system-level test, handling and launch preparations. Operations hazard analysis identifies product protection and product liability hazards prior to the preparation of operational procedures and provides safety requirements for inclusion in them. The method developed for documenting all residual hazards for the attention of program management assures an acceptable minimum level of risk prior to program deployment. The results are significant for persons responsible for managing or engineering the deployment and production of complex, high-cost equipment who, under current product liability law and cost/time constraints, have a responsibility to minimize the possibility of an accident and should have documentation to provide a defense in a product liability suit.
Conceptual design study of Fusion Experimental Reactor (FY86 FER): Safety
NASA Astrophysics Data System (ADS)
Seki, Yasushi; Iida, Hiromasa; Honda, Tsutomu
1987-08-01
This report describes the study on safety for the FER (Fusion Experimental Reactor), which has been designed as a next-step machine to the JT-60. Though the final purpose of this study is to form an image of the design-basis accident and the maximum credible accident, and to assess their risk or probability for the FER plant system, the emphasis of this year's study is placed on the fuel-gas circulation system, where the tritium inventory is largest. The report consists of two chapters. The first chapter summarizes the FER system and describes FMEA (Failure Mode and Effects Analysis) and the related accident progression sequences for the FER plant system as a whole. The second chapter is focused on the fuel-gas circulation system, including purification, isotope separation and storage. Risk probability is assessed by the probabilistic risk analysis (PRA) procedure based on FMEA, ETA and FTA.
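The FTA step of such a PRA combines basic-event probabilities through AND/OR gates up to a top event. A minimal sketch with invented event probabilities (not the FER study's values), assuming independent basic events:

```python
# Minimal fault-tree evaluation for a PRA, assuming independent events.
# Event names and probabilities are illustrative only.
def and_gate(*probs):
    """All inputs must fail: product of probabilities."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(*probs):
    """Any input failing suffices: 1 - prod(1 - p_i), exact form."""
    q = 1.0
    for x in probs:
        q *= (1.0 - x)
    return 1.0 - q

p_valve_stuck = 1e-3      # per-demand failure probabilities (invented)
p_seal_leak = 2e-4
p_sensor_fail = 5e-4
p_operator_miss = 1e-2

# Top event: a leak occurs AND both detection paths fail.
p_leak = or_gate(p_valve_stuck, p_seal_leak)
p_detection_lost = and_gate(p_sensor_fail, p_operator_miss)
p_top = and_gate(p_leak, p_detection_lost)
print(f"top event probability ≈ {p_top:.2e}")
```

FMEA supplies the failure modes that become basic events, and ETA then propagates the top event through accident progression sequences.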
ERIC Educational Resources Information Center
Brosnan, Julie; Moeyaert, Mariola; Brooks Newsome, Kendra; Healy, Olive; Heyvaert, Mieke; Onghena, Patrick; Van den Noortgate, Wim
2018-01-01
In this article, multiple-baseline across-participants designs were used to evaluate the impact of a precision teaching (PT) program, within a Tier 2 Response to Intervention framework, targeting fluency in foundational reading skills with at-risk kindergarten readers. Thirteen multiple-baseline design experiments that included participation from…
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how best to integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
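An MCDM trade study of the kind described can be sketched as a weighted-sum comparison of the two architectures on performance, risk, and cost. The criteria weights and scores below are hypothetical, not the dissertation's evaluation:

```python
import numpy as np

# Weighted-sum MCDM: score each architecture on normalized criteria.
# Weights and scores are invented for illustration.
criteria = ["mission performance", "resilience risk", "cost"]
weights = np.array([0.5, 0.3, 0.2])           # stakeholder priorities, sum to 1
# Scores normalized to [0, 1], higher is better (risk and cost inverted).
scores = {
    "distributed UAV swarm": np.array([0.8, 0.9, 0.4]),
    "monolithic UAV":        np.array([0.7, 0.5, 0.9]),
}
for name, s in scores.items():
    print(f"{name}: {float(weights @ s):.2f}")
```

Varying the weights (a sensitivity sweep) is the usual way to check whether the winning architecture is robust to how much stakeholders value resilience over cost.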
Optical and system engineering in the development of a high-quality student telescope kit
NASA Astrophysics Data System (ADS)
Pompea, Stephen M.; Pfisterer, Richard N.; Ellis, Scott; Arion, Douglas N.; Fienberg, Richard Tresch; Smith, Thomas C.
2010-07-01
The Galileoscope student telescope kit was developed by a volunteer team of astronomers, science education experts, and optical engineers in conjunction with the International Year of Astronomy 2009. Over 180,000 units of this refracting telescope have been produced and distributed, with another 25,000 units in production. The telescope was designed to be able to resolve the rings of Saturn and to be used in urban areas. The telescope system requirements, performance metrics, and architecture were established after an analysis of current inexpensive telescopes and student telescope kits. The optical design approaches used in the various prototypes and the optical system engineering tradeoffs will be described. Risk analysis, risk management, and change management were critical, as was cost management, since the final product was to cost around $15 (but had to perform as well as $100 telescopes). In the system engineering of the Galileoscope a variety of analysis and testing approaches were used, including stray light design and analysis using the optical analysis program FRED.
A theoretical treatment of technical risk in modern propulsion system design
NASA Astrophysics Data System (ADS)
Roth, Bryce Alexander
2000-09-01
A prevalent trend in modern aerospace systems is increasing complexity and cost, which in turn drives increased risk. Consequently, there is a clear and present need for the development of formalized methods to analyze the impact of risk on the design of aerospace vehicles. The objective of this work is to develop such a method that enables analysis of risk via a consistent, comprehensive treatment of aerothermodynamic and mass properties aspects of vehicle design. The key elements enabling the creation of this methodology are recent developments in the analytical estimation of work potential based on the second law of thermodynamics. This dissertation develops the theoretical foundation of a vehicle analysis method based on work potential and validates it using the Northrop F-5E with GE J85-GE-21 engines as a case study. Although the method is broadly applicable, emphasis is given to aircraft propulsion applications. Three work potential figures of merit are applied using this method: exergy, available energy, and thrust work potential. It is shown that each possesses unique properties making them useful for specific vehicle analysis tasks, though the latter two are actually special cases of exergy. All three are demonstrated on the analysis of the J85-GE-21 propulsion system, resulting in a comprehensive description of propulsion system thermodynamic loss. This "loss management" method is used to analyze aerodynamic drag loss of the F-5E and is then used in conjunction with the propulsive loss model to analyze the usage of fuel work potential throughout the F-5E design mission. The results clearly show how and where work potential is used during flight and yield considerable insight as to where the greatest opportunity for design improvement is. Next, usage of work potential is translated into fuel weight so that the aerothermodynamic performance of the F-5E can be expressed entirely in terms of vehicle gross weight. 
This technique is then applied as a means to quantify the impact of engine cycle technologies on the F-5E airframe. Finally, loss management methods are used in conjunction with probabilistic analysis methods to quantify the impact of risk on F-5E aerothermodynamic performance.
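The work-potential analysis above rests on the second-law notion of specific flow exergy. A minimal sketch of that quantity for an ideal-gas stream follows; the constant-cp air properties and the example state are illustrative, not values from the J85-GE-21 analysis.

```python
import math

# Hedged sketch: specific flow exergy of an ideal gas stream relative to a
# dead state (T0, P0), psi = (h - h0) - T0*(s - s0), kinetic and potential
# terms ignored. Constant-cp air assumed; the example state is invented.

CP = 1005.0   # J/(kg*K), specific heat of air at constant pressure
R  = 287.0    # J/(kg*K), gas constant for air
T0, P0 = 288.15, 101325.0  # dead (ambient reference) state

def flow_exergy(T, p):
    """Maximum work recoverable per kg of flow brought to the dead state."""
    dh = CP * (T - T0)
    ds = CP * math.log(T / T0) - R * math.log(p / P0)
    return dh - T0 * ds

# Hypothetical hot exhaust state, slightly above ambient pressure.
psi = flow_exergy(T=900.0, p=1.2 * P0)
print(f"specific flow exergy ≈ {psi / 1000:.1f} kJ/kg")
```

At the dead state the exergy is zero by construction; any departure in temperature or pressure yields positive work potential, which is what makes exergy a natural bookkeeping currency for "loss management."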
Li, Rongxia; Stewart, Brock; Weintraub, Eric
2016-01-01
The self-controlled case series (SCCS) and self-controlled risk interval (SCRI) designs have recently become widely used in the field of post-licensure vaccine safety monitoring to detect potential elevated risks of adverse events following vaccinations. The SCRI design can be viewed as a subset of the SCCS method in that a reduced comparison time window is used for the analysis. Compared to the SCCS method, the SCRI design has less statistical power due to fewer events occurring in the shorter control interval. In this study, we derived the asymptotic relative efficiency (ARE) between these two methods to quantify this loss in power in the SCRI design. The equation is formulated as [Formula: see text] (a: control window-length ratio between SCRI and SCCS designs; b: ratio of risk window length and control window length in the SCCS design; and [Formula: see text]: relative risk of exposed window to control window). According to this equation, the relative efficiency declines as the ratio of control-period length between SCRI and SCCS methods decreases, or with an increase in the relative risk [Formula: see text]. We provide an example utilizing data from the Vaccine Safety Datalink (VSD) to study the potential elevated risk of febrile seizure following seasonal influenza vaccine in the 2010-2011 season.
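The power loss the abstract quantifies can be demonstrated by simulation. Under the SCCS framework, conditional on the total number of events, the count in the risk window is binomial with p = RR·r/(RR·r + c); shortening the control window c (as SCRI does) reduces the information per event. The rates and window lengths below are invented, and this simple conditional binomial test is a simplification of the conditional Poisson regression these designs actually use.

```python
import math
import random

random.seed(42)

def poisson(lam):
    """Knuth's Poisson sampler (adequate for moderate lam)."""
    limit, k, prod = math.exp(-lam), 0, random.random()
    while prod > limit:
        k += 1
        prod *= random.random()
    return k

def binom_sf(x, n, p):
    """P(X >= x) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x, n + 1))

def sim_power(risk_len, ctrl_len, rr, rate, subjects=200, reps=2000, alpha=0.05):
    """Rejection rate of a one-sided exact binomial test of event clustering
    in the risk window, pooling events across subjects."""
    p_alt = rr * risk_len / (rr * risk_len + ctrl_len)   # true window probability
    p_null = risk_len / (risk_len + ctrl_len)            # null (RR = 1)
    mean_total = subjects * rate * (rr * risk_len + ctrl_len)
    hits = 0
    for _ in range(reps):
        n = poisson(mean_total)
        if n == 0:
            continue
        x = sum(random.random() < p_alt for _ in range(n))
        hits += binom_sf(x, n, p_null) < alpha
    return hits / reps

# 14-day risk window, true RR = 3; SCCS keeps a 300-day control window,
# SCRI only a 28-day control interval.
sccs_power = sim_power(14, 300, rr=3.0, rate=0.001)
scri_power = sim_power(14, 28, rr=3.0, rate=0.001)
print(f"SCCS power ≈ {sccs_power:.2f}, SCRI power ≈ {scri_power:.2f}")
```

With everything else held fixed, the shorter SCRI control interval yields fewer control-window events and visibly lower power, which is the behavior the derived ARE formula captures analytically.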
Gopalakrishnan, Chennat; Okada, Norio
2007-12-01
The goal of integrated disaster risk management is to promote an overall improvement in the quality of safety and security in a region, city or community at disaster risk. This paper presents the case for a thorough overhaul of the institutional component of integrated disaster risk management. A review of disaster management institutions in the United States indicates significant weaknesses in their ability to contribute effectively to the implementation of integrated disaster risk management. Our analysis and findings identify eight key elements for the design of dynamic new disaster management institutions. Six specific approaches are suggested for incorporating the identified key elements in building new institutions that would have significant potential for enhancing the effective implementation of integrated disaster risk management. We have developed a possible blueprint for effective design and construction of efficient, sustainable and functional disaster management institutions.
Redesign of Transjakarta Bus Driver's Cabin
NASA Astrophysics Data System (ADS)
Mardi Safitri, Dian; Azmi, Nora; Singh, Gurbinder; Astuti, Pudji
2016-02-01
Ergonomic risk at workstations of the seated-work-control type is one of the problems faced by Transjakarta bus drivers. The "Trisakti" bus, one of the types operated by Transjakarta in corridor 9 serving the Pinang Ranti - Pluit route, has drawn many complaints from drivers. Nordic Body Map questionnaires given to 30 drivers showed that drivers feel pain in the neck, arms, hips, and buttocks, allegedly because the seat position and the buttons/panels lie at a considerable reach distance (1 meter) from the driver. In addition, preliminary results of a Workstation Checklist questionnaire identified complaints about an uncomfortable cushion, the driver's seat backrest, and the position of the air conditioner directly above the driver's head. To reduce the level of ergonomic risk, this research redesigned the cabin using a generic product-design approach. Driver posture before the redesign was assessed using Rapid Upper Limb Assessment (RULA), Rapid Entire Body Assessment (REBA), and the Quick Exposure Checklist (QEC), while body moments were calculated using the Mannequin Pro V10.2 software. The generic product design then proceeded through the stages of needs-metrics matrix, house of quality, anthropometric data collection, concept classification tree, concept screening, concept scoring, and two-dimensional product design and drafting. After the redesign, driver posture risk was reassessed using RULA, REBA, and body-moment calculations, and the design was visualized using 3DMax software. Before the cabin design improvements, the analysis yielded a RULA score of 6, a REBA score of 9, a QEC result of 57.38%, and moment forces of 247.3 lbf·in on the back and 72.9 lbf·in on the right hip.
For the proposed cabin design, the RULA score was 3, the REBA score was 4, and the moment forces were 90.3 lbf·in on the back and 70.6 lbf·in on the right hip. This indicates that the improved cabin design can reduce ergonomic risk, with lower scores on several parts of the body.
Sexual Victimization and Health-Risk Behaviors: A Prospective Analysis of College Women
ERIC Educational Resources Information Center
Gidycz, Christine A.; Orchowski, Lindsay M.; King, Carrie R.; Rich, Cindy L.
2008-01-01
The present study utilizes the National College Health Risk Behavior Survey to examine the relationship between health-risk behaviors and sexual victimization among a sample of college women. A prospective design is utilized to examine the relationship between health-risk behaviors as measured at baseline and sexual victimization during a 3-month…
NASA Astrophysics Data System (ADS)
Wang, Y.; Chang, J.; Guo, A.
2017-12-01
Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on flood control systems. To address this limitation, a univariate and copula-based bivariate hydrological risk framework focusing on flood control and sediment transport is proposed in the current work. Additionally, the conditional probabilities of occurrence of different flood events under various extreme precipitation scenarios are estimated by exploiting the copula model. Moreover, a Monte Carlo-based algorithm is used to evaluate the uncertainties of univariate and bivariate hydrological risk. Two catchments located on the Loess Plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted as UCX and UCH, respectively). The results indicate that (1) 2-day and 3-day consecutive rainfall are highly correlated with the annual maximum flood discharge (AMF) in UCX and UCH, respectively; and (2) univariate and bivariate return periods, risk, and reliability for the purposes of flood control and sediment transport are successfully estimated. In the UCX and UCH, sedimentation poses a higher risk to the safety of local flood control systems than does the AMF exceeding the design flood of downstream hydraulic structures. Most importantly, there was considerable sampling uncertainty in the univariate and bivariate hydrologic risk analysis, which poses a great challenge for future flood mitigation measures. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.
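The bivariate return periods the abstract mentions can be sketched with a Gumbel-Hougaard copula, a common choice for hydrological extremes. The marginal quantiles and dependence parameter theta below are illustrative, not values fitted to the Loess Plateau data.

```python
import math

# Hedged sketch: joint return periods of a (flood, rainfall) pair via a
# Gumbel-Hougaard copula. u, v are marginal non-exceedance probabilities.

def gumbel_copula(u, v, theta):
    """C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1."""
    return math.exp(-(((-math.log(u))**theta
                       + (-math.log(v))**theta)**(1.0 / theta)))

def or_return_period(u, v, theta):
    """Return period of {X > x OR Y > y}: T = 1 / (1 - C(u, v))."""
    return 1.0 / (1.0 - gumbel_copula(u, v, theta))

def and_return_period(u, v, theta):
    """Return period of {X > x AND Y > y}: T = 1 / (1 - u - v + C(u, v))."""
    return 1.0 / (1.0 - u - v + gumbel_copula(u, v, theta))

# Hypothetical design quantiles: both margins at their 50-year level,
# moderate dependence (theta = 2.5).
u = v = 1.0 - 1.0 / 50.0
theta = 2.5
print("OR  return period:", round(or_return_period(u, v, theta), 1))
print("AND return period:", round(and_return_period(u, v, theta), 1))
```

Note the standard ordering: the "OR" joint return period is shorter than either marginal 50-year period, while the "AND" period is longer, so which one governs depends on whether flood control fails when either or both variables are extreme.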
NASA Technical Reports Server (NTRS)
1975-01-01
A system analysis of the shuttle orbiter baseline system management (SM) computer function is performed. This analysis results in an alternative SM design which is also described. The alternative design exhibits several improvements over the baseline, some of which are increased crew usability, improved flexibility, and improved growth potential. The analysis consists of two parts: an application assessment and an implementation assessment. The former is concerned with the SM user needs and design functional aspects. The latter is concerned with design flexibility, reliability, growth potential, and technical risk. The system analysis is supported by several topical investigations. These include: treatment of false alarms, treatment of off-line items, significant interface parameters, and a design evaluation checklist. An in-depth formulation of techniques, concepts, and guidelines for design of automated performance verification is discussed.
Managing the Risks Associated with End-User Computing.
ERIC Educational Resources Information Center
Alavi, Maryam; Weiss, Ira R.
1986-01-01
Identifies organizational risks of end-user computing (EUC) associated with different stages of the end-user applications life cycle (analysis, design, implementation). Generic controls are identified that address each of the risks enumerated in a manner that allows EUC management to select those most appropriate to their EUC environment. (5…
How Engineers Really Think About Risk: A Study of JPL Engineers
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Chattopadhyay, Deb; Valerdi, Ricardo
2011-01-01
The objective of this work is to improve risk assessment practices as used during the mission design process by JPL's concurrent engineering teams by: (1) developing effective ways to identify and assess mission risks; (2) providing a process for more effective dialog between stakeholders about the existence and severity of mission risks; and (3) enabling the analysis of interactions of risks across concurrent engineering roles.
ERIC Educational Resources Information Center
Duffy, Larry B.; And Others
The Educational Technology Assessment Model (ETAM) is a set of comprehensive procedures and variables for the analysis, synthesis, and decision making, in regard to the benefits, costs, and risks associated with introducing technical innovations in education and training. This final report summarizes the analysis, design, and development…
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
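One of the loss functions the methodology cites, the inverted normal form, can be sketched directly: loss is zero at the process target and saturates at a maximum economic loss as the deviation grows. The target, shape parameter, and maximum loss below are invented for illustration.

```python
import math

# Hedged sketch of an inverted-normal loss function: economic loss rises from
# zero at the process target toward max_loss as deviation grows. All values
# (target, gamma, max_loss) are hypothetical, not from the paper's case studies.

def inverted_normal_loss(x, target, gamma, max_loss):
    """L(x) = C * (1 - exp(-(x - target)^2 / (2*gamma^2)))."""
    return max_loss * (1.0 - math.exp(-((x - target)**2) / (2.0 * gamma**2)))

# Example: a process temperature with target 350 K, shape gamma = 10 K, and a
# worst-case economic loss of $500k when far off target.
for t in (350.0, 360.0, 390.0):
    print(t, round(inverted_normal_loss(t, 350.0, 10.0, 500_000.0), 1))
```

Mapping each monitored process deviation through such a function, then summing, is the "integration of losses" step: small deviations contribute little, while large ones approach the full consequence cost.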
Xie, Zhiyi; Hu, Xin; Zan, Xin; Lin, Sen; Li, Hao; You, Chao
2017-10-01
Hydrocephalus is a well-recognized complication after aneurysmal subarachnoid hemorrhage (aSAH). This study aimed to identify predictors of shunt-dependent hydrocephalus (SDHC) after aSAH via a systematic review and meta-analysis. A systematic search was conducted using the Embase, MEDLINE, and Web of Science databases for studies pertaining to aSAH and SDHC. Risk factors were assessed by meta-analysis when they were reported by at least 2 studies. The results were presented as odds ratios or risk ratios according to the study design, with the corresponding 95% confidence intervals (CI). Twenty-five studies were included. In the primary analysis of 14 potential risk factors, 12 were identified as predictors of SDHC after aSAH, including age ≥50 years, female gender, high Hunt-Hess grade, Glasgow Coma Scale ≤8, Fisher grade ≥3, acute hydrocephalus, external ventricular drainage insertion, intraventricular hemorrhage, posterior circulation aneurysm, anterior communicating artery aneurysm, meningitis, and rebleeding. The meta-analysis based on cohort studies found a significantly increased risk for SDHC in patients with aSAH treated by coiling (risk ratio, 1.16; 95% CI, 1.05-1.29), while the meta-analysis based on case-control studies failed to replicate this finding (odds ratio, 1.27; 95% CI, 0.95-1.71). Several new predictors of SDHC after aSAH were identified that may assist with the early recognition and prevention of SDHC. The controversial evidence found in this study was insufficient to support the potential of neurosurgical clipping for reducing the risk of shunt dependency. Further well-designed studies are warranted to explore the effect of treatment modality on SDHC risk. Copyright © 2017 Elsevier Inc. All rights reserved.
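The pooled ratios reported above come from inverse-variance weighting of log effect sizes, the standard fixed-effect machinery. The three (OR, CI) inputs below are invented for illustration, not the study's data.

```python
import math

# Hedged sketch: fixed-effect inverse-variance pooling of odds ratios.
# Each study contributes log(OR) weighted by 1/SE^2, with SE backed out
# of its reported 95% CI. Inputs are hypothetical.

def pool_fixed(effects):
    """effects: list of (or_, ci_low, ci_high); returns pooled OR and 95% CI."""
    z = 1.959964
    num = den = 0.0
    for or_, lo, hi in effects:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * z)   # SE recovered from the CI
        w = 1.0 / se**2                                # inverse-variance weight
        num += w * log_or
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - z * se_pooled),
            math.exp(pooled + z * se_pooled))

studies = [(1.4, 1.1, 1.8), (1.2, 0.9, 1.6), (1.6, 1.2, 2.1)]
or_, lo, hi = pool_fixed(studies)
print(f"pooled OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A random-effects analysis would additionally widen the weights by a between-study variance term, which matters when, as here, cohort and case-control designs disagree.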
Derailment-based Fault Tree Analysis on Risk Management of Railway Turnout Systems
NASA Astrophysics Data System (ADS)
Dindar, Serdar; Kaewunruen, Sakdirat; An, Min; Gigante-Barrera, Ángel
2017-10-01
Railway turnouts are fundamental mechanical infrastructure that allows rolling stock to divert from one direction to another. Because turnouts comprise a large number of engineering subsystems, e.g. track, signalling, and earthworks, these subsystems can fail through various kinds of failure mechanisms, any of which could contribute to a catastrophic event. A derailment, one of the undesirable events in railway operation, often results, albeit rarely, in damage to rolling stock and railway infrastructure and in disrupted service, and it has the potential to cause casualties and even loss of life. It is therefore quite significant that a well-designed risk analysis is performed to create awareness of hazards and to identify which parts of the system may be at risk. This study focuses on all types of environment-based failures resulting from the numerous contributing factors noted in official accident reports. The risk analysis is designed to help industry minimise the occurrence of accidents at railway turnouts. The methodology of the study relies on accurate assessment of derailment likelihood and is based on a statistical, multiple-factor-integrated accident rate analysis. The study establishes product risks and faults and shows the combination of potential failure processes using Boolean algebra.
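The Boolean-algebra evaluation of a fault tree can be sketched in a few lines. The tree structure and basic-event probabilities below are a toy example, not the paper's turnout model, and independence of basic events is assumed.

```python
# Hedged sketch: a toy environment-driven derailment fault tree evaluated
# with Boolean algebra over independent basic events. All probabilities
# are hypothetical annual values.

def p_or(*ps):
    """P(A or B or ...) for independent events: 1 - prod(1 - p_i)."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):
    """P(A and B and ...) for independent events: prod(p_i)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# Basic events (hypothetical annual probabilities):
ice_on_switch   = 1e-3
heater_failure  = 5e-2
flooded_ballast = 2e-4
track_misalign  = 1e-4

# Intermediate event: a frozen turnout needs ice AND a failed point heater.
frozen_turnout = p_and(ice_on_switch, heater_failure)

# Top event: environment-based derailment at the turnout (OR gate).
derailment = p_or(frozen_turnout, flooded_ballast, track_misalign)
print(f"top-event probability ≈ {derailment:.2e}")
```

Accident-report statistics feed the basic-event probabilities; the gates then propagate them to the top event, which is what makes the tree useful for ranking which contributing factors dominate derailment risk.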
NASA Astrophysics Data System (ADS)
Magilligan, F. J.; Goldstien, P.
2011-12-01
River restoration projects with the goal of restoring a wide range of morphologic and ecologic channel processes and functions have become common. The complex interactions between flow and sediment-transport make it challenging to design river channels that are both self-sustaining and improve ecosystem function. The relative immaturity of the field of river restoration and shortcomings in existing methodologies for evaluating channel designs contribute to this problem, often leading to project failures. The call for increased monitoring of constructed channels to evaluate which restoration techniques do and do not work is ubiquitous and may lead to improved channel restoration projects. However, an alternative approach is to detect project flaws before the channels are built by using numerical models to simulate hydraulic and sediment-transport processes and habitat in the proposed channel (Restoration Design Analysis). Multi-dimensional models provide spatially distributed quantities throughout the project domain that may be used to quantitatively evaluate restoration designs for such important metrics as (1) the change in water-surface elevation which can affect the extent and duration of floodplain reconnection, (2) sediment-transport and morphologic change which can affect the channel stability and long-term maintenance of the design; and (3) habitat changes. These models also provide an efficient way to evaluate such quantities over a range of appropriate discharges including low-probability events which often prove the greatest risk to the long-term stability of restored channels. Currently there are many free and open-source modeling frameworks available for such analysis including iRIC, Delft3D, and TELEMAC. In this presentation we give examples of Restoration Design Analysis for each of the metrics above from projects on the Russian River, CA and the Kootenai River, ID. 
These examples demonstrate how detailed Restoration Design Analysis can be used to guide design elements and how this method can point out potential stability problems or other risks before designs proceed to the construction phase.
Assessing and Mitigating Hurricane Storm Surge Risk in a Changing Environment
NASA Astrophysics Data System (ADS)
Lin, N.; Shullman, E.; Xian, S.; Feng, K.
2017-12-01
Hurricanes have induced devastating storm surge flooding worldwide. The impacts of these storms may worsen in the coming decades because of rapid coastal development coupled with sea-level rise and possibly increasing storm activity due to climate change. Major advances in coastal flood risk management are urgently needed. We present an integrated dynamic risk analysis for flooding task (iDraft) framework to assess and manage coastal flood risk at the city or regional scale, considering integrated dynamic effects of storm climatology change, sea-level rise, and coastal development. We apply the framework to New York City. First, we combine climate-model projected storm surge climatology and sea-level rise with engineering- and social/economic-model projected coastal exposure and vulnerability to estimate the flood damage risk for the city over the 21st century. We derive temporally-varying risk measures such as the annual expected damage as well as temporally-integrated measures such as the present value of future losses. We also examine the individual and joint contributions to the changing risk of the three dynamic factors (i.e., sea-level rise, storm change, and coastal development). Then, we perform probabilistic cost-benefit analysis for various coastal flood risk mitigation strategies for the city. Specifically, we evaluate previously proposed mitigation measures, including elevating houses on the floodplain and constructing flood barriers at the coast, by comparing their estimated cost and probability distribution of the benefit (i.e., present value of avoided future losses). We also propose new design strategies, including optimal design (e.g., optimal house elevation) and adaptive design (e.g., flood protection levels that are designed to be modified over time in a dynamic and uncertain environment).
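The risk measures named above, annual expected damage and the present value of future losses, reduce to an integral over a loss-exceedance curve plus discounting. The damage curve, growth rate, and discount rate below are invented, not the iDraft/New York City numbers.

```python
# Hedged sketch: expected annual damage (EAD) from a loss-exceedance curve,
# and the present value of future expected losses over a planning horizon.
# Curve and rates are hypothetical.

def expected_annual_damage(curve):
    """Trapezoidal integration of damage over annual exceedance probability.
    curve: list of (exceedance_prob, damage) sorted by decreasing probability."""
    ead = 0.0
    for (p1, d1), (p2, d2) in zip(curve, curve[1:]):
        ead += (p1 - p2) * (d1 + d2) / 2.0
    return ead

def present_value(ead0, years, growth, discount):
    """PV of future expected losses, with EAD growing (sea-level rise, coastal
    development) at `growth` per year and discounted at `discount`."""
    return sum(ead0 * (1 + growth)**t / (1 + discount)**t
               for t in range(1, years + 1))

# Hypothetical surge-damage curve: (annual exceedance probability, $M damage).
curve = [(0.1, 0.0), (0.02, 200.0), (0.01, 800.0), (0.002, 3000.0)]
ead = expected_annual_damage(curve)
pv = present_value(ead, years=80, growth=0.03, discount=0.05)
print(f"EAD ≈ ${ead:.1f}M; PV over 80 yr ≈ ${pv:.0f}M")
```

Cost-benefit analysis of a mitigation measure then compares its cost against the PV of avoided losses, i.e., the difference between this integral computed with and without the measure in place.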
The importance of operations, risk, and cost assessment to space transfer systems design
NASA Technical Reports Server (NTRS)
Ball, J. M.; Komerska, R. J.; Rowell, L. F.
1992-01-01
This paper examines several methodologies which contribute to comprehensive subsystem cost estimation. The example of a space-based lunar space transfer vehicle (STV) design is used to illustrate how including both primary and secondary factors into cost affects the decision of whether to use aerobraking or propulsion for earth orbit capture upon lunar return. The expected dominant cost factor in this decision is earth-to-orbit launch cost driven by STV mass. However, to quantify other significant cost factors, this cost comparison included a risk analysis to identify development and testing costs, a Taguchi design of experiments to determine a minimum mass aerobrake design, and a detailed operations analysis. As a result, the predicted cost advantage of aerobraking, while still positive, was subsequently reduced by about 30 percent compared to the simpler mass-based cost estimates.
Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks
NASA Technical Reports Server (NTRS)
Brown, Richard Lee
2008-01-01
Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce costs, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem?
Users of the RoSA methodology have characterized it as a true bottom-up risk assessment.
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
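The failure-rate estimation the second section describes can be sketched with the simplest reliability model: a constant failure rate estimated from test data, giving R(t) = exp(-λt). The failure times below are invented, and a real reliability tool would fit richer growth models than this.

```python
import math

# Hedged sketch: constant-failure-rate reliability estimate. Lambda is the
# MLE n / T, with n failures observed over T total test hours (a
# simplification of the growth models a reliability analysis tool would fit).

failure_times_hr = [12.0, 30.0, 55.0, 81.0, 120.0]  # cumulative test hours at each failure

n = len(failure_times_hr)
total_time = failure_times_hr[-1]   # total observed test time
lam = n / total_time                # failures per hour

def reliability(t_hr):
    """Probability of failure-free operation over a mission of t hours."""
    return math.exp(-lam * t_hr)

print(f"estimated failure rate: {lam:.4f}/hr")
print(f"R(10 hr) = {reliability(10.0):.3f}")
```

Numbers like R(10 hr) are exactly the quantitative inputs to the design tradeoffs the handbook mentions between reliability, cost, performance, and schedule.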
Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) Independent Analysis
NASA Technical Reports Server (NTRS)
Davis, Mitchell L.; Aguilar, Michael L.; Mora, Victor D.; Regenie, Victoria A.; Ritz, William F.
2009-01-01
Two approaches were compared to the Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) approach: the Flat-Sat and the Shuttle Avionics Integration Laboratory (SAIL). The Flat-Sat and CAIL/SAIL approaches are two different tools designed to mitigate different risks. The Flat-Sat approach is designed to develop a mission concept into a flight avionics system and associated ground controller. The SAIL approach is designed to aid in the flight readiness verification of the flight avionics system. The approaches are complementary, addressing both system development risks and mission verification risks. The following NESC team findings were identified: (1) the CAIL assumption is that the flight subsystems will be matured for system-level verification; (2) the Flat-Sat and SAIL approaches are two different tools designed to mitigate different risks. The following NESC team recommendation was provided: define, document, and manage a detailed interface between the design and development laboratories (EDL and other integration labs) and the verification laboratory (CAIL).
Azoulay, Laurent; Suissa, Samy
2017-05-01
Recent randomized trials have compared the newer antidiabetic agents to treatments involving sulfonylureas, drugs that have been associated with increased cardiovascular risk and mortality in some, but not all, observational studies. We reviewed the methodology of these observational studies by searching MEDLINE from inception to December 2015 for all studies of the association between sulfonylureas and cardiovascular events or mortality. Each study was appraised with respect to the comparator, the outcome, and study design-related sources of bias. A meta-regression analysis was used to evaluate heterogeneity. A total of 19 studies were identified, of which six had no major design-related biases. Sulfonylureas were associated with an increased risk of cardiovascular events and mortality in five of these studies (relative risks 1.16-1.55). Overall, the 19 studies resulted in 36 relative risks, as some studies assessed multiple outcomes or comparators. Of the 36 analyses, metformin was the comparator in 27 (75%) and death was the outcome in 24 (67%). The relative risk was higher by 13% when the comparator was metformin, by 20% when death was the outcome, and by 7% when the studies had design-related biases. The lowest predicted relative risk was for studies with no major bias, a comparator other than metformin, and a cardiovascular outcome (1.06 [95% CI 0.92-1.23]), whereas the highest was for studies with bias, a metformin comparator, and a mortality outcome (1.53 [95% CI 1.43-1.65]). In summary, sulfonylureas were associated with an increased risk of cardiovascular events and mortality in the majority of studies with no major design-related biases. Among studies with important biases, the association varied significantly with respect to the comparator, the outcome, and the type of bias. With the introduction of new antidiabetic drugs, the use of appropriate design and analytical tools will provide a more accurate assessment of their cardiovascular safety in the real-world setting.
© 2017 by the American Diabetes Association.
Ermolieva, T; Filatova, T; Ermoliev, Y; Obersteiner, M; de Bruijn, K M; Jeuken, A
2017-01-01
As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches to calculating insurance premiums for rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by a developed integrated catastrophe risk management (ICRM) model consisting of a GIS-based flood model and a stochastic optimization procedure with respect to location-specific risk exposures. To achieve the stability and robustness of the program toward floods with various recurrences, the ICRM uses a stochastic optimization procedure that relies on quantile-related risk functions of a systemic insolvency involving overpayments and underpayments of the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures. © 2016 Society for Risk Analysis.
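The contrast between average-annual-loss premiums and quantile-based ones can be sketched with a toy simulation. The loss model below (a flood-frequency/severity mixture) is invented and far simpler than the paper's GIS-based flood model; VaR/CVaR stand in for its quantile-related risk functions.

```python
import random

random.seed(7)

# Hedged sketch: annual flood losses from a toy frequency/severity model,
# summarized by the mean (the traditional premium basis), a 99% quantile
# (VaR), and the tail average beyond it (CVaR). All parameters hypothetical.

def simulate_annual_loss():
    """Most years no flood; rare events draw a heavy-ish loss (arbitrary units)."""
    if random.random() < 0.04:                # 4% chance of a damaging flood
        return random.expovariate(1 / 50.0)   # mean event loss of 50 units
    return 0.0

losses = sorted(simulate_annual_loss() for _ in range(100_000))

mean_loss = sum(losses) / len(losses)
cut = int(0.99 * len(losses))
var_99 = losses[cut]                          # 99th-percentile annual loss
tail = losses[cut:]
cvar_99 = sum(tail) / len(tail)               # average loss beyond the quantile

print(f"average annual loss {mean_loss:.2f}, VaR99 {var_99:.2f}, CVaR99 {cvar_99:.2f}")
```

The gap between the mean and the tail measures is the crux of the article: a premium funding only the average year leaves the program insolvent in exactly the rare scenarios that matter, whereas a quantile-constrained premium prices the tail.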
Cui, Lingling; Liu, Xinxin; Tian, Yalan; Xie, Chen; Li, Qianwen; Cui, Han; Sun, Changqing
2016-06-08
Flavonoids have been suggested to play a chemopreventive role in carcinogenesis. However, the epidemiologic studies assessing dietary intake of flavonoids and esophageal cancer risk have yielded inconsistent results. This study was designed to examine the association between flavonoids, each flavonoid subclass, and the risk of esophageal cancer with a meta-analysis approach. We searched for all relevant studies with a prospective cohort or case-control study design published from January 1990 to April 2016, using PUBMED, EMBASE, and Web of Science. Pooled odds ratios (ORs) were calculated using fixed or random-effect models. In total, seven articles including 2629 cases and 481,193 non-cases were selected for the meta-analysis. Comparing the highest-intake patients with the lowest-intake patients for total flavonoids and for each flavonoid subclass, we found that anthocyanidins (OR = 0.60, 95% CI: 0.49-0.74), flavanones (OR = 0.65, 95% CI: 0.49-0.86), and flavones (OR = 0.78, 95% CI 0.64-0.95) were inversely associated with the risk of esophageal cancer. However, total flavonoids showed marginal association with esophageal cancer risk (OR = 0.78, 95% CI: 0.59-1.04). In conclusion, our study suggested that dietary intake of total flavonoids, anthocyanidins, flavanones, and flavones might reduce the risk of esophageal cancer.
Trade Studies of Space Launch Architectures using Modular Probabilistic Risk Analysis
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Go, Susie
2006-01-01
A top-down risk assessment in the early phases of space exploration architecture development can provide understanding and intuition of the potential risks associated with new designs and technologies. In this approach, risk analysts draw from their past experience and the heritage of similar existing systems as a source for reliability data. This top-down approach captures the complex interactions of the risk driving parts of the integrated system without requiring detailed knowledge of the parts themselves, which is often unavailable in the early design stages. Traditional probabilistic risk analysis (PRA) technologies, however, suffer several drawbacks that limit their timely application to complex technology development programs. The most restrictive of these is a dependence on static planning scenarios, expressed through fault and event trees. Fault trees incorporating comprehensive mission scenarios are routinely constructed for complex space systems, and several commercial software products are available for evaluating fault statistics. These static representations cannot capture the dynamic behavior of system failures without substantial modification of the initial tree. Consequently, the development of dynamic models using fault tree analysis has been an active area of research in recent years. This paper discusses the implementation and demonstration of dynamic, modular scenario modeling for integration of subsystem fault evaluation modules using the Space Architecture Failure Evaluation (SAFE) tool. SAFE is a C++ code that was originally developed to support NASA's Space Launch Initiative. It provides a flexible framework for system architecture definition and trade studies. SAFE supports extensible modeling of dynamic, time-dependent risk drivers of the system and functions at the level of fidelity for which design and failure data exist. The approach is scalable, allowing inclusion of additional information as detailed data becomes available.
The tool performs a Monte Carlo analysis to provide statistical estimates. Example results of an architecture system reliability study are summarized for an exploration system concept using heritage data from liquid-fueled expendable Saturn V/Apollo launch vehicles.
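The Monte Carlo step can be sketched as follows. The subsystem names, Weibull failure parameters, and mission time below are invented for illustration; they are not SAFE's models or Saturn V/Apollo heritage data:

```python
import random

random.seed(42)

# A series system of stages, each with a hypothetical Weibull
# time-to-failure, "flown" for a fixed mission time.
STAGES = {
    "stage1_engine": (2.0, 5000.0),   # (shape, scale) in seconds, illustrative
    "stage2_engine": (2.0, 8000.0),
    "avionics":      (1.0, 50000.0),
}
MISSION_TIME = 600.0  # seconds of powered flight, hypothetical

def mission_succeeds():
    # The mission fails if any subsystem's sampled failure time falls
    # inside the mission window (series logic). Note random.weibullvariate
    # takes (scale, shape) in that order.
    return all(random.weibullvariate(scale, shape) > MISSION_TIME
               for shape, scale in STAGES.values())

N = 100_000
failures = sum(not mission_succeeds() for _ in range(N))
lom = failures / N
print(f"estimated loss-of-mission probability: {lom:.4f}")
```

Dynamic behavior of the kind SAFE models (e.g., failure rates that change as propellant depletes) would replace the fixed Weibull draw with time-dependent sampling inside the mission window.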
System Risk Assessment and Allocation in Conceptual Design
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)
2003-01-01
As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools that synthesize multidisciplinary integration, probabilistic analysis, and optimization are needed to facilitate design decisions allowing trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high-fidelity analysis of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.
Analysis of Risk Management in Adapted Physical Education Textbooks
ERIC Educational Resources Information Center
Murphy, Kelle L.; Donovan, Jacqueline B.; Berg, Dominck A.
2016-01-01
Physical education teacher education (PETE) programs vary on how the topics of safe teaching and risk management are addressed. Common practices to cover such issues include requiring textbooks, lesson planning, peer teaching, videotaping, reflecting, and reading case law analyses. We used a mixed methods design to examine how risk management is…
Multi-modal vehicle display design and analysis
DOT National Transportation Integrated Search
2004-10-01
There is increasing interest in systematically studying the risks encountered while driving. In some cases, the focus is on potential risks such as those associated with the use of in-car devices (e.g., mobile phones, radio, navigational displays...
Behavioral economics and regulatory analysis.
Robinson, Lisa A; Hammitt, James K
2011-09-01
Behavioral economics has captured the interest of scholars and the general public by demonstrating ways in which individuals make decisions that appear irrational. While increasing attention is being focused on the implications of this research for the design of risk-reducing policies, less attention has been paid to how it affects the economic valuation of policy consequences. This article considers the latter issue, reviewing the behavioral economics literature and discussing its implications for the conduct of benefit-cost analysis, particularly in the context of environmental, health, and safety regulations. We explore three concerns: using estimates of willingness to pay or willingness to accept compensation for valuation, considering the psychological aspects of risk when valuing mortality-risk reductions, and discounting future consequences. In each case, we take the perspective that analysts should avoid making judgments about whether values are "rational" or "irrational." Instead, they should make every effort to rely on well-designed studies, using ranges, sensitivity analysis, or probabilistic modeling to reflect uncertainty. More generally, behavioral research has led some to argue for a more paternalistic approach to policy analysis. We argue instead for continued focus on describing the preferences of those affected, while working to ensure that these preferences are based on knowledge and careful reflection. © 2011 Society for Risk Analysis.
Design and statistical problems in prevention.
Gullberg, B
1996-01-01
Clinical and epidemiological research in osteoporosis can benefit from the methods and techniques established in the area of chronic disease epidemiology. However, attention has to be given to special characteristics such as the multifactorial nature of the disease and the fact that the subjects are usually of advanced age. In order to evaluate prevention, it is of course first necessary to detect and confirm reversible risk factors. The advantages and disadvantages of the different designs (cross-sectional, cohort and case-control) are well known. The effects of avoidable biases, e.g. selection, observation and confounding, have to be balanced against practical constraints such as time, expense, recruitment etc. The translation of relative risks into population attributable risks (etiologic fractions, prevented fractions) is complex and is usually performed under unrealistic, simplified assumptions. The consequences of interactions (synergy) between risk factors are often neglected. The multifactorial structure requires the application of more advanced multi-level statistical techniques. The common strategy in prevention of targeting a cluster of risk factors, in order to sidestep the multifactorial nature, implies that in the end it is impossible to separate each unique factor. Experimental designs for evaluating prevention, such as clinical trials and intervention studies, have to take into account the distinction between explanatory and pragmatic studies. An explanatory approach is similar to an idealized laboratory trial, while the pragmatic design is more realistic, more practical, and has a general public health perspective. The statistical techniques to be used in osteoporosis research are implemented in readily available computer packages such as SAS, SPSS, BMDP and GLIM. In addition to traditional logistic regression, methods such as Cox analysis and Poisson regression, as well as repeated-measures analysis and cluster analysis, are relevant.
Mortier, Séverine Thérèse F C; Van Bockstal, Pieter-Jan; Corver, Jos; Nopens, Ingmar; Gernaey, Krist V; De Beer, Thomas
2016-06-01
Large molecules, such as biopharmaceuticals, are considered the key driver of growth for the pharmaceutical industry. Freeze-drying is the preferred way to stabilise these products when needed. However, it is an expensive, inefficient, time- and energy-consuming process. During freeze-drying, there are only two main process variables to be set, i.e. the shelf temperature and the chamber pressure, preferably in a dynamic way. This manuscript focuses on the essential use of uncertainty analysis for the determination and experimental verification of the dynamic primary drying Design Space for pharmaceutical freeze-drying. Traditionally, the chamber pressure and shelf temperature are kept constant during primary drying, leading to less than optimal process conditions. In this paper it is demonstrated how a mechanistic model of the primary drying step gives the opportunity to determine the optimal dynamic values for both process variables during processing, resulting in a dynamic Design Space with a well-known risk of failure. This allows the primary drying process step to be run as time-efficiently as possible, while guaranteeing that the temperature at the sublimation front does not exceed the collapse temperature. The Design Space is the multidimensional combination and interaction of input variables and process parameters leading to the expected product specifications with a controlled (i.e., high) probability. Therefore, inclusion of parameter uncertainty is an essential part of the definition of the Design Space, although it is often neglected. To quantitatively assess the inherent uncertainty in the parameters of the mechanistic model, an uncertainty analysis was performed to establish the borders of the dynamic Design Space, i.e. a time-varying shelf temperature and chamber pressure, associated with a specific risk of failure. A risk of failure acceptance level of 0.01%, i.e.
a 'zero-failure' situation, results in an increased primary drying process time compared to the deterministic dynamic Design Space; however, the risk of failure is under control. Experimental verification revealed that only a risk of failure acceptance level of 0.01% yielded a guaranteed zero-defect quality end-product. The computed process settings with a risk of failure acceptance level of 0.01% resulted in a decrease of more than half of the primary drying time in comparison with a regular, conservative cycle with fixed settings. Copyright © 2016. Published by Elsevier B.V.
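How parameter uncertainty turns a candidate process setting into a quantified risk of failure can be sketched with a Monte Carlo loop. The linear surrogate below stands in for the mechanistic primary-drying model and is invented for illustration, as are the collapse temperature, the uncertain dried-product resistance, and both process settings:

```python
import random

random.seed(1)

T_COLLAPSE = -32.0  # degC, hypothetical collapse temperature

def sublimation_temp(shelf_temp_c, chamber_pressure_pa, resistance):
    # Invented surrogate: warmer shelves and higher chamber pressure
    # raise the sublimation front temperature; higher dried-product
    # resistance raises it further. Not the paper's mechanistic model.
    return -38.0 + 0.4 * shelf_temp_c + 0.05 * chamber_pressure_pa + 3.0 * resistance

def risk_of_failure(shelf_temp_c, chamber_pressure_pa, n=50_000):
    # Sample the uncertain parameter and count constraint violations.
    fails = 0
    for _ in range(n):
        resistance = random.gauss(1.0, 0.2)  # uncertain model parameter
        if sublimation_temp(shelf_temp_c, chamber_pressure_pa, resistance) > T_COLLAPSE:
            fails += 1
    return fails / n

# A more aggressive setting dries faster but carries a higher risk of failure.
r_conservative = risk_of_failure(-25.0, 100.0)
r_aggressive = risk_of_failure(-10.0, 120.0)
print(f"conservative setting risk: {r_conservative:.4f}")
print(f"aggressive setting risk:   {r_aggressive:.4f}")
```

Sweeping candidate settings and keeping only those whose estimated risk stays below the acceptance level (0.01% in the paper) traces out the border of the uncertainty-aware Design Space.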
Reliability and Probabilistic Risk Assessment - How They Play Together
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang
2015-01-01
PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of LOM, LOV and LOC for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict overall launch vehicle risks. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA began a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for the Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, for meeting product reliability requirements or goals. It is a very broad design-support discipline with important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e.
the metric) is the probability that an item will perform its intended function(s) for a specified mission profile. In general, the reliability metric can be calculated through the analyses using reliability demonstration and reliability prediction methodologies. Reliability analysis is very critical for understanding component failure mechanisms and in identifying reliability critical design and process drivers. The following sections discuss the PRA process and reliability engineering in detail and provide an application where reliability analysis and PRA were jointly used in a complementary manner to support a Space Shuttle flight risk assessment.
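The reliability figure of merit described above can be sketched for the simplest case, a constant failure rate, where R(t) = exp(-λt) and the reliabilities of series components multiply. The failure rates and mission time are hypothetical, not drawn from any NASA dataset:

```python
import math

def reliability(failure_rate_per_hour, mission_hours):
    # Constant-failure-rate (exponential) reliability: R(t) = exp(-lambda * t),
    # the probability the item performs its function over the mission profile.
    return math.exp(-failure_rate_per_hour * mission_hours)

# In a series system every component must work, so component
# reliabilities multiply -- this is how basic events roll up in a PRA.
components = [1e-5, 4e-6, 2.5e-6]  # failures per hour, illustrative
mission = 100.0                     # mission profile length in hours

system_R = 1.0
for lam in components:
    system_R *= reliability(lam, mission)

print(f"system reliability over {mission:.0f} h: {system_R:.6f}")
```

The complement, 1 - system_R, is the failure probability a PRA would attach to the corresponding basic event or scenario branch.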
Association between the BRCA2 rs144848 polymorphism and cancer susceptibility: a meta-analysis.
Li, Qiuyan; Guan, Rongwei; Qiao, Yuandong; Liu, Chang; He, Ning; Zhang, Xuelong; Jia, Xueyuan; Sun, Haiming; Yu, Jingcui; Xu, Lidan
2017-06-13
The BRCA2 gene plays an important role in cancer carcinogenesis, and polymorphisms in this gene have been associated with cancer risk. The BRCA2 rs144848 polymorphism has been associated with several cancers, but results have been inconsistent. In the present study, a meta-analysis was performed to assess the association between the rs144848 polymorphism and cancer risk. The literature was searched in the PubMed, Embase and Google Scholar databases for studies published before April 2016. The fixed or random effects model was used to calculate pooled odds ratios on the basis of heterogeneity. Meta-regression, sensitivity analysis, subgroup analysis and publication bias assessment were also performed using STATA 11.0 software according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses 2009. A total of 40 relevant studies from 30 publications including 34,911 cases and 48,329 controls were included in the final meta-analysis. Among them, 22 studies focused on breast cancer, seven on ovarian cancer, five on non-Hodgkin lymphoma, and the remaining six studies examined various other cancers. The meta-analysis results showed that there were significant associations between the rs144848 polymorphism and cancer risk in all genetic models. Stratified by cancer type, the rs144848 polymorphism was associated with non-Hodgkin lymphoma. Stratified by study design, the allele model was associated with breast cancer risk in population-based studies. The meta-analysis suggests that the BRCA2 rs144848 polymorphism may play a role in cancer risk. Further well-designed studies are warranted to confirm these results.
Design of a secure remote management module for a software-operated medical device.
Burnik, Urban; Dobravec, Štefan; Meža, Marko
2017-12-09
Software-based medical devices need to be maintained throughout their entire life cycle. The efficiency of after-sales maintenance can be improved by managing medical systems remotely. This paper presents how to design the remote access function extensions in order to prevent risks imposed by uncontrolled remote access. A thorough analysis of standards and legislation requirements regarding safe operation and risk management of medical devices is presented. Based on the formal requirements, a multi-layer machine design solution is proposed that eliminates remote connectivity risks by strict separation of regular device functionalities from remote management service, deploys encrypted communication links and uses digital signatures to prevent mishandling of software images. The proposed system may also be used as an efficient version update of the existing medical device designs.
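The signed-software-image safeguard can be sketched as follows. A real medical device would verify an asymmetric digital signature (e.g. RSA or ECDSA over a firmware digest); an HMAC over the image digest is shown here as a simpler shared-key stand-in, with an invented key and payload:

```python
import hashlib
import hmac

DEVICE_KEY = b"provisioned-at-manufacture"  # hypothetical shared secret

def sign_image(image: bytes) -> bytes:
    # Tag the SHA-256 digest of the software image.
    digest = hashlib.sha256(image).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).digest()

def verify_image(image: bytes, tag: bytes) -> bool:
    # compare_digest avoids timing side channels during verification;
    # an image failing this check must never be applied.
    return hmac.compare_digest(sign_image(image), tag)

firmware = b"\x7fELF new-device-firmware"  # placeholder payload
tag = sign_image(firmware)
print(verify_image(firmware, tag))                  # authentic image
print(verify_image(firmware + b"tampered", tag))    # mishandled image
```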
Seismic hazard assessment: Issues and alternatives
Wang, Z.
2011-01-01
Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used inter-changeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implication to society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences to society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
Li, Shi; Mukherjee, Bhramar; Batterman, Stuart; Ghosh, Malay
2013-12-01
Case-crossover designs are widely used to study short-term exposure effects on the risk of acute adverse health events. While the frequentist literature on this topic is vast, there is no Bayesian work in this general area. The contribution of this paper is twofold. First, the paper establishes Bayesian equivalence results that require characterization of the set of priors under which the posterior distributions of the risk ratio parameters based on a case-crossover and time-series analysis are identical. Second, the paper studies inferential issues under case-crossover designs in a Bayesian framework. Traditionally, a conditional logistic regression is used for inference on risk-ratio parameters in case-crossover studies. We consider instead a more general full likelihood-based approach which makes less restrictive assumptions on the risk functions. Formulation of a full likelihood leads to growth in the number of parameters proportional to the sample size. We propose a semi-parametric Bayesian approach using a Dirichlet process prior to handle the random nuisance parameters that appear in a full likelihood formulation. We carry out a simulation study to compare the Bayesian methods based on full and conditional likelihood with the standard frequentist approaches for case-crossover and time-series analysis. The proposed methods are illustrated through the Detroit Asthma Morbidity, Air Quality and Traffic study, which examines the association between acute asthma risk and ambient air pollutant concentrations. © 2013, The International Biometric Society.
Space station data system analysis/architecture study. Task 4: System definition report
NASA Technical Reports Server (NTRS)
1985-01-01
Functional/performance requirements for the Space Station Data System (SSDS) are analyzed and architectural design concepts are derived and evaluated in terms of their performance and growth potential, technical feasibility and risk, and cost effectiveness. The design concepts discussed are grouped under five major areas: SSDS top-level architecture overview, end-to-end SSDS design and operations perspective, communications assumptions and traffic analysis, onboard SSDS definition, and ground SSDS definition.
Perspectives in adolescent risk-taking through instrument development.
Busen, N H; Kouzekanani, K
2000-01-01
Understanding the high-risk adolescent's perception of risk taking is essential for health professionals to determine appropriate interventions. The purpose of this study was to examine the psychometric properties of the revised Adolescent Risk-Taking Instrument (ARTI), designed to measure the high-risk adolescent's perception of risk taking. This study also examined the variables that are most predictive of social adaptation and risk taking. An ex post facto design was used to standardize data collection and to assess the psychometric properties of the revised ARTI. The nonprobability sample consisted of 167 adolescents attending school in an urban, health-underserved area. Exploratory factor analysis supported construct validity, and Cronbach's coefficient alpha supported internal consistency reliability. The reliability coefficients for the risk-taking and social adaptation constructs were .80 and .77, respectively. Current perspectives on adolescent risk taking and implications for the use of the ARTI in clinical practice are addressed.
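The internal-consistency reliability reported above (coefficients of .80 and .77) is conventionally computed as Cronbach's alpha. A minimal sketch on a made-up item-response matrix, not the ARTI data:

```python
def variance(xs):
    # Sample variance (n - 1 denominator).
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    k = len(rows[0])                 # number of items in the construct
    items = list(zip(*rows))         # column-wise item scores
    totals = [sum(r) for r in rows]  # each respondent's total score
    item_var = sum(variance(col) for col in items)
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Invented 5-point Likert responses: rows = respondents, columns = items.
responses = [
    [4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
    [2, 3, 2, 3], [4, 4, 5, 5], [3, 2, 3, 2],
]
alpha = cronbach_alpha(responses)
print(f"alpha = {alpha:.2f}")
```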
Gray, Benjamin J; Bracken, Richard M; Turner, Daniel; Morgan, Kerry; Thomas, Michael; Williams, Sally P; Williams, Meurig; Rice, Sam; Stephens, Jeffrey W
2015-01-01
Background: Use of a validated risk-assessment tool to identify individuals at high risk of developing type 2 diabetes is currently recommended. It is under-reported, however, whether a different risk tool alters the predicted risk of an individual. Aim: This study explored any differences between commonly used validated risk-assessment tools for type 2 diabetes. Design and setting: Cross-sectional analysis of individuals who participated in a workplace-based risk assessment in Carmarthenshire, South Wales. Method: Retrospective analysis of 676 individuals (389 females and 287 males) who participated in a workplace-based diabetes risk-assessment initiative. Ten-year risk of type 2 diabetes was predicted using the validated QDiabetes®, Leicester Risk Assessment (LRA), FINDRISC, and Cambridge Risk Score (CRS) algorithms. Results: Differences between the risk-assessment tools were apparent following retrospective analysis of individuals. CRS categorised the highest proportion (13.6%) of individuals at ‘high risk’ followed by FINDRISC (6.6%), QDiabetes (6.1%), and, finally, the LRA was the most conservative risk tool (3.1%). Following further analysis by sex, over one-quarter of males were categorised at high risk using CRS (25.4%), whereas a greater percentage of females were categorised as high risk using FINDRISC (7.8%). Conclusion: The adoption of a different valid risk-assessment tool can alter the predicted risk of an individual and caution should be used to identify those individuals who really are at high risk of type 2 diabetes. PMID:26541180
Living near nuclear power plants and thyroid cancer risk: A systematic review and meta-analysis.
Kim, Jaeyoung; Bang, Yejin; Lee, Won Jin
2016-02-01
There has been public concern regarding the safety of residing near nuclear power plants, and the extent of risk for thyroid cancer among adults living near nuclear power plants has not been fully explored. In the present study, a systematic review and meta-analysis of epidemiologic studies was conducted to investigate the association between living near nuclear power plants and the risk of thyroid cancer. A comprehensive literature search was performed on studies published up to March 2015 on the association between nuclear power plants and thyroid cancer risk. The summary standardized incidence ratio (SIR), standardized mortality ratio (SMR), and 95% confidence intervals (CIs) were calculated using a random-effect model of meta-analysis. Sensitivity analyses were performed by study quality. Thirteen studies were included in the meta-analysis, covering 36 nuclear power stations in 10 countries. Overall, summary estimates showed no significant increased thyroid cancer incidence or mortality among residents living near nuclear power plants (summary SIR=0.98; 95% CI 0.87-1.11, summary SMR=0.80; 95% CI 0.62-1.04). The pooled estimates did not reveal different patterns of risk by gender, exposure definition, or reference population. However, sensitivity analysis by exposure definition showed that living less than 20 km from nuclear power plants was associated with a significant increase in the risk of thyroid cancer in well-designed studies (summary OR=1.75; 95% CI 1.17-2.64). Our study does not support an association between living near nuclear power plants and risk of thyroid cancer but does support a need for well-designed future studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
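The random-effects pooling named in the abstract can be sketched with the DerSimonian-Laird estimator. The study-level SIRs and standard errors below are invented for illustration; they are not the thirteen included studies:

```python
import math

# Hypothetical log standardized incidence ratios and their standard errors.
log_sirs = [math.log(x) for x in (0.80, 1.25, 0.70, 1.20, 0.95)]
ses = [0.10, 0.15, 0.12, 0.20, 0.08]

# Fixed-effect (inverse-variance) estimate, needed for Cochran's Q.
w = [1 / s**2 for s in ses]
fixed = sum(wi * y for wi, y in zip(w, log_sirs)) / sum(w)

# Cochran's Q and the method-of-moments between-study variance tau^2.
Q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_sirs))
df = len(log_sirs) - 1
C = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / C)

# Random-effects weights add tau^2 to each study's variance.
w_re = [1 / (s**2 + tau2) for s in ses]
pooled = math.exp(sum(wi * y for wi, y in zip(w_re, log_sirs)) / sum(w_re))
print(f"between-study variance tau^2 = {tau2:.4f}")
print(f"summary SIR = {pooled:.2f}")
```

When heterogeneity is present (tau² > 0), the random-effects weights are flatter than the fixed-effect ones, so no single large study dominates the summary estimate.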
Development of risk-based decision methodology for facility design.
DOT National Transportation Integrated Search
2014-06-01
This report develops a methodology for CDOT to use in the risk analysis of various types of facilities and provides illustrative examples for the use of the proposed framework. An overview of the current practices and applications to illustrate t...
Dhat, Shalaka; Pund, Swati; Kokare, Chandrakant; Sharma, Pankaj; Shrivastava, Birendra
2017-01-01
Rapidly evolving technical and regulatory landscapes of pharmaceutical product development necessitate risk management with the application of multivariate analysis using Process Analytical Technology (PAT) and Quality by Design (QbD). The poorly soluble, high-dose drug Satranidazole was optimally nanoprecipitated (SAT-NP) employing principles of Formulation by Design (FbD). The potential risk factors influencing the critical quality attributes (CQA) of SAT-NP were identified using an Ishikawa diagram. A Plackett-Burman screening design was adopted to screen the eight critical formulation and process parameters influencing the mean particle size, zeta potential and dissolution efficiency at 30 min in pH 7.4 dissolution medium. Pareto charts (individual and cumulative) revealed the three most critical factors influencing the CQA of SAT-NP, viz. the aqueous stabilizer (polyvinyl alcohol), the release modifier (Eudragit® S 100), and the volume of the aqueous phase. The levels of these three critical formulation attributes were optimized by FbD within the established design space to minimize mean particle size and polydispersity index, and maximize encapsulation efficiency of SAT-NP. Lenth's and Bayesian analysis, along with mathematical modeling of the results, allowed identification and quantification of the critical formulation attributes significantly active on the selected CQAs. The optimized SAT-NP exhibited a mean particle size of 216 nm, polydispersity index of 0.250, zeta potential of -3.75 mV and encapsulation efficiency of 78.3%. The product was lyophilized using mannitol to form a readily redispersible powder. X-ray diffraction analysis confirmed the conversion of crystalline SAT to the amorphous form. In vitro release of SAT-NP in gradually pH-changing media showed <20% release at pH 1.2 and pH 6.8 in 5 h, while complete release (>95%) at pH 7.4 in the next 3 h, indicative of burst release after a lag time.
This investigation demonstrated effective application of risk management and QbD tools in developing site-specific release SAT-NP by nanoprecipitation. Copyright © 2016 Elsevier B.V. All rights reserved.
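The Plackett-Burman screening step can be sketched with the standard 8-run design for up to seven two-level factors, built from the usual cyclic generator. The response model below is a made-up noise-free truth used only to show how main effects are recovered; it is not the SAT-NP data:

```python
# Standard 8-run Plackett-Burman generator; rows are its cyclic shifts
# plus a final all-minus run, giving a balanced, orthogonal design.
generator = [1, 1, 1, -1, 1, -1, -1]
design = [generator[i:] + generator[:i] for i in range(7)] + [[-1] * 7]

def response(run):
    # Hypothetical truth: only the first and fourth factors matter.
    return 5.0 + 2.0 * run[0] - 1.5 * run[3]

ys = [response(run) for run in design]

# Main effect of factor j = mean(y at +1) - mean(y at -1); with four
# runs at each level this is the signed sum divided by 4.
effects = [sum(run[j] * y for run, y in zip(design, ys)) / 4
           for j in range(7)]
print([round(e, 2) for e in effects])  # active factors stand out
```

Because the columns are orthogonal, the two active factors are recovered exactly here; with real, noisy data the same contrasts feed the Pareto charts and Lenth's analysis mentioned above.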
Ergonomics principles to design clothing work for electrical workers in Colombia.
Castillo, Juan; Cubillos, A
2012-01-01
Recent developments in Colombian legislation have identified the need to develop protective work clothing designed to the specifications of the work performed and in compliance with international standards. This involves the development and design of new strategies and measures for work clothing design. In this study we analyze the activities of workers in the electrical sector, using risk data from various activities, including power generation plants, local facilities, industrial facilities, and the maintenance of urban and rural networks. The analysis method takes an ergonomic approach: a risk analysis is performed, the role of the security expert is evaluated, and a design algorithm developed for this purpose is applied. The result of this study is the identification of the constraints and variables that contribute to an analysis model leading to the development of protective work clothing.
Impact of Atmospheric Aerosols on Solar Photovoltaic Electricity Generation in China
NASA Astrophysics Data System (ADS)
Li, X.; Mauzerall, D. L.; Wagner, F.; Peng, W.; Yang, J.
2016-12-01
Hurricanes have induced devastating storm surge flooding worldwide. The impacts of these storms may worsen in the coming decades because of rapid coastal development coupled with sea-level rise and possibly increasing storm activity due to climate change. Major advances in coastal flood risk management are urgently needed. We present an integrated dynamic risk analysis for flooding task (iDraft) framework to assess and manage coastal flood risk at the city or regional scale, considering integrated dynamic effects of storm climatology change, sea-level rise, and coastal development. We apply the framework to New York City. First, we combine climate-model projected storm surge climatology and sea-level rise with engineering- and social/economic-model projected coastal exposure and vulnerability to estimate the flood damage risk for the city over the 21st century. We derive temporally-varying risk measures such as the annual expected damage as well as temporally-integrated measures such as the present value of future losses. We also examine the individual and joint contributions to the changing risk of the three dynamic factors (i.e., sea-level rise, storm change, and coastal development). Then, we perform probabilistic cost-benefit analysis for various coastal flood risk mitigation strategies for the city. Specifically, we evaluate previously proposed mitigation measures, including elevating houses on the floodplain and constructing flood barriers at the coast, by comparing their estimated cost and probability distribution of the benefit (i.e., present value of avoided future losses). We also propose new design strategies, including optimal design (e.g., optimal house elevation) and adaptive design (e.g., flood protection levels that are designed to be modified over time in a dynamic and uncertain environment).
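The probabilistic cost-benefit comparison described above reduces, in its simplest deterministic form, to discounting streams of expected annual damages with and without a mitigation measure. A sketch with invented numbers, not the iDraft New York City results:

```python
DISCOUNT = 0.03        # annual discount rate, hypothetical
YEARS = range(2025, 2101)  # planning horizon over the 21st century

def present_value(annual_damage):
    # Discount each year's expected damage back to the first year.
    base = min(YEARS)
    return sum(annual_damage(y) / (1 + DISCOUNT) ** (y - base) for y in YEARS)

# Expected annual damage grows over time as sea level rises and the
# coast develops (a stand-in for the dynamic risk model's output).
def damage_no_action(year):
    return 100e6 * 1.02 ** (year - 2025)

def damage_with_barrier(year):
    return 0.4 * damage_no_action(year)  # barrier avoids 60% of damage

barrier_cost = 2.0e9  # hypothetical up-front construction cost
benefit = present_value(damage_no_action) - present_value(damage_with_barrier)
bcr = benefit / barrier_cost
print(f"benefit-cost ratio: {bcr:.2f}")
```

The probabilistic version in the abstract replaces the fixed damage trajectories with draws from the joint distribution of storm climatology, sea-level rise, and development, yielding a distribution of the benefit rather than a single ratio.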
Wu, Qi-Jun; Wu, Lang; Zheng, Li-Qiang; Xu, Xin; Ji, Chao; Gong, Ting-Ting
2016-05-01
Observational studies have reported inconsistent results on the association between fruit and vegetable intake and the risk of pancreatic cancer. We carried out a meta-analysis of epidemiological studies to summarize the available evidence. We searched the PubMed, Scopus, and ISI Web of Science databases for relevant studies published until the end of January 2015. Fixed-effects and random-effects models were used to estimate the summary relative risks (RRs) and 95% confidence intervals (CIs) for the associations between fruit and vegetable intake and the risk of pancreatic cancer. A total of 15 case-control studies, eight prospective studies, and one pooled analysis fulfilled the inclusion criteria. The summary RR for the highest versus the lowest intake was 0.73 (95% CI = 0.53-1.00) for fruit and vegetables, 0.73 (95% CI = 0.63-0.84) for fruit, and 0.76 (95% CI = 0.69-0.83) for vegetables, with significant heterogeneity (I² = 70.5%, 55.7%, and 43.0%, respectively). Inverse associations were observed in the stratified analysis by study design, although the results of prospective studies showed borderline significance, with corresponding RR = 0.90 (95% CI = 0.77-1.05) for fruit and vegetable intake, 0.93 (95% CI = 0.83-1.03) for fruit intake, and 0.89 (95% CI = 0.80-1.00) for vegetable intake. In addition, significant inverse associations were observed in the majority of other subgroup analyses by study quality, geographic location, exposure assessment method, and adjustment for potential confounders. Findings from the present meta-analysis support that fruit and vegetable intake is inversely associated with the risk of pancreatic cancer. However, study design may play a key role in the observed magnitude of the aforementioned association. Future well-designed prospective studies are warranted to confirm these findings.
The TPS Advanced Development Project for CEV
NASA Technical Reports Server (NTRS)
Reuther, James; Wercinski, Paul; Venkatapathy, Ethiraj; Ellerby, Don; Raiche, George; Bowman, Lynn; Jones, Craig; Kowal, John
2006-01-01
The CEV TPS Advanced Development Project (ADP) is a NASA in-house activity for providing two heatshield preliminary designs (a Lunar direct return as well as a LEO-only return) for the CEV, including the TPS, the carrier structure, the interfaces and the attachments. The project's primary objective is the development of a single heatshield preliminary design that meets both Lunar direct return and LEO return requirements. The effort to develop the Lunar direct return capable heatshield is considered a high risk item for the NASA CEV development effort due to the low TRL (approx. 4) of the candidate TPS materials. By initiating the TPS ADP early in the development cycle, the intent is to use materials analysis and testing in combination with manufacturing demonstrations to reduce the programmatic risk of using advanced TPS technologies in the critical path for CEV. Due to the technical and schedule risks associated with a Lunar return heatshield, the ADP will pursue a parallel path design approach, whereby a back-up TPS/heatshield design that only meets LEO return requirements is also developed. The TPS materials and carrier structure design concept selections will be based on testing, analysis, design and evaluation of scalability and manufacturing performed under the ADP. At the TPS PDR, the preferred programmatic strategy is to transfer the continued (detailed) design, development, testing and evaluation (DDT&E) of both the Lunar direct and LEO return designs to a government/prime contractor coordinated sub-system design team. The CEV prime contractor would have responsibility for the continued heatshield sub-system development. Continued government participation would include analysis, testing and evaluation as well as decision authority at TPS Final System Decision (FSD) (choosing between the primary and back-up heatshields) occurring between TPS PDR and TPS Critical Design Review (CDR).
After TPS FSD the prime CEV contractor will complete the detailed design, certification testing, procurement, and integration of the CEV TPS.
Managing Large Scale Project Analysis Teams through a Web Accessible Database
NASA Technical Reports Server (NTRS)
O'Neil, Daniel A.
2008-01-01
Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground-rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.
NASA Technical Reports Server (NTRS)
Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina
2004-01-01
A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to elicit expert opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results, is presented. A discussion of possible future steps in this research area is given.
The safer clinical systems project in renal care.
Weale, Andy R
2013-09-01
Current systems in place in healthcare are designed to detect harm after it has happened (e.g. critical incident reports) and make recommendations based on an assessment of that event. Safer Clinical Systems, a Health Foundation funded project, is designed to proactively search for risk within systems, rather than being reactive to harm. The aim of the Safer Clinical Systems project in Renal Care was to reduce the risks associated with shared care for patients who are undergoing surgery but are looked after peri-operatively by nephrology teams on nephrology wards. This report details our findings of the diagnostic phase of Safer Clinical Systems: the proactive search for risk. We have evaluated the current system of care using a set of risk evaluation and process mapping tools (Failure Modes and Effects Analysis (FMEA) and Hierarchical Task Analysis (HTA)). We have engaged staff with the process mapping and risk assessment tools. We now understand our system and understand where the highest risk tasks are undertaken during a renal in-patient stay during which a patient has an operation. These key tasks occur across the perioperative period and are not confined to one aspect of care. A measurement strategy and intervention plan have been designed around these tasks. Safer Clinical Systems has identified high risk, low reliability tasks in our system. We look forward to fully reporting these data in 2014. © 2013 European Dialysis and Transplant Nurses Association/European Renal Care Association.
LinkIT: a ludic elicitation game for eliciting risk perceptions.
Cao, Yan; McGill, William L
2013-06-01
The mental models approach, a leading strategy to develop risk communications, involves a time- and labor-intensive interview process and a lengthy questionnaire to elicit group-level risk perceptions. We propose that a similarity ratings approach for structural knowledge elicitation can be adopted to assist the risk mental models approach. The LinkIT game, inspired by games with a purpose (GWAP) technology, is a ludic elicitation tool designed to elicit group understanding of the relations between risk factors in a more enjoyable and productive manner when compared to traditional approaches. That is, consistent with the idea of ludic elicitation, LinkIT was designed to make the elicitation process fun and enjoyable in the hopes of increasing participation and data quality in risk studies. Like the mental models approach, the group mental model obtained via the LinkIT game can hence be generated and represented in a form of influence diagrams. In order to examine the external validity of LinkIT, we conducted a study to compare its performance with respect to a more conventional questionnaire-driven approach. Data analysis results conclude that the two group mental models elicited from the two approaches are similar to an extent. Yet, LinkIT was more productive and enjoyable than the questionnaire. However, participants commented that the current game has some usability concerns. This presentation summarizes the design and evaluation of the LinkIT game and suggests areas for future work. © 2012 Society for Risk Analysis.
Space Shuttle Probabilistic Risk Assessment (SPRA) Iteration 3.2
NASA Technical Reports Server (NTRS)
Boyer, Roger L.
2010-01-01
The Shuttle is a very reliable vehicle in comparison with other launch systems. Much of the risk posed by Shuttle operations is related to fundamental aspects of the spacecraft design and the environments in which it operates. It is unlikely that significant design improvements can be implemented to address these risks prior to the end of the Shuttle program. The model will continue to be used to identify possible emerging risk drivers and allow management to make risk-informed decisions on future missions. Potential uses of the SPRA in the future include: calculating the risk impact of various mission contingencies (e.g., late inspection, crew rescue); assessing the risk impact of various trade studies (e.g., flow control valves); supporting risk analysis of mission-specific events, such as in-flight anomalies; and serving as a guiding star and data source for future NASA programs.
Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)
2002-01-01
Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.
Brauer, Ruth; Ruigómez, Ana; Klungel, Olaf; Reynolds, Robert; Feudjo Tepie, Maurille; Smeeth, Liam; Douglas, Ian
2016-03-01
The aims of this study were two-fold: (i) to investigate the effect of exposure to antibiotic agents on the risk of acute liver injury (ALI) using a self-controlled case series and case-crossover study and (ii) to compare the results between the case-only studies. For the self-controlled case series study, incidence rate ratios (IRRs) were calculated by dividing the rate of acute liver injury experienced during patients' periods of exposure to antibiotics by patients' rate of events during non-exposed time, using conditional Poisson regression. For the case-crossover analysis, we calculated odds ratios (ORs) using conditional logistic regression by comparing exposure during 14- and 30-day risk windows with exposure during control moments. Using the self-controlled case series approach, the IRR was highest during the first 7 days after receipt of a prescription (10.01, 95% CI 6.59-15.18). Omitting post-exposure washout periods lowered the IRR to 7.2. The highest estimate in the case-crossover analysis was found when two 30-day control periods 1 year prior to the 30-day ALI risk period were retained in the analysis: OR = 6.5 (95% CI, 3.95-10.71). The lowest estimate was found when exposure in the 14-day risk period was compared to exposure in four consecutive 14-day control periods immediately prior to the risk period (OR = 3.05, 95% CI, 2.06-4.53). An increased relative risk of acute liver injury was consistently observed using both self-controlled case series and case-crossover designs. Case-only designs can be used as a viable alternative study design to study the risk of acute liver injury, albeit with some limitations. © 2015 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd.
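At its core, the self-controlled comparison reduces to a rate ratio: events per unit of exposed person-time divided by events per unit of unexposed person-time, within the same cases. The sketch below shows that elementary computation; a real analysis uses conditional Poisson regression, and the counts here are invented, not from this study.

```python
# Minimal sketch of the rate comparison underlying a self-controlled
# case series. Counts and person-time are invented for illustration.

def incidence_rate_ratio(exposed_events, exposed_days,
                         unexposed_events, unexposed_days):
    """Crude IRR: exposed event rate over unexposed event rate."""
    exposed_rate = exposed_events / exposed_days
    unexposed_rate = unexposed_events / unexposed_days
    return exposed_rate / unexposed_rate

# e.g. 12 events in 420 exposed days vs 30 events in 10,500
# unexposed days (invented figures):
irr = incidence_rate_ratio(12, 420, 30, 10500)
```

Because each patient serves as their own control, time-invariant confounders (sex, chronic disease, genetics) cancel out of this ratio, which is the design's main appeal.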
A Strategic Approach to Medical Care for Exploration Missions
NASA Technical Reports Server (NTRS)
Canga, Michael A.; Shah, Ronak V.; Mindock, Jennifer A.; Antonsen, Erik L.
2016-01-01
Exploration missions will present significant new challenges to crew health, including effects of variable gravity environments, limited communication with Earth-based personnel for diagnosis and consultation for medical events, limited resupply, and limited ability for crew return. Providing health care capabilities for exploration class missions will require system trades be performed to identify a minimum set of requirements and crosscutting capabilities, which can be used in design of exploration medical systems. Medical data, information, and knowledge collected during current space missions must be catalogued and put in formats that facilitate querying and analysis. These data are used to inform the medical research and development program through analysis of risk trade studies between medical care capabilities and system constraints such as mass, power, volume, and training. Medical capability as a quantifiable variable is proposed as a surrogate risk metric and explored for trade space analysis that can improve communication between the medical and engineering approaches to mission design. The resulting medical system design approach selected will inform NASA mission architecture, vehicle, and subsystem design for the next generation of spacecraft.
Altair Lander Life Support: Design Analysis Cycles 1, 2, and 3
NASA Technical Reports Server (NTRS)
Anderson, Molly; Rotter, Hank; Stambaugh, Imelda; Curley, Su
2009-01-01
NASA is working to develop a new lunar lander to support lunar exploration. The development process that the Altair project is using for this vehicle is unlike most others. In Lander Design Analysis Cycle 1 (LDAC-1), a single-string, minimum functionality design concept was developed, including life support systems for different vehicle configuration concepts, first for a combination of an ascent vehicle and a habitat with integral airlocks, and then for a combined ascent vehicle-habitat with a detachable airlock. In LDAC-2, the Altair team took the ascent vehicle-habitat with detachable airlock and analyzed the design for the components that were the largest contributors to the risk of loss of crew (LOC). For life support, the largest drivers were related to oxygen supply and carbon dioxide control. Integrated abort options were developed at the vehicle level. Many life support failures were not considered to result in LOC because they had a long enough time to effect that abort was considered a feasible option to safely end the mission before the situation became life threatening. These failures were then classified as loss of mission (LOM) failures. Many options to reduce LOC risk were considered, and mass efficient solutions to the LOC problems were added to the vehicle design at the end of LDAC-2. In LDAC-3, the new design was analyzed for large contributors to the risk of LOM. To avoid ending the mission early or being unable to accomplish goals like performing all planned extravehicular activities (EVAs), various options were assessed for their combination of risk reduction and mass cost. This paper outlines the major assumptions, design features, and decisions related to the development of the life support system for the Altair project through LDAC-3.
Propellant injection systems and processes
NASA Technical Reports Server (NTRS)
Ito, Jackson I.
1995-01-01
The previous 'Art of Injector Design' is maturing and merging with the more systematic 'Science of Combustion Device Analysis.' This technology can be based upon observation, correlation, experimentation and ultimately analytical modeling based upon basic engineering principles. This methodology is more systematic and far superior to the historical injector design process of 'Trial and Error' or blindly 'Copying Past Successes.' The benefit of such an approach is to be able to rank candidate design concepts for relative probability of success or technical risk in all the important combustion device design requirements and combustion process development risk categories before committing to an engine development program. Even if a single analytical design concept cannot be developed to predict satisfying all requirements simultaneously, a series of risk mitigation key enabling technologies can be identified for early resolution. Lower cost subscale or laboratory experimentation to demonstrate proof of principle, critical instrumentation requirements, and design discriminating test plans can be developed based on the physical insight provided by these analyses.
Long-Term Unemployment and Suicide: A Systematic Review and Meta-Analysis
Milner, Allison; Page, Andrew; LaMontagne, Anthony D.
2013-01-01
Purpose: There have been a number of reviews on the association between unemployment and suicide, but none have investigated how this relationship is influenced by duration of unemployment. Method: A systematic review and meta-analysis was conducted of those studies that assessed duration of unemployment as a risk factor for suicide. Studies considered eligible for inclusion were population-based cohort or case-control designs, population-based ecological designs, or hospital-based clinical cohort or case-control designs published in the year 1980 or later. Results: The review identified 16 eligible studies, out of a possible 10,358 articles resulting from a search of four databases: PubMed, Web of Knowledge, Scopus and Proquest. While all 16 studies measured unemployment duration in different ways, a common finding was that longer duration of unemployment was related to greater risk of suicide and suicide attempt. A random effects meta-analysis on a subsample of six cohort studies indicated that the pooled relative risk of suicide in relation to average follow-up time after unemployment was 1.70 (95% CI 1.22 to 2.18). However, results also suggested a possible habituation effect to unemployment over time, with the greatest risk of suicide occurring within five years of unemployment compared to the employed population (RR = 2.50, 95% CI 1.83 to 3.17). Relative risk appeared to decline in studies of those unemployed between 12 and 16 years compared to those currently employed (RR = 1.21, 95% CI 1.10 to 1.33). Conclusion: Findings suggest that long-term unemployment is associated with greater incidence of suicide. Results of the meta-analysis suggest that risk is greatest in the first five years, and persists at a lower but elevated level up to 16 years after unemployment. These findings are limited by the paucity of data on this topic. PMID:23341881
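The random-effects pooling used here differs from fixed-effect pooling by estimating a between-study variance (tau²) that widens the weights and the confidence interval. A DerSimonian-Laird sketch follows; the study RRs echo the magnitudes quoted above but the standard errors are invented, so this is illustrative only.

```python
import math

# DerSimonian-Laird random-effects pooling of relative risks.
# Study SEs below are invented; RR magnitudes are illustrative.

def pool_random(rrs, ses):
    """Random-effects pooled RR with 95% CI (DerSimonian-Laird)."""
    logs = [math.log(r) for r in rrs]
    w = [1.0 / s ** 2 for s in ses]
    fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    # Cochran's Q and the DL moment estimate of tau^2:
    q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rrs) - 1)) / c)
    # Re-weight including between-study variance:
    wr = [1.0 / (s ** 2 + tau2) for s in ses]
    pooled = sum(wi * li for wi, li in zip(wr, logs)) / sum(wr)
    se = math.sqrt(1.0 / sum(wr))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))

rr, lo, hi = pool_random([2.50, 1.80, 1.21], [0.20, 0.25, 0.05])
```

When the studies are homogeneous, tau² collapses to zero and the result coincides with the fixed-effect estimate.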
NASA Astrophysics Data System (ADS)
Ndu, Obibobi Kamtochukwu
To ensure that estimates of risk and reliability inform design and resource allocation decisions in the development of complex engineering systems, early engagement in the design life cycle is necessary. An unfortunate constraint on the accuracy of such estimates at this stage of concept development is the limited amount of high fidelity design and failure information available on the actual system under development. Applying the human ability to learn from experience and augment our state of knowledge to evolve better solutions mitigates this limitation. However, the challenge lies in formalizing a methodology that takes this highly abstract, but fundamentally human cognitive, ability and extending it to the field of risk analysis while maintaining the tenets of generalization, Bayesian inference, and probabilistic risk analysis. We introduce an integrated framework for inferring the reliability, or other probabilistic measures of interest, of a new system or a conceptual variant of an existing system. Abstractly, our framework is based on learning from the performance of precedent designs and then applying the acquired knowledge, appropriately adjusted based on degree of relevance, to the inference process. This dissertation presents a method for inferring properties of the conceptual variant using a pseudo-spatial model that describes the spatial configuration of the family of systems to which the concept belongs. Through non-metric multidimensional scaling, we formulate the pseudo-spatial model based on rank-ordered subjective expert perception of design similarity between systems that elucidate the psychological space of the family. By a novel extension of Kriging methods for analysis of geospatial data to our "pseudo-space of comparable engineered systems", we develop a Bayesian inference model that allows prediction of the probabilistic measure of interest.
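The core idea, inferring a new concept's reliability from precedent designs weighted by similarity, can be caricatured with a much simpler estimator than the dissertation's kriging model. The sketch below substitutes inverse-distance weighting over a pseudo-spatial layout; it is a deliberate simplification (real kriging models spatial covariance and yields prediction variance), and all coordinates and failure probabilities are invented.

```python
import math

# Simplified stand-in for kriging-based inference: estimate a new
# system's failure probability as an inverse-distance-weighted average
# of precedent systems placed in a similarity (pseudo-spatial) space.
# Coordinates and values are invented for illustration.

def idw_estimate(points, values, query, power=2.0):
    """Inverse-distance-weighted estimate at `query`."""
    num = den = 0.0
    for (x, y), v in zip(points, values):
        d = math.hypot(x - query[0], y - query[1])
        if d == 0.0:
            return v  # query coincides with a precedent design
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

# Three precedent systems and their observed failure probabilities:
est = idw_estimate([(0, 0), (1, 0), (0, 1)], [0.02, 0.05, 0.04],
                   (0.2, 0.2))
```

The pseudo-spatial coordinates themselves would come from non-metric multidimensional scaling of expert similarity rankings, as the abstract describes; only the interpolation step is sketched here.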
Correlates of HIV knowledge and Sexual risk behaviors among Female Military Personnel
Essien, E. James; Monjok, Emmanuel; Chen, Hua; Abughosh, Susan; Ekong, Ernest; Peters, Ronald J.; Holmes, Laurens; Holstad, Marcia M.; Mgbere, Osaro
2010-01-01
Objective: Uniformed services personnel are at an increased risk of HIV infection. We examined the HIV/AIDS knowledge and sexual risk behaviors among female military personnel to determine the correlates of HIV risk behaviors in this population. Method: The study used a cross-sectional design to examine HIV/AIDS knowledge and sexual risk behaviors in a sample of 346 females drawn from two military cantonments in Southwestern Nigeria. Data was collected between 2006 and 2008. Using bivariate analysis and multivariate logistic regression, HIV/AIDS knowledge and sexual behaviors were described in relation to socio-demographic characteristics of the participants. Results: Multivariate logistic regression analysis revealed that level of education and knowing someone with HIV/AIDS were significant (p<0.05) predictors of HIV knowledge in this sample. HIV prevention self-efficacy was significantly (p<0.05) predicted by annual income and race/ethnicity. Condom use attitudes were also significantly (p<0.05) associated with number of children, annual income, and number of sexual partners. Conclusion: The data indicate the importance of incorporating these predictor variables into intervention designs. PMID:20387111
Cui, Lingling; Liu, Xinxin; Tian, Yalan; Xie, Chen; Li, Qianwen; Cui, Han; Sun, Changqing
2016-01-01
Flavonoids have been suggested to play a chemopreventive role in carcinogenesis. However, the epidemiologic studies assessing dietary intake of flavonoids and esophageal cancer risk have yielded inconsistent results. This study was designed to examine the association between flavonoids, each flavonoid subclass, and the risk of esophageal cancer with a meta-analysis approach. We searched for all relevant studies with a prospective cohort or case-control study design published from January 1990 to April 2016, using PUBMED, EMBASE, and Web of Science. Pooled odds ratios (ORs) were calculated using fixed or random-effect models. In total, seven articles including 2629 cases and 481,193 non-cases were selected for the meta-analysis. Comparing the highest-intake patients with the lowest-intake patients for total flavonoids and for each flavonoid subclass, we found that anthocyanidins (OR = 0.60, 95% CI: 0.49–0.74), flavanones (OR = 0.65, 95% CI: 0.49–0.86), and flavones (OR = 0.78, 95% CI 0.64–0.95) were inversely associated with the risk of esophageal cancer. However, total flavonoids showed marginal association with esophageal cancer risk (OR = 0.78, 95% CI: 0.59–1.04). In conclusion, our study suggested that dietary intake of total flavonoids, anthocyanidins, flavanones, and flavones might reduce the risk of esophageal cancer. PMID:27338463
Statistical power, the Belmont report, and the ethics of clinical trials.
Vollmer, Sara H; Howard, George
2010-12-01
Achieving a good clinical trial design increases the likelihood that a trial will take place as planned, including that data will be obtained from a sufficient number of participants, and the total number of participants will be the minimal required to gain the knowledge sought. A good trial design also increases the likelihood that the knowledge sought by the experiment will be forthcoming. Achieving such a design is more than good sense; it is ethically required in experiments when participants are at risk of harm. This paper argues that doing a power analysis effectively contributes to ensuring that a trial design is good. The ethical importance of good trial design has long been recognized for trials in which there is risk of serious harm to participants. However, whether the quality of a trial design, when the risk to participants is only minimal, is an ethical issue is rarely discussed. This paper argues that even in cases when the risk is minimal, the quality of the trial design is an ethical issue, and that this is reflected in the emphasis the Belmont Report places on the importance of the benefit of knowledge gained by society. The paper also argues that good trial design is required for true informed consent.
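The link between power analysis and enrolling "the minimal required" number of participants can be illustrated with a standard normal-approximation sample-size calculation for comparing two proportions. This is a generic textbook formula, not taken from the paper, and the effect sizes are invented.

```python
import math

# Approximate sample size per arm for a two-sided comparison of two
# proportions (alpha = 0.05 so z = 1.96; 80% power so z ~ 0.84).
# Effect sizes below are invented for illustration.

def n_per_arm(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Participants needed per arm to detect p1 vs p2."""
    pbar = (p1 + p2) / 2.0
    num = (z_alpha * math.sqrt(2 * pbar * (1 - pbar))
           + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# Detecting a drop in event rate from 30% to 20%:
n = n_per_arm(0.30, 0.20)
```

Underpowering wastes participants' exposure to risk on a trial unlikely to answer its question; overpowering exposes more participants than the knowledge requires, which is exactly the ethical tension the paper describes.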
ERIC Educational Resources Information Center
Hartog, Joop; Vijverberg, Wim
2007-01-01
Skill development involves important choices for individuals and school designers: should individuals and schools specialize, or should they aim for an optimal combination of skills? We analyze this question by employing mean-standard deviation analysis and show how cost structure, benefit structure and risk attitudes jointly determine the optimal…
Integrating Human Factors into Space Vehicle Processing for Risk Management
NASA Technical Reports Server (NTRS)
Woodbury, Sarah; Richards, Kimberly J.
2008-01-01
This presentation will discuss the multiple projects performed in United Space Alliance's Human Engineering Modeling and Performance (HEMAP) Lab, improvements that resulted from analysis, and the future applications of the HEMAP Lab for risk assessment by evaluating human/machine interaction and ergonomic designs.
Probabilistic cost-benefit analysis of disaster risk management in a development context.
Kull, Daniel; Mechler, Reinhard; Hochrainer-Stigler, Stefan
2013-07-01
Limited studies have shown that disaster risk management (DRM) can be cost-efficient in a development context. Cost-benefit analysis (CBA) is an evaluation tool to analyse economic efficiency. This research introduces quantitative, stochastic CBA frameworks and applies them in case studies of flood and drought risk reduction in India and Pakistan, while also incorporating projected climate change impacts. DRM interventions are shown to be economically efficient, with integrated approaches more cost-effective and robust than singular interventions. The paper highlights that CBA can be a useful tool if certain issues are considered properly, including: complexities in estimating risk; data dependency of results; negative effects of interventions; and distributional aspects. The design and process of CBA must take into account specific objectives, available information, resources, and the perceptions and needs of stakeholders as transparently as possible. Intervention design and uncertainties should be qualified through dialogue, indicating that process is as important as numerical results. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.
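A stochastic CBA of this kind boils down to sampling uncertain benefits and asking how often they exceed cost. The sketch below is a minimal Monte Carlo version of that idea; the cost and the triangular benefit distribution are invented assumptions, not figures from the India/Pakistan case studies.

```python
import random

# Hedged sketch of stochastic cost-benefit analysis: sample uncertain
# benefits of a risk-reduction measure and report the probability that
# benefits exceed costs. All figures are invented for illustration.

def prob_benefit_exceeds_cost(cost, benefit_draw, n=10_000, seed=1):
    """Monte Carlo estimate of P(benefit > cost)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if benefit_draw(rng) > cost)
    return hits / n

# Benefits drawn from a triangular distribution (low=2, high=12,
# mode=6, in $M of avoided losses) against a $5M intervention cost:
p = prob_benefit_exceeds_cost(
    cost=5.0,
    benefit_draw=lambda rng: rng.triangular(2.0, 12.0, 6.0))
```

Reporting a probability rather than a single benefit-cost ratio makes the robustness claim in the abstract explicit: an "economically efficient" intervention is one whose benefits exceed costs across most of the sampled futures.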
Adaptive designs for subpopulation analysis optimizing utility functions.
Graf, Alexandra C; Posch, Martin; Koenig, Franz
2015-01-01
If the response to treatment depends on genetic biomarkers, it is important to identify predictive biomarkers that define (sub-)populations where the treatment has a positive benefit risk balance. One approach to determining relevant subpopulations is subgroup analysis, where the treatment effect is estimated in biomarker-positive and biomarker-negative groups. Subgroup analyses are challenging because several types of risks are associated with inference on subgroups. On the one hand, by disregarding a relevant subpopulation a treatment option may be missed due to a dilution of the treatment effect in the full population. Furthermore, even if the diluted treatment effect can be demonstrated in an overall population, it is not ethical to treat patients that do not benefit from the treatment when they can be identified in advance. On the other hand, selecting a spurious subpopulation increases the risk of restricting an efficacious treatment to too narrow a fraction of a potentially benefiting population. We propose to quantify these risks with utility functions and investigate nonadaptive study designs that allow for inference on subgroups using multiple testing procedures as well as adaptive designs, where subgroups may be selected in an interim analysis. The characteristics of such adaptive and nonadaptive designs are compared for a range of scenarios. © 2014 The Authors. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Investment appraisal using quantitative risk analysis.
Johansson, Henrik
2002-07-01
Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
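The risk-adjusted net present value described above can be sketched as the discounted stream of expected annual loss reductions minus the upfront investment. The following is an illustrative calculation only; the investment, expected-loss figures, discount rate, and horizon are invented, not drawn from the Avesta Sheffield case study.

```python
# Hedged sketch: risk-adjusted NPV of a fire safety investment. The
# yearly benefit is the reduction in expected annual fire loss; it is
# discounted and netted against the upfront cost. Figures invented.

def risk_adjusted_npv(investment, annual_risk_before, annual_risk_after,
                      discount_rate, years):
    """PV of expected annual loss reductions minus the investment."""
    reduction = annual_risk_before - annual_risk_after
    pv = sum(reduction / (1.0 + discount_rate) ** t
             for t in range(1, years + 1))
    return pv - investment

npv = risk_adjusted_npv(investment=100_000,
                        annual_risk_before=40_000,
                        annual_risk_after=15_000,
                        discount_rate=0.05,
                        years=10)
```

In the full Bayesian decision-theoretic treatment, the expected annual losses would themselves come from a quantitative risk analysis (fire frequencies times consequence distributions) rather than being point estimates.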
Risk Factors of Falls in Community-Dwelling Older Adults: Logistic Regression Tree Analysis
ERIC Educational Resources Information Center
Yamashita, Takashi; Noe, Douglas A.; Bailer, A. John
2012-01-01
Purpose of the Study: A novel logistic regression tree-based method was applied to identify fall risk factors and possible interaction effects of those risk factors. Design and Methods: A nationally representative sample of American older adults aged 65 years and older (N = 9,592) in the Health and Retirement Study 2004 and 2006 modules was used.…
Sensemaking of patient safety risks and hazards.
Battles, James B; Dixon, Nancy M; Borotkanics, Robert J; Rabin-Fastmen, Barbara; Kaplan, Harold S
2006-08-01
In order for organizations to become learning organizations, they must make sense of their environment and learn from safety events. Sensemaking, as described by Weick (1995), literally means making sense of events. The ultimate goal of sensemaking is to build the understanding that can inform and direct actions to eliminate risk and hazards that are a threat to patient safety. True sensemaking in patient safety must use both retrospective and prospective approaches to learning. Sensemaking is an essential part of the design process leading to risk informed design. Sensemaking serves as a conceptual framework to bring together well established approaches to assessment of risk and hazards: (1) at the single event level using root cause analysis (RCA), (2) at the processes level using failure modes effects analysis (FMEA) and (3) at the system level using probabilistic risk assessment (PRA). The results of these separate or combined approaches are most effective when end users in conversation-based meetings add their expertise and knowledge to the data produced by the RCA, FMEA, and/or PRA in order to make sense of the risks and hazards. Without ownership engendered by such conversations, the possibility of effective action to eliminate or minimize them is greatly reduced.
The physical vulnerability of elements at risk: a methodology based on fluid and classical mechanics
NASA Astrophysics Data System (ADS)
Mazzorana, B.; Fuchs, S.; Levaggi, L.
2012-04-01
The impacts of the flood events that occurred in autumn 2011 in the Italian regions of Liguria and Tuscany revived the engagement of public decision makers in enhancing, in synergy, flood control and land use planning. In this context, the design of efficient flood risk mitigation strategies and their subsequent implementation critically rely on a careful vulnerability analysis of both the immobile and mobile elements at risk potentially exposed to flood hazards. Based on notions from fluid and classical mechanics, we developed computation schemes enabling a dynamic vulnerability and risk analysis for a broad typological variety of elements at risk. The methodological skeleton consists of (1) hydrodynamic computation of the time-varying flood intensities, resulting for each element at risk in a succession of loading configurations; (2) modelling the mechanical response of the impacted elements through static, elasto-static and dynamic analyses; (3) characterising the mechanical response through proper structural damage variables; and (4) economic valuation of the expected losses as a function of the quantified damage variables. From a computational perspective, we coupled the description of the hydrodynamic flow behaviour and the induced structural modifications of the exposed elements at risk. Valuation methods suitable for supporting a correct mapping from the value domains of the physical damage variables to economic loss values are discussed. In this way we seek to complement, from a methodological perspective, the existing, mainly empirical, vulnerability and risk assessment approaches and to refine the conceptual framework of cost-benefit analysis. Moreover, we aim to support the design of effective flood risk mitigation strategies by diminishing the main criticalities within systems prone to flood risk.
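Step (1) of the skeleton above, the flood loading on a single element at risk, can be sketched with textbook fluid mechanics. The numbers (flow velocity, wetted area, structural resistance) are hypothetical illustrations, not values from the paper, and the damage variable is a deliberately crude demand-over-capacity ratio.

```python
RHO_WATER = 1000.0  # kg/m^3
G = 9.81            # m/s^2

def hydrodynamic_force(velocity, wetted_area, drag_coefficient=2.0):
    """Drag force of flowing water on an obstacle: F = 0.5 * rho * Cd * A * v^2."""
    return 0.5 * RHO_WATER * drag_coefficient * wetted_area * velocity ** 2

def hydrostatic_force(depth, wall_width):
    """Resultant hydrostatic force on a wall: F = 0.5 * rho * g * h^2 * w."""
    return 0.5 * RHO_WATER * G * depth ** 2 * wall_width

def damage_variable(total_load, structural_resistance):
    """A simple dimensionless damage variable: demand over capacity, capped at 1."""
    return min(total_load / structural_resistance, 1.0)

# hypothetical loading configuration: 2 m/s flow over a 6 m^2 wetted face,
# plus 1.5 m of standing water against a 4 m wide wall
load = hydrodynamic_force(2.0, 6.0) + hydrostatic_force(1.5, 4.0)  # in newtons
dv = damage_variable(load, 200e3)  # against an assumed 200 kN resistance
```

In the paper's framework this computation would be repeated for each loading configuration in the time-varying succession, with the damage variable then mapped to an economic loss.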
2011-01-01
stealth features requiring specialised noise and vibration skills, and propulsion plants requiring other unique skill sets. Personnel with these… analysis: acoustic, wake, thermal, electromagnetic, and other signature analysis; combat systems and ship control; combat system integration, combat system… to-diagnose flow-induced radiated noise; own-sensor performance degradation. Note: risks can be reduced for given designs using scale models
NASA Astrophysics Data System (ADS)
Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang
2010-05-01
CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate its dependence on uncertain parameters (porosity, permeability, etc.) and design parameters (injection rate, depth, etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation, and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al., Computational Geosciences 13, 2009). A reasonable compromise between computational effort and precision was already reached with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation.
We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis leads to a systematic and significant shift of predicted leakage rates towards higher values compared with deterministic simulations, affecting both risk estimates and the design of injection scenarios. This implies that neglecting uncertainty can be a strong simplification when modeling CO2 injection, and the consequences can be stronger than those of neglecting several physical phenomena (e.g. phase transition, convective mixing, capillary forces, etc.). The authors would like to thank the German Research Foundation (DFG) for financial support of the project within the Cluster of Excellence in Simulation Technology (EXC 310/1) at the University of Stuttgart. Keywords: polynomial chaos; CO2 storage; multiphase flow; porous media; risk assessment; uncertainty; integrative response surfaces
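The core of a polynomial chaos expansion of this kind can be sketched in a few lines. This toy uses a one-parameter stand-in "simulator" and a Monte Carlo projection onto a second-order Hermite basis rather than the paper's probabilistic collocation scheme; the model `exp(0.3*x)` and sample size are illustrative assumptions.

```python
import math
import random

random.seed(0)

def model(x):
    # toy stand-in for an expensive CO2-storage simulator: response to one
    # standardized uncertain parameter (e.g., a log-permeability deviate)
    return math.exp(0.3 * x)

# probabilists' Hermite polynomials He_0..He_2, orthogonal under N(0,1)
hermite = [lambda x: 1.0, lambda x: x, lambda x: x * x - 1.0]
norms = [1.0, 1.0, 2.0]  # E[He_k(X)^2] = k!

samples = [random.gauss(0.0, 1.0) for _ in range(100000)]

# non-intrusive projection: c_k = E[model(X) * He_k(X)] / E[He_k(X)^2]
coeffs = [sum(model(x) * h(x) for x in samples) / len(samples) / n
          for h, n in zip(hermite, norms)]

def surrogate(x):
    """Second-order polynomial chaos surrogate of the model."""
    return sum(c * h(x) for c, h in zip(coeffs, hermite))
```

Once built, the surrogate is essentially free to evaluate, so failure probabilities and design sweeps that would be prohibitive on the full simulator become cheap; the surrogate's mean under N(0,1) is simply `coeffs[0]`.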
Risk Informed Design and Analysis Criteria for Nuclear Structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salmon, Michael W.
2015-06-17
Target performance can be achieved by defining the design basis ground motion from the results of a probabilistic seismic hazard assessment and introducing known levels of conservatism in the design above the DBE. ASCE 4, ASCE 43, and DOE-STD-1020 define the DBE at an annual exceedance frequency of 4x10^-4 and introduce only slight levels of conservatism in response; they assume code capacities target about a 98% non-exceedance probability (NEP). There is a need for a uniform target (98% NEP) for code developers (ACI, AISC, etc.) to aim for. In considering strengthening options, one must also weigh the cost against the risk reduction achieved.
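The arithmetic behind this risk-informed target can be sketched directly. The fragility parameters (median capacity 0.68 g, logarithmic dispersion 0.4) and the 0.3 g DBE demand below are hypothetical illustrations; only the 4x10^-4 hazard frequency and 98% NEP come from the abstract, and the one-term risk product is a deliberate simplification of a full hazard-fragility convolution.

```python
import math

def lognormal_cdf(x, median, beta):
    """Fragility curve: probability of failure at seismic demand x (in g)."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

H_DBE = 4e-4   # annual exceedance frequency of the design basis earthquake
NEP = 0.98     # non-exceedance probability targeted by the code capacity

# designing to the 98% NEP means the conditional failure probability at the
# DBE demand is about 1 - NEP = 0.02; with an assumed fragility of median
# 0.68 g and beta 0.4, a 0.3 g DBE demand lands near that target:
p_fail_at_dbe = lognormal_cdf(0.3, 0.68, 0.4)   # roughly 0.02

# crude annual risk estimate if failures are dominated by DBE-level shaking
annual_risk = H_DBE * (1.0 - NEP)               # 8e-6 per year
```

This is the sense in which "known levels of conservatism above the DBE" translate a 4x10^-4 hazard into a much smaller annual failure frequency.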
[Eco-epidemiology: towards epidemiology of complexity].
Bizouarn, Philippe
2016-05-01
In order to solve public health problems posed by the epidemiology of risk factors centered on the individual and neglecting the causal processes linking the risk factors with the health outcomes, Mervyn Susser proposed a multilevel epidemiology called eco-epidemiology, addressing the interdependence of individuals and their connection with the molecular, individual, societal, and environmental levels of organization participating in the causal disease processes. The aim of this epidemiology is to integrate more than one level of organization in the design, analysis, and interpretation of health problems. After presenting the main criticisms of risk-factor epidemiology focused on the individual, we will try to show how eco-epidemiology and its development could help to understand the need for a broader and integrative epidemiology, in which studies designed to identify risk factors would be balanced by studies designed to answer other questions equally vital to public health. © 2016 médecine/sciences – Inserm.
Market-implied spread for earthquake CAT bonds: financial implications of engineering decisions.
Damnjanovic, Ivan; Aslan, Zafer; Mander, John
2010-12-01
In the event of natural and man-made disasters, owners of large-scale infrastructure facilities (assets) need contingency plans to restore operations effectively within acceptable timescales. Traditionally, the insurance sector provides coverage against potential losses. However, there are many problems associated with this traditional approach to risk transfer, including counterparty risk and litigation. Recently, a number of innovative risk mitigation methods, termed alternative risk transfer (ART) methods, have been introduced to address these problems. One of the most important ART methods is catastrophe (CAT) bonds. The objective of this article is to develop an integrative model that links engineering design parameters with financial indicators, including spread and bond rating. The developed framework is based on a four-step structural loss model and a transformed survival model to determine expected excess returns. We illustrate the framework for a seismically designed bridge using two unique CAT bond contracts. The results show a nonlinear relationship between engineering design parameters and market-implied spread. © 2010 Society for Risk Analysis.
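The basic link from engineering risk to a bond spread can be sketched with a toy pricing heuristic. All numbers below (trigger probability, loss fraction, risk multiple) are hypothetical, and the single-factor "multiple of expected loss" rule is a common market shorthand, not the paper's four-step loss model.

```python
# annual probability that the bond's trigger (e.g., shaking severe enough to
# damage the seismically designed bridge) is exceeded, and the fraction of
# principal lost if it is -- both outputs of an engineering loss analysis
p_trigger = 0.01
loss_given_trigger = 0.6

expected_loss = p_trigger * loss_given_trigger   # per unit of principal

# pricing heuristic: investors demand a spread that is a multiple of expected
# loss; the multiple is a market-implied risk premium that varies with rating
# and market conditions (2.5 here is an assumed value)
risk_multiple = 2.5
spread = risk_multiple * expected_loss           # 0.015 = 150 bps per year
```

The engineering design parameters enter through `p_trigger` and `loss_given_trigger`: a stiffer or better-isolated bridge lowers both, and the spread responds nonlinearly once the full loss model replaces this one-line product.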
Staccini, P; Quaranta, J F; Staccini-Myx, A; Veyres, P; Jambou, P
2003-09-01
Nowadays, the information system is recognised as one of the key points of management strategy. An information system is conceptualised as a means of linking three aspects of a firm: structure, organisational rules, and staff. Its design and implementation have to meet the objectives of medical and economic evaluation, especially risk management objectives. In order to identify, analyse, reduce, and prevent the occurrence of adverse events, and also to measure the efficacy and efficiency of the production of care services, the design of information systems should be based on a process analysis describing and classifying all the working practices within the hospital. According to various methodologies (usually top-down analysis), each process can be divided into activities. Each activity (especially each care activity) can be described according to its potential risks and expected results. For care professionals performing a task, access to official or internal guidelines and to adverse event reporting forms also has to be defined. Putting together all the elements of such a process analysis will help integrate the management of risks, supported by the information system, into daily practice.
Evaluation of risk communication in a mammography patient decision aid.
Klein, Krystal A; Watson, Lindsey; Ash, Joan S; Eden, Karen B
2016-07-01
We characterized patients' comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified perceived importance of numeric risk information in medical decision making. Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest-posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Participants' positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis, we identified themes describing why participants value quantitative risk information, and obstacles to understanding. We describe ways the most complicated graphic was incompletely comprehended. Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including potential for confusion between statistics, should be identified and mitigated in PtDA design. Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Reducing the Risk of Human Space Missions with INTEGRITY
NASA Technical Reports Server (NTRS)
Jones, Harry W.; Dillon-Merill, Robin L.; Tri, Terry O.; Henninger, Donald L.
2003-01-01
The INTEGRITY Program will design and operate a test bed facility to help prepare for future beyond-LEO missions. The purpose of INTEGRITY is to enable future missions by developing, testing, and demonstrating advanced human space systems. INTEGRITY will also implement and validate advanced management techniques, including risk analysis and mitigation. One important way INTEGRITY will help enable future missions is by reducing their risk. A risk analysis of human space missions is important in defining the steps that INTEGRITY should take to mitigate risk. This paper describes how a Probabilistic Risk Assessment (PRA) of human space missions will help support the planning and development of INTEGRITY to maximize its benefits to future missions. PRA is a systematic methodology that decomposes the system into subsystems and components and quantifies the failure risk as a function of the design elements and their corresponding probabilities of failure. PRA provides a quantitative estimate of the probability of failure of the system, including an assessment and display of the degree of uncertainty surrounding the probability. PRA provides a basis for understanding the impacts of decisions that affect safety, reliability, performance, and cost. Risks with both high probability and high impact are identified as top priority. The PRA of human missions beyond Earth orbit will help indicate how the risk of future human space missions can be reduced by integrating and testing systems in INTEGRITY.
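The decomposition step of a PRA can be sketched with the two standard combination rules for independent elements. The subsystem names and failure probabilities below are hypothetical placeholders, not INTEGRITY numbers.

```python
def series_failure(probs):
    """System fails if ANY element in the chain fails (no redundancy)."""
    p_ok = 1.0
    for p in probs:
        p_ok *= (1.0 - p)
    return 1.0 - p_ok

def parallel_failure(probs):
    """Redundant group fails only if ALL trains fail (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# hypothetical subsystem failure probabilities for one mission phase
life_support = parallel_failure([0.01, 0.01])   # two redundant strings
propulsion = 0.002
avionics = 0.001
p_loss_of_mission = series_failure([life_support, propulsion, avionics])
```

Even this toy shows the PRA payoff: redundancy drives the life-support contribution to 1e-4, so the ranked risk drivers are propulsion and avionics, which is where mitigation effort should go first.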
Multi-mode reliability-based design of horizontal curves.
Essa, Mohamed; Sayed, Tarek; Hussein, Mohamed
2016-08-01
Recently, reliability analysis has been advocated as an effective approach to account for uncertainty in the geometric design process and to evaluate the risk associated with a particular design. In this approach, a risk measure (e.g. probability of noncompliance) is calculated to represent the probability that a specific design would not meet standard requirements. The majority of previous applications of reliability analysis in geometric design focused on evaluating the probability of noncompliance for only one mode of noncompliance such as insufficient sight distance. However, in many design situations, more than one mode of noncompliance may be present (e.g. insufficient sight distance and vehicle skidding at horizontal curves). In these situations, utilizing a multi-mode reliability approach that considers more than one failure (noncompliance) mode is required. The main objective of this paper is to demonstrate the application of multi-mode (system) reliability analysis to the design of horizontal curves. The process is demonstrated by a case study of Sea-to-Sky Highway located between Vancouver and Whistler, in southern British Columbia, Canada. Two noncompliance modes were considered: insufficient sight distance and vehicle skidding. The results show the importance of accounting for several noncompliance modes in the reliability model. The system reliability concept could be used in future studies to calibrate the design of various design elements in order to achieve consistent safety levels based on all possible modes of noncompliance. Copyright © 2016 Elsevier Ltd. All rights reserved.
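The multi-mode probability of noncompliance can be estimated with a short Monte Carlo sketch. The distributions and design values below (sight distance demand, friction supply) are hypothetical illustrations, not the Sea-to-Sky Highway case study.

```python
import random

random.seed(42)

N = 100_000
n_sight = n_skid = n_any = 0
for _ in range(N):
    # hypothetical demands with design-stage uncertainty
    required_sight = random.gauss(80.0, 12.0)    # m, stopping sight distance demand
    available_sight = 95.0                       # m, supplied by the curve design
    demanded_friction = random.gauss(0.28, 0.05)
    supplied_friction = 0.38                     # design side-friction supply

    sight_fail = required_sight > available_sight
    skid_fail = demanded_friction > supplied_friction
    n_sight += sight_fail
    n_skid += skid_fail
    n_any += (sight_fail or skid_fail)

p_sight = n_sight / N   # single-mode: insufficient sight distance
p_skid = n_skid / N     # single-mode: vehicle skidding
p_system = n_any / N    # multi-mode (system) probability of noncompliance
```

The point of the system formulation shows up directly: `p_system` is at least as large as either single-mode probability, so evaluating one mode alone understates the design risk.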
Genre, Ludivine; Roché, Henri; Varela, Léonel; Kanoun, Dorra; Ouali, Monia; Filleron, Thomas; Dalenc, Florence
2017-02-01
Survival of patients with metastatic breast cancer (MBC) suffering from brain metastasis (BM) is limited, and this event is usually fatal. In 2010, Graesslin's nomogram was published to predict subsequent BM in patients with breast cancer (BC) with extra-cerebral metastatic disease. This model aims to select a patient population at high risk of BM and thus facilitate the design of prevention strategies and/or the assessment of early treatment of BM in prospective clinical studies. External validation of the nomogram was retrospectively performed on patients with BC who later developed BM between January 2005 and December 2012, treated in our institution. Moreover, risk factors for BM appearance were studied by Fine and Gray's competing-risk analysis. Among 492 patients with MBC, 116 developed subsequent BM. Seventy of them were included in the nomogram validation. Discrimination is good (area under the curve = 0.695 [95% confidence interval, 0.61-0.77]). Risk factors for BM appearance are: human epidermal growth factor receptor 2 (HER2) overexpression/amplification, triple-negative BC, and number of extra-cerebral metastatic sites (>1). With a competing-risk model, we highlight the nomogram's value for the HER2+ tumour subgroup exclusively. External validation of Graesslin's nomogram demonstrates its exportability and reproducibility. Importantly, the competing-risk model analysis provides additional information for the design of prospective trials concerning the early diagnosis of BM and/or preventive treatment in high-risk patients with extra-cerebral metastatic BC. Copyright © 2016 Elsevier Ltd. All rights reserved.
Design Process Improvement for Electric CAR Harness
NASA Astrophysics Data System (ADS)
Sawatdee, Thiwarat; Chutima, Parames
2017-06-01
In an automobile parts design company, customer satisfaction is one of the most important factors in product design. Therefore, the company focuses its product design process on the various requirements of customers, resulting in a high number of design changes. The objective of this research is to improve the design process of the electric car harness, which affects production scheduling, by using Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA) as the main tools. FTA is employed for root cause analysis, and FMEA is used to rank Risk Priority Numbers (RPNs), which show the priority of the factors in the electric car harness that have a high impact on its design. After the implementation, the improvements are significant: the rate of design changes is reduced from 0.26% to 0.08%.
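The FMEA ranking step is simple enough to sketch directly: each failure mode gets 1-10 scores for severity, occurrence, and detection, and the Risk Priority Number is their product. The failure modes and scores below are hypothetical examples, not items from the study.

```python
# hypothetical harness-design failure modes: (description, S, O, D), each 1-10
failure_modes = [
    ("wrong connector pinout in drawing", 8, 4, 5),
    ("late customer design change not propagated", 6, 7, 6),
    ("harness length mismatch with car body", 5, 3, 3),
]

def rpn(severity, occurrence, detection):
    """Risk Priority Number: higher means address first."""
    return severity * occurrence * detection

ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
top_risk = ranked[0][0]
```

Note the characteristic FMEA outcome: the most severe mode (severity 8) is not the top priority, because a frequent, hard-to-detect mode accumulates a larger RPN.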
Chen, Liangyong; Ma, Zujun
2015-01-01
The perceived risk of nonremunerated blood donation (NRBD) is one of the most important factors hindering the Chinese public from donating blood. To understand deeply and measure scientifically the public's perceived risk of NRBD, in this paper qualitative and quantitative methods were used to explore the construct of perceived risk of NRBD in the Chinese context. Firstly, the preliminary construct of perceived risk of NRBD was developed based on grounded theory. Then, a measurement scale of perceived risk of NRBD was designed. Finally, exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) were adopted for testing and verifying the construct. The results show that the construct of perceived risk of NRBD has three core dimensions, namely, trust risk, psychological risk, and health risk, which provides a clear construct and concise scale to better capture the Chinese public's perceived risk of NRBD. Blood collection agencies can strategically make policies about perceived risk reduction to maximize the public's NRBD behavior. PMID:26526570
Wallops Ship Surveillance System
NASA Technical Reports Server (NTRS)
Smith, Donna C.
2011-01-01
Approved as a Wallops control center backup system, the Wallops Ship Surveillance Software is a day-of-launch risk analysis tool for spaceport activities. The system calculates impact probabilities and displays ship locations relative to boundary lines. It enables rapid analysis of possible flight paths to preclude the need to cancel launches and allow execution of launches in a timely manner. Its design is based on low-cost, large-customer-base elements including personal computers, the Windows operating system, C/C++ object-oriented software, and network interfaces. In conformance with the NASA software safety standard, the system is designed to ensure that it does not falsely report a safe-for-launch condition. To improve the current ship surveillance method, the system is designed to prevent delay of launch under a safe-for-launch condition. A single workstation is designated the controller of the official ship information and the official risk analysis. Copies of this information are shared with other networked workstations. The program design is divided into five subsystem areas: 1. Communication Link -- threads that control the networking of workstations; 2. Contact List -- a thread that controls a list of protected item (ocean vessel) information; 3. Hazard List -- threads that control a list of hazardous item (debris) information and associated risk calculation information; 4. Display -- threads that control operator inputs and screen display outputs; and 5. Archive -- a thread that controls archive file read and write access. Currently, most of the hazard list thread and parts of other threads are being reused as part of a new ship surveillance system, under the SureTrak project.
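The geometric core of a ship-versus-hazard-boundary check is a point-in-polygon test. This is a generic sketch with hypothetical vessel positions and a hazard box, not the Wallops system's actual algorithm or data; note it errs on the side of never falsely reporting safe only because the geometry is exact, real systems add position uncertainty margins.

```python
def inside(point, polygon):
    """Ray-casting test: is a ship position inside a hazard boundary polygon?"""
    x, y = point
    hit = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # count crossings of a horizontal ray extending to the right of (x, y)
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                hit = not hit
    return hit

hazard_zone = [(0, 0), (10, 0), (10, 5), (0, 5)]   # hypothetical debris box (km)
ships = {"vessel_a": (3.0, 2.0), "vessel_b": (12.0, 2.0)}
in_zone = {name: inside(pos, hazard_zone) for name, pos in ships.items()}

# a safe-for-launch condition requires no protected vessel inside any hazard area
safe_for_launch = not any(in_zone.values())
```

In a full system this check would run per hazard area per contact, feeding the displayed boundary-line picture and the aggregate impact-probability calculation.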
Cost Estimation and Control for Flight Systems
NASA Technical Reports Server (NTRS)
Hammond, Walter E.; Vanhook, Michael E. (Technical Monitor)
2002-01-01
Good program management practices, cost analysis, cost estimation, and cost control for aerospace flight systems are interrelated and depend upon each other. The best cost control process cannot overcome poor design or poor systems trades that lead to the wrong approach. The project needs robust technical, schedule, cost, risk, and cost-risk practices before it can incorporate adequate cost control. Cost analysis both precedes and follows cost estimation -- the two are closely coupled with each other and with risk analysis. Parametric cost estimating relationships and computerized models are most often used. NASA has learned some valuable lessons in controlling cost problems, and recommends use of a summary Project Manager's checklist as shown here.
Hygrothermal Simulation: A Tool for Building Envelope Design Analysis
Samuel V. Glass; Anton TenWolde; Samuel L. Zelinka
2013-01-01
Is it possible to gauge the risk of moisture problems while designing the building envelope? This article provides a brief introduction to computer-based hygrothermal (heat and moisture) simulation, shows how simulation can be useful as a design tool, and points out a number of important considerations regarding model inputs and limitations. Hygrothermal simulation...
Nanotoxicology and nanomedicine: making development decisions in an evolving governance environment
NASA Astrophysics Data System (ADS)
Rycroft, Taylor; Trump, Benjamin; Poinsatte-Jones, Kelsey; Linkov, Igor
2018-02-01
The fields of nanomedicine, risk analysis, and decision science have evolved considerably in the past decade, providing developers of nano-enabled therapies and diagnostic tools with more complete information than ever before and shifting a fundamental requisite of the nanomedical community from the need for more information about nanomaterials to the need for a streamlined method of integrating the abundance of nano-specific information into higher-certainty product design decisions. The crucial question facing nanomedicine developers that must select the optimal nanotechnology in a given situation has shifted from "how do we estimate nanomaterial risk in the absence of good risk data?" to "how can we derive a holistic characterization of the risks and benefits that a given nanomaterial may pose within a specific nanomedical application?" Many decision support frameworks have been proposed to assist with this inquiry; however, those based in multicriteria decision analysis have proven to be most adaptive in the rapidly evolving field of nanomedicine—from the early stages of the field when conditions of significant uncertainty and incomplete information dominated, to today when nanotoxicology and nano-environmental health and safety information is abundant but foundational paradigms such as chemical risk assessment, risk governance, life cycle assessment, safety-by-design, and stakeholder engagement are undergoing substantial reformation in an effort to address the needs of emerging technologies. In this paper, we reflect upon 10 years of developments in nanomedical engineering and demonstrate how the rich knowledgebase of nano-focused toxicological and risk assessment information developed over the last decade enhances the capability of multicriteria decision analysis approaches and underscores the need to continue the transition from traditional risk assessment towards risk-based decision-making and alternatives-based governance for emerging technologies.
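The multicriteria decision analysis approach the abstract describes can be sketched with the simplest MCDA scheme, a weighted sum. The alternatives, criteria, scores, and weights below are entirely hypothetical; real nanomedical MCDA would use elicited stakeholder weights and many more criteria (and often a more elaborate aggregation than a linear sum).

```python
# hypothetical criteria scores (0-1, higher is better) for two candidate
# nanomaterial formulations; "toxicity_margin" is higher when safer
alternatives = {
    "liposome_a":  {"efficacy": 0.8, "toxicity_margin": 0.6, "cost": 0.5},
    "dendrimer_b": {"efficacy": 0.7, "toxicity_margin": 0.9, "cost": 0.4},
}
# assumed stakeholder weights, summing to 1
weights = {"efficacy": 0.5, "toxicity_margin": 0.35, "cost": 0.15}

def mcda_score(scores):
    """Weighted-sum aggregation of criterion scores."""
    return sum(weights[c] * v for c, v in scores.items())

best = max(alternatives, key=lambda a: mcda_score(alternatives[a]))
```

The value of the framework is visible even here: the alternative with the best single criterion (efficacy) is not the overall winner once risk information enters with real weight, which is exactly the risk-benefit integration the paper argues for.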
Starup-Linde, Jakob; Karlstad, Øystein; Eriksen, Stine Aistrup; Vestergaard, Peter; Bronsveld, Heleen K.; de Vries, Frank; Andersen, Morten; Auvinen, Anssi; Haukka, Jari; Hjellvik, Vidar; Bazelier, Marloes T.; de Boer, Anthonius; Furu, Kari; De Bruin, Marie L.
2013-01-01
Background: Patients suffering from diabetes mellitus (DM) may experience an increased risk of cancer; however, it is not certain whether this effect is due to diabetes per se. Objective: To examine the association between DM and cancers by a systematic review and meta-analysis according to the PRISMA guidelines. Data Sources: The systematic literature search includes Medline at PubMed, Embase, Cinahl, Bibliotek.dk, Cochrane library, Web of Science and SveMed+ with the search terms: “Diabetes mellitus”, “Neoplasms”, and “Risk of cancer”. Study Eligibility Criteria: The included studies compared the risk of cancer in diabetic patients versus non-diabetic patients. All types of observational study designs were included. Results: Diabetes patients were at a substantially increased risk of liver (RR=2.1), and pancreas (RR=2.2) cancer. Modestly elevated significant risks were also found for ovary (RR=1.2), breast (RR=1.1), cervix (RR=1.3), endometrial (RR=1.4), several digestive tract (RR=1.1-1.5), kidney (RR=1.4), and bladder cancer (RR=1.1). The findings were similar for men and women, and unrelated to study design. Meta-regression analyses showed limited effect modification of body mass index, and possible effect modification of age, gender, with some influence of study characteristics (population source, cancer- and diabetes ascertainment). Limitations: Publication bias seemed to be present. Only published data were used in the analyses. Conclusions: The systematic review and meta-analysis confirm the previous results of increased cancer risk in diabetes and extend this to additional cancer sites. Physicians in contact with patients with diabetes should be aware that diabetes patients are at an increased risk of cancer. PMID:24215312
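The pooling step of such a meta-analysis can be sketched with fixed-effect inverse-variance weighting of log relative risks. The three study results below are hypothetical inputs, not the review's data, and a real analysis of this kind would typically add a random-effects model and heterogeneity statistics.

```python
import math

# hypothetical study results for one cancer site: (RR, 95% CI low, 95% CI high)
studies = [(2.3, 1.8, 2.9), (1.9, 1.4, 2.6), (2.0, 1.5, 2.7)]

def pooled_rr(studies):
    """Fixed-effect inverse-variance pooling on the log-RR scale."""
    total_weight = 0.0
    weighted_log_sum = 0.0
    for rr, lo, hi in studies:
        log_rr = math.log(rr)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from the CI width
        w = 1.0 / se ** 2                                # inverse-variance weight
        total_weight += w
        weighted_log_sum += w * log_rr
    return math.exp(weighted_log_sum / total_weight)

rr_pooled = pooled_rr(studies)
```

Pooling on the log scale keeps the estimator symmetric in relative terms; narrower confidence intervals (smaller standard errors) automatically pull the pooled estimate toward the more precise studies.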
A major uncertainty that has long been recognized in evaluating chemical toxicity is accounting for metabolic activation of chemicals resulting in increased toxicity. In silico approaches to predict chemical metabolism and to subsequently screen and prioritize chemicals for risk ...
Tool for Rapid Analysis of Monte Carlo Simulations
NASA Technical Reports Server (NTRS)
Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.
2011-01-01
Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphics processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
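The variable-ranking idea behind a tool like TRAM can be illustrated with a minimal sketch. The function name, the screening statistic, and the toy data below are illustrative assumptions, not TRAM's actual algorithm: given a table of dispersed Monte Carlo inputs and a pass/fail flag per run, rank variables by how strongly they separate failed runs from successful ones.

```python
import numpy as np

def rank_failure_drivers(inputs, failed):
    """Rank dispersed input variables by how strongly they separate
    failed runs from successful ones (standardized mean difference).

    inputs : dict of variable name -> 1-D array, one value per Monte Carlo run
    failed : boolean array, True where the run violated a requirement
    """
    scores = {}
    for name, values in inputs.items():
        values = np.asarray(values, dtype=float)
        bad, good = values[failed], values[~failed]
        pooled_sd = np.sqrt(0.5 * (bad.var(ddof=1) + good.var(ddof=1)))
        scores[name] = abs(bad.mean() - good.mean()) / pooled_sd if pooled_sd > 0 else 0.0
    # Highest score first: the variables most associated with failure
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy data set: 1000 runs, failures driven only by 'wind_speed'
rng = np.random.default_rng(0)
n = 1000
inputs = {"wind_speed": rng.normal(10, 3, n), "mass_error": rng.normal(0, 1, n)}
failed = inputs["wind_speed"] > 14  # hypothetical failure criterion
ranking = rank_failure_drivers(inputs, failed)
print(ranking[0][0])  # 'wind_speed' tops the ranking
```

A real tool would add smarter statistics and visualization, but the core task, pointing the analyst at the variables that drive a failure mode, reduces to a screening score like this.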
The use of multi-dimensional flow and morphodynamic models for restoration design analysis
NASA Astrophysics Data System (ADS)
McDonald, R.; Nelson, J. M.
2013-12-01
River restoration projects with the goal of restoring a wide range of morphologic and ecologic channel processes and functions have become common. The complex interactions between flow and sediment transport make it challenging to design river channels that are both self-sustaining and improve ecosystem function. The relative immaturity of the field of river restoration and shortcomings in existing methodologies for evaluating channel designs contribute to this problem, often leading to project failures. The call for increased monitoring of constructed channels to evaluate which restoration techniques do and do not work is ubiquitous and may lead to improved channel restoration projects. However, an alternative approach is to detect project flaws before the channels are built by using numerical models to simulate hydraulic and sediment-transport processes and habitat in the proposed channel (Restoration Design Analysis). Multi-dimensional models provide spatially distributed quantities throughout the project domain that may be used to quantitatively evaluate restoration designs for such important metrics as (1) the change in water-surface elevation, which can affect the extent and duration of floodplain reconnection; (2) sediment transport and morphologic change, which can affect channel stability and the long-term maintenance of the design; and (3) habitat changes. These models also provide an efficient way to evaluate such quantities over a range of appropriate discharges, including low-probability events, which often pose the greatest risk to the long-term stability of restored channels. Currently there are many free and open-source modeling frameworks available for such analysis, including iRIC, Delft3D, and TELEMAC. In this presentation we give examples of Restoration Design Analysis for each of the metrics above from projects on the Russian River, CA and the Kootenai River, ID.
These examples demonstrate how detailed Restoration Design Analysis can be used to guide design elements and how this method can point out potential stability problems or other risks before designs proceed to the construction phase.
ERIC Educational Resources Information Center
Trenholm, Christopher; Devaney, Barbara; Fortson, Kenneth; Clark, Melissa; Bridgespan, Lisa Quay; Wheeler, Justin
2008-01-01
This paper examines the impacts of four abstinence-only education programs on adolescent sexual activity and risks of pregnancy and sexually transmitted diseases (STDs). Based on an experimental design, the impact analysis uses survey data collected in 2005 and early 2006 from more than 2,000 teens who had been randomly assigned to either a…
Design Analysis Kit for Optimization and Terascale Applications 6.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-19
Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models from desktop to high performance computers.
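Dakota's simplest mode, a parameter variation study driven through a generic interface to a computational model, can be sketched generically. This stand-in sweep is an illustration of the idea, not Dakota's interface; the model and parameter names are hypothetical.

```python
import itertools

def parameter_study(model, ranges, points_per_dim=5):
    """Full-factorial parameter sweep over a black-box model,
    in the spirit of a basic parameter variation study.

    model  : callable taking keyword parameters, returning a scalar response
    ranges : dict of parameter name -> (low, high)
    """
    grids = {
        name: [lo + i * (hi - lo) / (points_per_dim - 1) for i in range(points_per_dim)]
        for name, (lo, hi) in ranges.items()
    }
    names = list(grids)
    results = []
    for combo in itertools.product(*(grids[n] for n in names)):
        params = dict(zip(names, combo))
        results.append((params, model(**params)))
    return results

# Hypothetical simulation stand-in: a trivially simple response
def beam_deflection(load, stiffness):
    return load / stiffness

runs = parameter_study(beam_deflection, {"load": (100.0, 500.0), "stiffness": (1e3, 5e3)})
worst = max(runs, key=lambda r: r[1])
print(len(runs), worst[0])  # 25 runs; worst case at max load, min stiffness
```

In practice the `model` callable would wrap a driver script around an expensive simulation, which is exactly the "generic interface" role the abstract describes.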
Bønes, Erlend; Hasvold, Per; Henriksen, Eva; Strandenaes, Thomas
2007-09-01
Instant messaging (IM) is suited for immediate communication because messages are delivered almost in real time. Results from studies of IM use in enterprise work settings make us believe that IM-based services may prove useful also within the healthcare sector. However, today's public instant messaging services do not have the level of information security required for adoption of IM in healthcare. We propose MedIMob, our own architecture for a secure enterprise IM service for use in healthcare. MedIMob supports IM clients on mobile devices in addition to desktop-based clients. Security threats were identified in a risk analysis of the MedIMob architecture. The risk analysis process consists of context identification, threat identification, analysis of consequences and likelihood, risk evaluation, and proposals for risk treatment. The risk analysis revealed a number of potential threats to the information security of a service like this. Many of the identified threats are general when dealing with mobile devices and sensitive data; others are more specific to our service and architecture. Individual threats identified in the risk analysis are discussed and possible countermeasures are presented. The risk analysis showed that most of the proposed risk treatment measures must be implemented to obtain an acceptable risk level; among them, blocking much of the additional functionality of the smartphone. To conclude on the usefulness of this IM service, it will be evaluated in a trial study of the human-computer interaction. Further work also includes an improved design of the proposed MedIMob architecture. 2006 Elsevier Ireland Ltd
Simulation Assisted Risk Assessment Applied to Launch Vehicle Conceptual Design
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Go, Susie; Gee, Ken; Lawrence, Scott
2008-01-01
A simulation-based risk assessment approach is presented and is applied to the analysis of abort during the ascent phase of a space exploration mission. The approach utilizes groupings of launch vehicle failures, referred to as failure bins, which are mapped to corresponding failure environments. Physical models are used to characterize the failure environments in terms of the risk due to blast overpressure, resulting debris field, and the thermal radiation due to a fireball. The resulting risk to the crew is dynamically modeled by combining the likelihood of each failure, the severity of the failure environments as a function of initiator and time of the failure, the robustness of the crew module, and the warning time available due to early detection. The approach is shown to support the launch vehicle design process by characterizing the risk drivers and identifying regions where failure detection would significantly reduce the risk to the crew.
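The aggregation the abstract describes, combining the likelihood of each failure bin with the severity of its failure environment, amounts to a total-probability sum over bins. A hedged sketch follows; the bin names and all numbers are purely illustrative and are not from the paper.

```python
def crew_risk(failure_bins):
    """Aggregate loss-of-crew (LOC) risk over launch-vehicle failure bins
    by total probability: P(LOC) = sum_i P(failure_i) * P(LOC | failure_i).

    failure_bins: list of (name, p_failure, p_loc_given_failure)
    """
    contributions = {name: p_fail * p_loc for name, p_fail, p_loc in failure_bins}
    total = sum(contributions.values())
    # Rank risk drivers by their contribution to total P(LOC)
    drivers = sorted(contributions.items(), key=lambda kv: kv[1], reverse=True)
    return total, drivers

bins = [
    ("explosive breakup", 0.002, 0.50),   # blast overpressure / debris field
    ("fireball",          0.004, 0.10),   # thermal radiation
    ("benign shutdown",   0.010, 0.01),   # abort with ample warning time
]
total, drivers = crew_risk(bins)
print(round(total, 6), drivers[0][0])  # 0.0015 explosive breakup
```

The paper's dynamic model additionally makes the conditional term depend on failure initiator, time of failure, crew-module robustness, and warning time; the static sum above is only the skeleton of that calculation.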
Constellation Program (CxP) Crew Exploration Vehicle (CEV) Project Integrated Landing System
NASA Technical Reports Server (NTRS)
Baker, John D.; Yuchnovicz, Daniel E.; Eisenman, David J.; Peer, Scott G.; Fasanella, Edward L.; Lawrence, Charles
2009-01-01
The Crew Exploration Vehicle (CEV) Chief Engineer requested a risk comparison of the Integrated Landing System design developed by NASA and the design developed by the Contractor, referred to as the LM 604 baseline. Based on the results of this risk comparison, the CEV Chief Engineer requested that the NESC evaluate identified risks and develop strategies for their reduction or mitigation. The assessment progressed in two phases. A brief Phase I analysis was performed by the Water versus Land-Landing Team to compare the CEV Integrated Landing System proposed by the Contractor against the NASA TS-LRS001 baseline with respect to risk. A Phase II effort examined the areas of critical importance to the overall landing risk, evaluating risk to the crew and to the CEV Crew Module (CM) during a nominal land-landing. The findings of the assessment are contained in this report.
Risk analysis and its link with standards of the World Organisation for Animal Health.
Sugiura, K; Murray, N
2011-04-01
Among the agreements included in the treaty that created the World Trade Organization (WTO) in January 1995 is the Agreement on the Application of Sanitary and Phytosanitary Measures (SPS Agreement) that sets out the basic rules for food safety and animal and plant health standards. The SPS Agreement designates the World Organisation for Animal Health (OIE) as the organisation responsible for developing international standards for animal health and zoonoses. The SPS Agreement requires that the sanitary measures that WTO members apply should be based on science and encourages them to either apply measures based on the OIE standards or, if they choose to adopt a higher level of protection than that provided by these standards, apply measures based on a science-based risk assessment. The OIE also provides a procedural framework for risk analysis for its Member Countries to use. Despite the inevitable challenges that arise in carrying out a risk analysis of the international trade in animals and animal products, the OIE risk analysis framework provides a structured approach that facilitates the identification, assessment, management and communication of these risks.
Stakeholder Perceptions of Risk in Construction.
Zhao, Dong; McCoy, Andrew P; Kleiner, Brian M; Mills, Thomas H; Lingard, Helen
2016-02-01
Safety management in construction is an integral effort and its success requires inputs from all stakeholders across design and construction phases. Effective risk mitigation relies on the concordance of all stakeholders' risk perceptions. Many researchers have noticed the discordance of risk perceptions among critical stakeholders in safe construction work; however, few have provided quantifiable evidence describing it. In an effort to fill this perception gap, this research performs an experiment that investigates stakeholder perceptions of risk in construction. Data analysis confirms the existence of such discordance and indicates a trend in risk likelihood estimation: with risk perceptions from low to high, the stakeholders are architects, contractors/safety professionals, and engineers. Including prior studies, results also suggest that designers have improved their knowledge of building construction safety but, compared to builders, have more difficulty in reaching a consensus of perception. Findings of this research are intended to be used by risk management and decision makers to reassess stakeholders' varying judgments when considering injury prevention and hazard assessment.
Stakeholder Perceptions of Risk in Construction
Zhao, Dong; McCoy, Andrew P.; Kleiner, Brian M.; Mills, Thomas H.; Lingard, Helen
2015-01-01
Safety management in construction is an integral effort and its success requires inputs from all stakeholders across design and construction phases. Effective risk mitigation relies on the concordance of all stakeholders’ risk perceptions. Many researchers have noticed the discordance of risk perceptions among critical stakeholders in safe construction work; however, few have provided quantifiable evidence describing it. In an effort to fill this perception gap, this research performs an experiment that investigates stakeholder perceptions of risk in construction. Data analysis confirms the existence of such discordance and indicates a trend in risk likelihood estimation: with risk perceptions from low to high, the stakeholders are architects, contractors/safety professionals, and engineers. Including prior studies, results also suggest that designers have improved their knowledge of building construction safety but, compared to builders, have more difficulty in reaching a consensus of perception. Findings of this research are intended to be used by risk management and decision makers to reassess stakeholders’ varying judgments when considering injury prevention and hazard assessment. PMID:26441481
The Importance of Human Reliability Analysis in Human Space Flight: Understanding the Risks
NASA Technical Reports Server (NTRS)
Hamlin, Teri L.
2010-01-01
HRA is a method used to describe, qualitatively and quantitatively, the occurrence of human failures in the operation of complex systems that affect availability and reliability. Modeling human actions with their corresponding failure in a PRA (Probabilistic Risk Assessment) provides a more complete picture of the risk and risk contributions. A high quality HRA can provide valuable information on potential areas for improvement, including training, procedural, equipment design and need for automation.
Medical technology at home: safety-related items in technical documentation.
Hilbers, Ellen S M; de Vries, Claudette G J C A; Geertsma, Robert E
2013-01-01
This study aimed to investigate the technical documentation of manufacturers on issues of safe use of their devices in a home setting. Three categories of equipment were selected: infusion pumps, ventilators, and dialysis systems. Risk analyses, instructions for use, labels, and post-market surveillance procedures were requested from manufacturers. Additionally, they were asked to fill out a questionnaire on collection of field experience, on incidents, and on training activities. Specific risks of device operation by lay users in a home setting were incompletely addressed in the risk analyses. A substantial number of user manuals were designed for professionals rather than for patients or lay carers. Risk analyses and user information often showed incomplete coherence. Post-market surveillance was mainly based on passive collection of field experiences. Manufacturers of infusion pumps, ventilators, and dialysis systems pay insufficient attention to the specific risks of use by lay persons in home settings. It is expected that this conclusion is also applicable to other medical equipment for treatment at home. Manufacturers of medical equipment for home use should pay more attention to use errors, lay use, and home-specific risks in design, risk analysis, and user information. Field experiences should be collected more actively. Coherence between risk analysis and user information should be improved. Notified bodies should address these aspects in their assessments. User manuals issued by institutions supervising a specific home therapy should be drawn up in consultation with the manufacturer.
Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design
NASA Astrophysics Data System (ADS)
Iqbal, Liaquat Ullah
An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher fidelity Computer Aided Design, Engineering and Manufacturing (CAD, CAE and CAM) tools such as CATIA, FLUENT, ANSYS and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low to medium fidelity codes, such as the aerodynamic panel code CMARC and sizing and constraint analysis codes, thus providing multi-fidelity capabilities to the aircraft designer. The higher fidelity design information from the CAD and CAE tools for the geometry, aerodynamics, structural and environmental performance is provided for the application of structured design methods such as Quality Function Deployment (QFD) and Pugh's Method. The higher fidelity tools bring the quantitative aspects of a design, such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift over drag ratio, and structural weight, as well as the qualitative aspects, such as external geometry definition, internal layout, and coloring scheme, early into the design process. The performance and safety risks involved with new technologies can be reduced by modeling and assessing their impact on the performance of the aircraft more accurately. The methodology also enables the design and evaluation of novel concepts such as the blended wing body (BWB) and hybrid wing body (HWB) concepts. Higher fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of the claims for performance gains in aerodynamics and ascertain the risks of structural failure due to a different pressure distribution in the fuselage as compared with the tube-and-wing design. The higher fidelity aerodynamics and structural models can lead to better cost estimates that help reduce the financial risks as well.
This helps in achieving better designs with reduced risk in less time and at lower cost. The approach is shown to eliminate the traditional boundary between the conceptual and preliminary design stages, combining the two into one consolidated preliminary design phase. Several examples of the validation and utilization of the Multidisciplinary Design and Optimization (MDO) Tool are presented using missions for Medium and High Altitude Long Range/Endurance Unmanned Aerial Vehicles (UAVs).
Orbit Transfer Vehicle (OTV) engine, phase A study. Volume 2: Study
NASA Technical Reports Server (NTRS)
Mellish, J. A.
1979-01-01
The hydrogen-oxygen engine used in the orbit transfer vehicle is described. The engine design is analyzed, and minimum engine performance and man-rating requirements are discussed. Reliability and safety analysis test results are presented, and payload, risk and cost, and engine installation parameters are defined. Engine analyses were performed, including performance, structural, thermal, turbomachinery, controls, and cycle analyses.
Yekpe, Ketsia; Abatzoglou, Nicolas; Bataille, Bernard; Gosselin, Ryan; Sharkawi, Tahmer; Simard, Jean-Sébastien; Cournoyer, Antoine
2018-07-01
This study applied the concept of Quality by Design (QbD) to tablet dissolution. Its goal was to propose a quality control strategy to model dissolution testing of solid oral dose products according to International Conference on Harmonization guidelines. The methodology involved the following three steps: (1) a risk analysis to identify the material- and process-related parameters impacting the critical quality attributes of dissolution testing, (2) an experimental design to evaluate the influence of design factors (attributes and parameters selected by risk analysis) on dissolution testing, and (3) an investigation of the relationship between design factors and dissolution profiles. Results show that (a) in the case studied, the two parameters impacting dissolution kinetics are active pharmaceutical ingredient particle size distributions and tablet hardness and (b) these two parameters could be monitored with PAT tools to predict dissolution profiles. Moreover, based on the results obtained, modeling dissolution is possible. The practicality and effectiveness of the QbD approach were demonstrated through this industrial case study. Implementing such an approach systematically in industrial pharmaceutical production would reduce the need for tablet dissolution testing.
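Step (2) and step (3), evaluating the design factors identified by risk analysis and relating them to the dissolution response, can be sketched with a small factorial fit. The coded factor levels and synthetic response values below are illustrative assumptions, not the study's data.

```python
import numpy as np

# 2^2 full-factorial design with a centre point, in coded units (-1/+1):
# factor 1 = API particle size, factor 2 = tablet hardness
X = np.array([
    [-1, -1], [+1, -1], [-1, +1], [+1, +1], [0, 0],
], dtype=float)

# Synthetic % dissolved at 30 min: finer particles and softer tablets dissolve faster
y = np.array([92.0, 80.0, 85.0, 73.0, 82.5])

# Least-squares fit of y = b0 + b1*size + b2*hardness
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b0, b_size, b_hardness = coef
print(round(b_size, 1), round(b_hardness, 1))  # -6.0 -3.5
```

Both coefficients come out negative in this toy data set, i.e. larger particles and harder tablets reduce the amount dissolved, which is the kind of factor-to-response relationship the QbD workflow would then monitor with PAT tools.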
Aerospace Systems Design in NASA's Collaborative Engineering Environment
NASA Technical Reports Server (NTRS)
Monell, Donald W.; Piland, William M.
1999-01-01
Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to an inability to assess critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. 
NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.
Aerospace Systems Design in NASA's Collaborative Engineering Environment
NASA Technical Reports Server (NTRS)
Monell, Donald W.; Piland, William M.
2000-01-01
Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to an inability to assess critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. 
NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.
Aerospace Systems Design in NASA's Collaborative Engineering Environment
NASA Astrophysics Data System (ADS)
Monell, Donald W.; Piland, William M.
2000-07-01
Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to an inability to assess critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. 
NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.
The role of PRA in the safety assessment of VVER Nuclear Power Plants in Ukraine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kot, C.
1999-05-10
Ukraine operates thirteen (13) Soviet-designed pressurized water reactors (VVERs). All Ukrainian plants are currently operating with annually renewable permits until they update their safety analysis reports (SARs) in accordance with new SAR content requirements issued in September 1995 by the Nuclear Regulatory Authority and the Government Nuclear Power Coordinating Committee of Ukraine. The requirements are in three major areas: design-basis accident (DBA) analysis, probabilistic risk assessment (PRA), and beyond design-basis accident (BDBA) analysis. The last two requirements, on PRA and BDBA, are new, and the DBA requirements are an expanded version of the older SAR requirements. The US Department of Energy (USDOE), as part of its Soviet-Designed Reactor Safety activities, is providing assistance and technology transfer to Ukraine to support their nuclear power plants (NPPs) in developing a Western-type technical basis for the new SARs. USDOE-sponsored In-Depth Safety Assessments (ISAs) are in progress at three pilot nuclear reactor units in Ukraine, South Ukraine Unit 1, Zaporizhzhya Unit 5, and Rivne Unit 1, and a follow-on study has been initiated at Khmenytskyy Unit 1. The ISA projects encompass most areas of plant safety evaluation, but the initial emphasis is on performing a detailed, plant-specific Level 1 Internal Events PRA. This allows the early definition of the plant risk profile and the identification of risk-significant accident sequences and plant vulnerabilities, and provides guidance for the remainder of the safety assessments.
Use of benzodiazepine and risk of cancer: A meta-analysis of observational studies.
Kim, Hong-Bae; Myung, Seung-Kwon; Park, Yon Chul; Park, Byoungjin
2017-02-01
Several observational epidemiological studies have reported inconsistent results on the association between the use of benzodiazepine and the risk of cancer. We investigated the association by using a meta-analysis. We searched PubMed, EMBASE, and the bibliographies of relevant articles to locate additional publications in January 2016. Three evaluators independently reviewed and selected eligible studies based on predetermined selection criteria. Of 796 articles meeting our initial criteria, a total of 22 observational epidemiological studies with 18 case-control studies and 4 cohort studies were included in the final analysis. Benzodiazepine use was significantly associated with an increased risk of cancer (odds ratio [OR] or relative risk [RR] 1.19; 95% confidence interval 1.16-1.21) in a random-effects meta-analysis of all studies. Subgroup meta-analyses by various factors such as study design, type of case-control study, study region, and methodological quality of study showed consistent findings. Also, a significant dose-response relationship was observed between the use of benzodiazepine and the risk of cancer (p for trend <0.01). The current meta-analysis of observational epidemiological studies suggests that benzodiazepine use is associated with an increased risk of cancer. © 2016 UICC.
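A random-effects meta-analysis of the kind described is commonly computed with the DerSimonian-Laird estimator. A minimal sketch follows, with made-up study estimates rather than the 22 studies analyzed here.

```python
import math

def random_effects_pool(estimates):
    """DerSimonian-Laird random-effects pooling of study estimates.

    estimates: list of (rr, lower_ci, upper_ci) per study (95% CIs).
    Returns (pooled RR, lower 95% CI, upper 95% CI).
    """
    logs = [math.log(rr) for rr, lo, hi in estimates]
    # Standard errors back-calculated from the 95% CI width on the log scale
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for rr, lo, hi in estimates]
    w = [1 / se**2 for se in ses]          # fixed-effect (inverse-variance) weights
    fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi * (li - fixed)**2 for wi, li in zip(w, logs))   # heterogeneity Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(logs) - 1)) / c)                 # between-study variance
    w_star = [1 / (se**2 + tau2) for se in ses]                # random-effects weights
    pooled = sum(wi * li for wi, li in zip(w_star, logs)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return tuple(round(math.exp(x), 2) for x in
                 (pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled))

# Hypothetical per-study RRs with 95% CIs (illustrative only)
studies = [(1.15, 1.05, 1.26), (1.30, 1.10, 1.54), (1.10, 0.95, 1.27)]
print(random_effects_pool(studies))  # pooled RR with its 95% CI
```

With little between-study heterogeneity the estimator reduces to the fixed-effect inverse-variance average; as heterogeneity grows, tau-squared inflates all variances and the weights even out across studies.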
Body mass index and risk of BPH: a meta-analysis.
Wang, S; Mao, Q; Lin, Y; Wu, J; Wang, X; Zheng, X; Xie, L
2012-09-01
Epidemiological studies have reported conflicting results relating obesity to BPH. A meta-analysis of cohort and case-control studies was conducted to pool the risk estimates of the association between obesity and BPH. Eligible studies were retrieved by both computer searches and review of references. We analyzed abstracted data with random-effects models to obtain the summary risk estimates. Dose-response meta-analysis was performed for studies reporting categorical risk estimates for a series of exposure levels. A total of 19 studies met the inclusion criteria of the meta-analysis. A positive association with body mass index (BMI) was observed in the combined BPH and lower urinary tract symptoms (LUTS) group (odds ratio=1.27, 95% confidence interval 1.05-1.53). In subgroup analysis, BMI exhibited a positive dose-response relationship with BPH/LUTS in population-based case-control studies, and a marginal positive association was observed between risk of BPH and increased BMI. However, no association between BPH/LUTS and BMI was observed in other subgroups stratified by study design, geographical region or primary outcome. Overall, the current literature suggests that BMI is associated with increased risk of BPH. Further efforts should be made to confirm these findings and clarify the underlying biological mechanisms.
Suchard, Marc A; Zorych, Ivan; Simpson, Shawn E; Schuemie, Martijn J; Ryan, Patrick B; Madigan, David
2013-10-01
The self-controlled case series (SCCS) offers potential as a statistical method for risk identification involving medical products from large-scale observational healthcare data. However, analytic design choices remain in encoding longitudinal health records into the SCCS framework, and its risk identification performance across real-world databases is unknown. To evaluate the performance of SCCS and its design choices as a tool for risk identification in observational healthcare data, we examined the risk identification performance of SCCS across five design choices using 399 drug-health outcome pairs in five real observational databases (four administrative claims and one electronic health records database). In these databases, the pairs comprise 165 positive controls and 234 negative controls. We also consider several synthetic databases with known relative risks between drug-outcome pairs. We evaluate risk identification performance by estimating the area under the receiver-operator characteristic curve (AUC), and bias and coverage probability in the synthetic examples. The SCCS achieves strong predictive performance: twelve of the twenty health outcome-database scenarios return AUCs >0.75 across all drugs. Including all adverse events instead of just the first per patient and applying a multivariate adjustment for concomitant drug use are the most important design choices. However, the SCCS as applied here returns relative risk point estimates biased towards the null value of 1 with low coverage probability. The SCCS, recently extended to apply a multivariate adjustment for concomitant drug use, offers promise as a statistical tool for risk identification in large-scale observational healthcare databases. Poor estimator calibration dampens enthusiasm, but ongoing work should correct this shortcoming.
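The AUC evaluation described above, scoring a method by how well it separates positive from negative control drug-outcome pairs, reduces to the Mann-Whitney formulation: the probability that a randomly chosen positive control receives a higher risk score than a randomly chosen negative one. A minimal sketch (the function name and scores are illustrative, not from the cited study):

```python
def auc(pos_scores, neg_scores):
    """AUC as the probability that a positive control outranks a
    negative control (Mann-Whitney U / (n_pos * n_neg)); ties count 0.5."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))
```

Feeding in, say, estimated relative risks for known positive and negative control pairs yields 1.0 for perfect separation and 0.5 for chance-level discrimination, so the abstract's AUC > 0.75 threshold marks clearly better-than-chance risk identification.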
System Safety and the Unintended Consequence
NASA Technical Reports Server (NTRS)
Watson, Clifford
2012-01-01
The analysis and identification of risks often result in design changes or modification of operational steps. This paper identifies the potential for unintended consequences as an overlooked result of these changes. Examples of societal changes such as Prohibition, regulatory changes including mandating lifeboats on passenger ships, and engineering proposals or design changes to automobiles and spaceflight hardware are used to demonstrate that the System Safety Engineer must be cognizant of the potential for unintended consequences resulting from an analysis. Conclusions of the report indicate the need for additional foresight and consideration of the potential effects of analysis-driven design, processing changes, and/or operational modifications.
Wang, Junlin; Kan, Shuling; Chen, Tong; Liu, Jianping
2015-03-01
The aim of this research was to apply quality by design (QbD) to the development of naproxen-loaded core pellets that can serve as the potential core for colon-specific pellets. In the early stages of this study, prior knowledge and preliminary studies were systematically incorporated into the risk assessment using failure mode and effect analysis (FMEA) and a fishbone diagram. A Plackett-Burman design was then used to screen the eight potential high-risk factors (spheronization speed, spheronization time, extrusion speed, drying method, CCMC-Na concentration, lactose concentration, water concentration and Tween 80 concentration) identified in the risk assessment. Of these eight potential high-risk factors, only three (spheronization speed, extrusion speed and CCMC-Na concentration) had significant effects on the quality of the pellets. This allowed the use of a Box-Behnken design (BBD) to fully elucidate the relationship between the variables and the critical quality attribute (CQA). Finally, the final control space was established within which the quality of the pellets can meet the requirements of a colon-specific drug delivery system. This study demonstrated that naproxen-loaded core pellets were successfully designed using QbD principles.
Chronic disease risk factors among hotel workers
Gawde, Nilesh Chandrakant; Kurlikar, Prashika R.
2016-01-01
Context: Non-communicable diseases have emerged as a global health issue. The role of occupation in the pathogenesis of non-communicable diseases has not been explored much, especially in the hospitality industry. Aims: Objectives of this study include finding risk factor prevalence among hotel workers and studying the relationship between occupational group and chronic disease risk factors, chiefly high body mass index. Settings and Design: A cross-sectional study was conducted among non-managerial employees from classified hotels in India. Materials and Methods: The study participants self-administered pre-designed, pilot-tested questionnaires. Statistical analysis used: The risk factor prevalence rates were expressed as percentages. The chi-square test was used for bivariate analysis. Overweight was chosen as the 'outcome' variable of interest, and binary multi-logistic regression analysis was used to identify determinants. Results: The prevalence rates of tobacco use, alcohol use, inadequate physical activity and inadequate intake of fruits and vegetables were 32%, 49%, 24% and 92% respectively among hotel employees. Tobacco use was significantly more common among those in food preparation and service, alcohol use among those in food service and security, and leisure-time physical activity among front office workers. More than two-fifths (42.7%) were overweight. Among the hotel workers, those employed in food preparation and security had higher odds of being overweight: 1.650 (CI: 1.025 – 2.655) and 3.245 (CI: 1.296 – 8.129) respectively. Conclusions: Prevalence of chronic disease risk factors is high among hotel workers. Risk of overweight is significantly high in the food preparation and security departments, and workplace interventions are necessary to address these risks. PMID:27390474
Meta-Analysis of the Association between Tea Intake and the Risk of Cognitive Disorders
Ma, Qing-Ping; Huang, Chen; Cui, Qiao-Yun; Yang, Ding-Jun; Sun, Kang; Chen, Xuan; Li, Xing-Hui
2016-01-01
Background Alzheimer’s disease is a common neurodegenerative disorder in the elderly. This study aimed to systematically evaluate the association between tea intake and the risk of cognitive disorders by meta-analysis. Methods and Findings The PubMed, Embase and Wanfang databases were systematically searched, and a total of 26 observational studies were included in this study. Odds ratios (ORs) and the corresponding 95% confidence intervals (CIs) were calculated and pooled using fixed- or random-effects models according to the degree of heterogeneity. Results The overall pooled analysis indicated that tea intake could significantly reduce the risk of cognitive disorders (OR = 0.65, 95%CI = 0.58–0.73). Subgroup analyses were conducted based on study design, population, frequency of tea drinking and type of cognitive disorder. The results showed that tea drinking was significantly associated with a reduced incidence of cognitive disorders in all subgroups based on study design and frequency of tea drinking. In particular, tea drinking was inversely associated with the risk of cognitive impairment (CoI), mild cognitive impairment (MCI), cognitive decline and ungrouped cognitive disorders. Moreover, among the population subgroups, a significant association was found only in Chinese people. Conclusion Our study suggests that daily tea drinking is associated with a decreased risk of CoI, MCI and cognitive decline in the elderly. However, the association between tea intake and Alzheimer’s disease remains elusive. PMID:27824892
Glinert, Lewis H; Schommer, Jon C
2005-06-01
Considerable attention has been given to analyzing the content of, and assessing consumers' reactions to, print direct-to-consumer drug ads, but far less to televised ads. The objective was to determine whether advertisements with different risk severity and risk presentation would significantly affect viewers' (1) recall of information contained in the advertisement, (2) evaluation of the advertisement, and (3) perceptions of the advertised product's risks. Data were collected from a sample of 135 first-year pharmacy students at a Midwestern college of pharmacy. After viewing 1 of the 6 advertisements designed for this study, participants completed a self-administered survey. Chi-square tests and analysis of variance were used to analyze the data. A 2x3 between-subjects design was used to test the effects of 2 levels of risk severity (high vs low) and 3 levels of risk presentation (original ad containing an integrated risk message, deintegrated risk message/dual modality using a male voice-over, deintegrated risk message/dual modality using a female voice-over). Analysis of variance revealed that deintegrating risk information by placing it at the end of the advertisement and using captions in addition to oral messages (dual modality) (1) improved recall of general and specific side effect information, (2) led to a perception that the advertisement had greater informational content, (3) resulted in lower Advertisement Distraction, and (4) lessened cognitive and affective aspects of information overload for the advertisement containing the high-risk-severity medication. However, this pattern of findings did not hold for the low-risk-severity medication. Alternative methods for presenting risk information in direct-to-consumer ads affected some aspects of information recall and advertisement evaluation, but were not shown to affect risk perceptions regarding the advertised products.
Trajectory Design to Mitigate Risk on the Transiting Exoplanet Survey Satellite (TESS) Mission
NASA Technical Reports Server (NTRS)
Dichmann, Donald
2016-01-01
The Transiting Exoplanet Survey Satellite (TESS) will employ a highly eccentric Earth orbit, in 2:1 lunar resonance, reached with a lunar flyby preceded by 3.5 phasing loops. The TESS mission has limited propellant and several orbit constraints. Based on analysis and simulation, we have designed the phasing loops to reduce delta-V and to mitigate risk due to maneuver execution errors. We have automated the trajectory design process and use distributed processing to generate and to optimize nominal trajectories, check constraint satisfaction, and finally model the effects of maneuver errors to identify trajectories that best meet the mission requirements.
Composable Framework Support for Software-FMEA Through Model Execution
NASA Astrophysics Data System (ADS)
Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco
2016-08-01
Performing Failure Modes and Effect Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early-design-phase model execution, classic SW-FMEA approaches carry significant risks and are human-effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.
Tang, Zhenyu; Li, Min; Zhang, Xiaowei; Hou, Wenshang
2016-01-01
Objective To clarify and quantify the potential association between intake of flavonoids and risk of stroke. Design Meta-analysis of prospective cohort studies. Data source Studies published before January 2016 identified through electronic searches using PubMed, Embase and the Cochrane Library. Eligibility criteria for selecting studies Prospective cohort studies with relative risks and 95% CIs for stroke according to intake of flavonoids (assessed as dietary intake). Results The meta-analysis yielded 11 prospective cohort studies involving 356 627 participants and more than 5154 stroke cases. The pooled estimate of the multivariate relative risk of stroke for the highest compared with the lowest dietary flavonoid intake was 0.89 (95% CI 0.82 to 0.97; p=0.006). Dose-response analysis indicated that the summary relative risk of stroke for an increase of 100 mg flavonoids consumed per day was 0.91 (95% CI 0.77 to 1.08) without heterogeneity among studies (I2=0%). Stratifying by follow-up duration, the relative risk of stroke for flavonoid intake was 0.89 (95% CI 0.81 to 0.99) in studies with more than 10 years of follow-up. Conclusions Results from this meta-analysis suggest that higher dietary flavonoid intake may moderately lower the risk of stroke. PMID:27279473
NASA Astrophysics Data System (ADS)
Muneepeerakul, Chitsomanus; Huffaker, Ray; Munoz-Carpena, Rafael
2016-04-01
Weather index insurance promises financial resilience to farmers struck by harsh weather conditions, with swift compensation at an affordable premium thanks to its minimal adverse selection and moral hazard. Despite these advantages, the very nature of indexing creates "production basis risk": the selected weather indexes and their thresholds may not correspond to actual damages. To reduce basis risk without additional data collection cost, we propose using rain intensity and frequency as indexes, as they could offer better protection at a lower premium by avoiding the basis risk-strike trade-off inherent in the total rainfall index. We present empirical evidence and modeling results showing that even under similar cumulative rainfall and temperature environments, yields can differ significantly, especially for drought-sensitive crops. We further show that deriving the trigger level and payoff function from a regression between historical yield and total rainfall data may pose significant basis risk owing to their non-unique relationship over the insured range of rainfall. Lastly, we discuss the design of index insurance in terms of contract specifications based on the results from a global sensitivity analysis.
Design and Analysis of Turbines for Space Applications
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Dorney, Daniel J.; Huber, Frank W.
2003-01-01
In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis of the turbomachinery is necessary. This analysis is used for component development, design parametrics, performance prediction, and environment definition. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. The turbine chosen on which to demonstrate the procedure was a supersonic design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain an increased efficiency. The redesign of the turbine was conducted with a consideration of system requirements, realizing that a highly efficient turbine that, for example, significantly increases engine weight, is of limited benefit. Both preliminary and detailed designs were considered. To generate an improved design, one-dimensional (1D) design and analysis tools, computational fluid dynamics (CFD), response surface methodology (RSM), and neural nets (NN) were used.
Practical, transparent prospective risk analysis for the clinical laboratory.
Janssens, Pim Mw
2014-11-01
Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type in the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis' and the failure type most frequently rated as suboptimal was 'identification error'. The PRA designed is a useful semi-objective tool to identify process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables makes it easy to perform, practical and transparent. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
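The scoring scheme this abstract describes can be sketched as a small routine. The abstract does not give the exact formula for R or the predetermined criteria, so the product R = P × C and the thresholds below are illustrative assumptions in the spirit of classical FMEA-style scoring:

```python
def flag_risky_steps(steps, r_threshold=25, d_threshold=5):
    """Score process steps FMEA-style and flag those needing deeper analysis.

    steps: dict mapping step name -> (P, C, D), each on a 1-10 scale
    (P = probability of failure, C = consequence, D = chance of the
    failure going undetected; higher D = harder to detect, an assumed
    convention). R = P * C is one common risk-score definition; the
    thresholds are hypothetical criteria, not the paper's.
    """
    flagged = {}
    for name, (p, c, d) in steps.items():
        r = p * c
        if r >= r_threshold or d >= d_threshold:
            flagged[name] = {"P": p, "C": c, "D": d, "R": r}
    return flagged
```

Recording the three scores per failure type per process step, as the paper does, then lets the matrix table double as both documentation and the input to this kind of screening pass.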
ERIC Educational Resources Information Center
Schumacher, Krista S.
2015-01-01
The importance of school readiness to both the future of an individual child and society as a whole has given rise to several state-specific indexes designed to measure county-level risk for starting school unprepared to learn. One such index is the Oklahoma School Readiness Risk Index (OK SRRI), comprised of indicators known to be associated with…
Duinen, Rianne van; Filatova, Tatiana; Geurts, Peter; Veen, Anne van der
2015-04-01
Drought-induced water shortage and salinization are a global threat to agricultural production. With climate change, drought risk is expected to increase as drought events are assumed to occur more frequently and to become more severe. The agricultural sector's adaptive capacity largely depends on farmers' drought risk perceptions. Understanding the formation of farmers' drought risk perceptions is a prerequisite to designing effective and efficient public drought risk management strategies. Various strands of literature point at different factors shaping individual risk perceptions. Economic theory points at objective risk variables, whereas psychology and sociology identify subjective risk variables. This study investigates and compares the contribution of objective and subjective factors in explaining farmers' drought risk perception by means of survey data analysis. Data on risk perceptions, farm characteristics, and various other personality traits were collected from farmers located in the southwest Netherlands. From comparing the explanatory power of objective and subjective risk factors in separate models and a full model of risk perception, it can be concluded that farmers' risk perceptions are shaped by both rational and emotional factors. In a full risk perception model, being located in an area with external water supply, owning fields with salinization issues, cultivating drought-/salt-sensitive crops, farm revenue, drought risk experience, and perceived control are significant explanatory variables of farmers' drought risk perceptions. © 2014 Society for Risk Analysis.
AN IMPROVEMENT TO THE MOUSE COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM
The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with l...
Naumann, R Wendel
2012-07-01
This study examines the design of previous and future trials of lymph node dissection in endometrial cancer. Data from previous trials were used to construct a decision analysis modeling the risk of lymphatic spread and the effects of treatment on patients with endometrial cancer. This model was then applied to previous trials as well as other future trial designs that might be used to address this subject. Comparing the predicted and actual results in the ASTEC trial, the model closely mimics the survival results with and without lymph node dissection for the low and high risk groups. The model suggests a survival difference of less than 2% between the experimental and control arms of the ASTEC trial under all circumstances. Sensitivity analyses reveal that these conclusions are robust. Future trial designs were also modeled with hysterectomy only, hysterectomy with radiation in intermediate risk patients, and staging with radiation only with node positive patients. Predicted outcomes for these approaches yield survival rates of 88%, 90%, and 93% in clinical stage I patients who have a risk of pelvic node involvement of approximately 7%. These estimates were 78%, 82%, and 89% in intermediate risk patients who have a risk of nodal spread of approximately 15%. This model accurately predicts the outcome of previous trials and demonstrates that even if lymph node dissection was therapeutic, these trials would have been negative due to study design. Furthermore, future trial designs that are being considered would need to be conducted in high-intermediate risk patients to detect any difference. Copyright © 2012 Elsevier Inc. All rights reserved.
Quantifying riverine and storm-surge flood risk by single-family residence: application to Texas.
Czajkowski, Jeffrey; Kunreuther, Howard; Michel-Kerjan, Erwann
2013-12-01
The development of catastrophe models in recent years allows for assessment of the flood hazard much more effectively than when the federally run National Flood Insurance Program (NFIP) was created in 1968. We propose and then demonstrate a methodological approach to determine pure premiums based on the entire distribution of possible flood events. We apply hazard, exposure, and vulnerability analyses to a sample of 300,000 single-family residences in two counties in Texas (Travis and Galveston) using state-of-the-art flood catastrophe models. Even in zones of similar flood risk classification by FEMA there is substantial variation in exposure between coastal and inland flood risk. For instance, homes in the designated moderate-risk X500/B zones in Galveston are exposed to a flood risk on average 2.5 times greater than residences in X500/B zones in Travis. The results also show very similar average annual loss (corrected for exposure) for a number of residences despite their being in different FEMA flood zones. We also find significant storm-surge exposure outside of the FEMA designated storm-surge risk zones. Taken together these findings highlight the importance of a microanalysis of flood exposure. The process of aggregating risk at a flood zone level, as currently undertaken by FEMA, provides a false sense of uniformity. As our analysis indicates, the technology to delineate the flood risks exists today. © 2013 Society for Risk Analysis.
Risk Management Technique for design and operation of facilities and equipment
NASA Technical Reports Server (NTRS)
Fedor, O. H.; Parsons, W. N.; Coutinho, J. De S.
1975-01-01
The Risk Management System collects information from engineering, operating, and management personnel to identify potentially hazardous conditions. This information is used in risk analysis, problem resolution, and contingency planning. The resulting hazard accountability system enables management to monitor all identified hazards. Data from this system are examined in project reviews so that management can decide to eliminate or accept these risks. This technique is particularly effective in improving the management of risks in large, complex, high-energy facilities. These improvements are needed for increased cooperation among industry, regulatory agencies, and the public.
Case-Cohort Studies: Design and Applicability to Hand Surgery.
Vojvodic, Miliana; Shafarenko, Mark; McCabe, Steven J
2018-04-24
Observational studies are common research strategies in hand surgery. The case-cohort design offers an efficient and resource-friendly method for risk assessment and outcomes analysis. Case-cohorts remain underrepresented in upper extremity research despite several practical and economic advantages over case-control studies. This report outlines the purpose, utility, and structure of the case-cohort design and offers a sample research question to demonstrate its value to risk estimation for adverse surgical outcomes. The application of well-designed case-cohort studies is advocated in an effort to improve the quality and quantity of observational research evidence in hand and upper extremity surgery. Copyright © 2018 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
The comparative risk assessment framework and tools (CRAFT)
Southern Research Station USDA Forest Service
2010-01-01
To help address these challenges, the USDA Forest Service's Eastern Forest Environmental Threat Assessment Center (EFETAC) and the University of North Carolina Asheville's National Environmental Modeling and Analysis Center (NEMAC) designed a planning framework, called the Comparative Risk Assessment Framework and Tools (CRAFT). CRAFT is...
Conversion economics of forest biomaterials: risk and financial analysis of CNC manufacturing
Camilla Abbati de Assis; Carl Houtman; Richard Phillips; E.M. Ted Bilek; Orlando J. Rojas; Lokendra Pal; Maria Soledad Peresin; Hasan Jameel; Ronalds Gonzalez
2017-01-01
Commercialization of cellulose nanocrystals (CNC) presents opportunities for a wide range of new products. Techno-economic assessments can provide insightful information for the efficient design of conversion processes, drive cost-saving efforts, and reduce financial risks. In this study, we conducted techno-economic assessments for CNC production using information...
Carleton, Amanda J; Sievenpiper, John L; de Souza, Russell; McKeown-Eyssen, Gail; Jenkins, David J A
2013-01-01
Objective α-Linolenic acid (ALA) is considered to be a cardioprotective nutrient; however, some epidemiological studies have suggested that dietary ALA intake increases the risk of prostate cancer. The main objective was to conduct a systematic review and meta-analysis of case–control and prospective studies investigating the association between dietary ALA intake and prostate cancer risk. Design A systematic review and meta-analysis were conducted by searching MEDLINE and EMBASE for relevant prospective and case–control studies. Included studies We included all prospective cohort, case–control, nested case-cohort and nested case–control studies that investigated the effect of dietary ALA intake on the incidence (or diagnosis) of prostate cancer and provided relative risk (RR), HR or OR estimates. Primary outcome measure Data were pooled using the generic inverse variance method with a random effects model from studies that compared the highest ALA quantile with the lowest ALA quantile. Risk estimates were expressed as RR with 95% CIs. Heterogeneity was assessed by χ2 and quantified by I2. Results Data from five prospective and seven case–control studies were pooled. The overall RR estimate showed ALA intake to be positively but non-significantly associated with prostate cancer risk (1.08 (0.90 to 1.29), p=0.40; I2=85%), but the interpretation was complicated by evidence of heterogeneity not explained by study design. A weak, non-significant protective effect of ALA intake on prostate cancer risk in the prospective studies became significant (0.91 (0.83 to 0.99), p=0.02) without evidence of heterogeneity (I2=8%, p=0.35) on removal of one study during sensitivity analyses. Conclusions This analysis failed to confirm an association between dietary ALA intake and prostate cancer risk. Larger and longer observational and interventional studies are needed to define the role of ALA and prostate cancer. PMID:23674441
Tang, Zhen-Hai; Zhang, Chi; Cheng, Pan; Sun, Hong-Min; Jin, Yu; Chen, Yuan-Jing; Huang, Fen
2014-01-01
The association between glutathione-S-transferase polymorphisms (GSTM1, GSTT1 and GSTP1) and risk of acute leukemia in Asians remains controversial. This study was therefore designed to evaluate the precise association in 23 studies identified by a search of PubMed and several other databases, up to December 2013. Using random- or fixed-effects models, odds ratios (ORs) with corresponding 95% confidence intervals (CIs) were calculated. Heterogeneity across studies was assessed, and funnel plots were constructed to test for publication bias. The meta-analysis showed positive associations between GST polymorphisms (GSTM1 and GSTT1 but not GSTP1) and acute leukemia risk [(OR=1.47, 95% CI 1.18-1.83); (OR=1.32, 95% CI 1.07-1.62); (OR=1.01, 95% CI 0.84-1.23), respectively] and heterogeneity between the studies. The results suggested that the GSTM1 null genotype and GSTT1 null genotype, but not the GSTP1 polymorphism, might be potential risk factors for acute leukemia. Further well-designed studies are needed to confirm our findings.
Kuiper, H A; König, A; Kleter, G A; Hammes, W P; Knudsen, I
2004-07-01
The most important results from the EU-sponsored ENTRANSFOOD Thematic Network project are reviewed, including the design of a detailed step-wise procedure for the risk assessment of foods derived from genetically modified crops based on the latest scientific developments, evaluation of topical risk assessment issues, and the formulation of proposals for improved risk management and public involvement in the risk analysis process. Copyright 2004 Elsevier Ltd.
Meta-analysis of paternal age and schizophrenia risk in male versus female offspring.
Miller, Brian; Messias, Erick; Miettunen, Jouko; Alaräisänen, Antti; Järvelin, Marjo-Riita; Koponen, Hannu; Räsänen, Pirkko; Isohanni, Matti; Kirkpatrick, Brian
2011-09-01
Advanced paternal age (APA) is a reported risk factor for schizophrenia in the offspring. We performed a meta-analysis of this association, considering the effect of gender and study design. We identified articles by searching PubMed, PsychInfo, ISI, and EMBASE, and the reference lists of identified studies. Previously unpublished data from the Northern Finland 1966 Birth Cohort (NFBC 1966) study were also included. There were 6 cohort studies and 6 case-control studies that met the inclusion criteria. In both study designs, there was a significant increase in risk of schizophrenia in the offspring of older fathers (≥30) compared to a reference paternal age of 25-29, with no gender differences. The relative risk (RR) in the oldest fathers (≥50) was 1.66 [95% confidence interval (95% CI): 1.46-1.89, P < 0.01]. A significant increase in risk was also found for younger fathers (<25) in males (RR = 1.08, 95% CI: 1.02-1.14, P = 0.01) but not females (RR = 1.04, 95% CI: 0.97-1.14, P = 0.28). The population attributable risk percentage (PAR%) was 10% for paternal age ≥30 and 5% for paternal age <25. Both APA (≥30) and younger paternal age (<25) increase the risk of schizophrenia; younger paternal age may be associated with an increased risk in males but not females. This risk factor increases the risk of schizophrenia as much as any single candidate gene of risk. The mechanism of these associations is not known and may differ for older and younger fathers.
Alcohol and the risk of sleep apnoea: a systematic review and meta-analysis.
Simou, Evangelia; Britton, John; Leonardi-Bee, Jo
2018-02-01
A systematic review and meta-analysis of the association between alcohol consumption and risk of sleep apnoea in adults. We searched Medline, EMBASE and Web of Science databases from 1985 to 2015 for comparative epidemiological studies assessing the relation between alcohol consumption and sleep apnoea. Two authors independently screened and extracted data. Random effects meta-analysis was used to estimate pooled effect sizes with 95% confidence intervals (CI). Heterogeneity was quantified using I² and explored using subgroup analyses based on study exposure and outcome measures, quality, design, adjustment for confounders and geographical location. Publication bias was assessed using a funnel plot and Egger's test. We identified 21 studies from which estimates of relative risk could be obtained. Meta-analysis of these estimates demonstrated that higher levels of alcohol consumption increased the risk of sleep apnoea by 25% (RR 1.25, 95% CI 1.13-1.38, I² = 82%, p < 0.0001). This estimate was robust to differences in alcohol consumption and sleep apnoea definitions, study design, and quality, but was greater in low- and middle-income country locations. We detected evidence of publication bias (p = 0.001). A further eight included studies reported average alcohol consumption in people with and without sleep apnoea. Meta-analysis revealed that mean alcohol intake was two units/week higher in those with sleep apnoea, but this difference was not statistically significant (p = 0.41). These findings suggest that alcohol consumption is associated with a higher risk of sleep apnoea, further supporting evidence that reducing alcohol intake is of potential therapeutic and preventive value in this condition. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
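The random-effects pooling described in this abstract can be sketched with a minimal DerSimonian-Laird calculation. The per-study estimates below are invented for illustration; they are not the 21 studies from the review:

```python
import math

# Hypothetical per-study estimates: (relative risk, lower 95% CI, upper 95% CI).
studies = [(1.10, 0.95, 1.27), (1.35, 1.10, 1.66), (1.20, 1.02, 1.41)]

# Work on the log scale; recover each study's standard error from its CI width.
log_rr = [math.log(rr) for rr, lo, hi in studies]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for rr, lo, hi in studies]
w_fixed = [1 / s**2 for s in se]  # inverse-variance (fixed-effect) weights

# DerSimonian-Laird estimate of the between-study variance tau^2.
pooled_fixed = sum(w * y for w, y in zip(w_fixed, log_rr)) / sum(w_fixed)
q = sum(w * (y - pooled_fixed) ** 2 for w, y in zip(w_fixed, log_rr))
df = len(studies) - 1
c = sum(w_fixed) - sum(w**2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - df) / c)
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # heterogeneity I^2

# Random-effects weights fold tau^2 into each study's variance.
w_re = [1 / (s**2 + tau2) for s in se]
pooled = sum(w * y for w, y in zip(w_re, log_rr)) / sum(w_re)
se_pooled = math.sqrt(1 / sum(w_re))

print(f"pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se_pooled):.2f}-"
      f"{math.exp(pooled + 1.96 * se_pooled):.2f}), I^2 = {i2:.0f}%")
```

With tau² = 0 this collapses to the fixed-effect estimate; larger between-study spread widens the pooled confidence interval.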
RAMPART (TM): Risk Assessment Method-Property Analysis and Ranking Tool v.4.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carson, Susan D.; Hunter, Regina L.; Link, Madison D.
RAMPART™, Risk Assessment Method-Property Analysis and Ranking Tool, is a new type of computer software package for the assessment of risk to buildings. RAMPART™ has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). RAMPART™ has been designed and developed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART™ user interface elicits information from the user about the building. The RAMPART™ expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART™ database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART™ algorithms use these data to assess the risk associated with a given building in the face of certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado and flood); crime (inside and outside the building); fire and terrorism are calculated. These hazards may cause losses of various kinds. RAMPART™ considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on the screen and in a written report.
Bilirubin and Stroke Risk Using a Mendelian Randomization Design.
Lee, Sun Ju; Jee, Yon Ho; Jung, Keum Ji; Hong, Seri; Shin, Eun Soon; Jee, Sun Ha
2017-05-01
Circulating bilirubin, a natural antioxidant, is associated with decreased risk of stroke. However, the nature of the relationship between the two remains unknown. We used a Mendelian randomization analysis to assess the causal effect of serum bilirubin on stroke risk in Koreans. Fourteen single-nucleotide polymorphisms (SNPs) (P < 10⁻⁷), including rs6742078 of uridine diphosphoglucuronyl-transferase, were selected from a genome-wide association study of bilirubin level in the KCPS-II (Korean Cancer Prevention Study-II) Biobank subcohort consisting of 4793 healthy Koreans and 806 stroke cases. A weighted genetic risk score was calculated using the 14 SNPs selected from the top SNPs. Both rs6742078 (F statistic = 138) and the weighted genetic risk score with 14 SNPs (F statistic = 187) were strongly associated with bilirubin levels. Simultaneously, serum bilirubin level was associated with decreased risk of stroke in an ordinary least-squares analysis. However, in a 2-stage least-squares Mendelian randomization analysis, no causal relationship between serum bilirubin and stroke risk was found. There is no evidence that bilirubin level is causally associated with risk of stroke in Koreans. Therefore, bilirubin level is not a risk determinant of stroke. © 2017 American Heart Association, Inc.
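The contrast the abstract draws between ordinary least-squares and 2-stage least-squares (Mendelian randomization) estimates can be illustrated with a small simulation. Everything below (model, coefficients, sample size) is invented, not the KCPS-II data: a genetic score shifts the exposure, a confounder drives both exposure and outcome, and the exposure has no causal effect, mirroring the paper's null finding.

```python
import random

random.seed(0)

# Simulated cohort: G = genetic risk score, U = unmeasured confounder,
# B = bilirubin, Y = stroke liability. Note Y has no B term (null effect).
n = 20000
data = []
for _ in range(n):
    g = random.gauss(0, 1)
    u = random.gauss(0, 1)
    b = 0.5 * g - 0.4 * u + random.gauss(0, 1)
    y = 0.3 * u + random.gauss(0, 1)
    data.append((g, b, y))

def slope(xs, ys):
    """Simple-regression slope of ys on xs."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

gs = [d[0] for d in data]
bs = [d[1] for d in data]
ys = [d[2] for d in data]

# Naive OLS of Y on B is biased by the confounder U...
ols = slope(bs, ys)
# ...while the instrumental-variable (Wald ratio / 2SLS) estimate,
# slope(G -> Y) / slope(G -> B), recovers the causal effect, here ~0.
tsls = slope(gs, ys) / slope(gs, bs)
print(f"OLS estimate {ols:+.3f}   2SLS estimate {tsls:+.3f}")
```

The OLS slope comes out clearly negative (confounded), while the 2SLS estimate hovers near zero, which is the qualitative pattern the abstract reports.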
Kothari, Bhaveshkumar H; Fahmy, Raafat; Claycamp, H Gregg; Moore, Christine M V; Chatterjee, Sharmista; Hoag, Stephen W
2017-05-01
The goal of this study was to utilize risk assessment techniques and statistical design of experiments (DoE) to gain process understanding and to identify critical process parameters for the manufacture of controlled release multiparticulate beads using a novel disk-jet fluid bed technology. The material attributes and process parameters were systematically assessed using the Ishikawa fishbone diagram and failure mode and effect analysis (FMEA) risk assessment methods. The high-risk attributes identified by the FMEA analysis were first explored using a resolution V fractional factorial design to gain an understanding of the processing parameters. Using knowledge gained from the resolution V study, a resolution IV fractional factorial study was conducted; the purpose of the resolution IV study was to identify the critical process parameters (CPPs) that impact the critical quality attributes and to understand the influence of these parameters on film formation. For both studies, the microclimate, atomization pressure, inlet air volume, product temperature (during spraying and curing), curing time, and percent solids in the coating solutions were studied. The responses evaluated were percent agglomeration, percent fines, percent yield, bead aspect ratio, median particle size diameter (d50), assay, and drug release rate. Pyrobuttons® were used to record real-time temperature and humidity changes in the fluid bed. The risk assessment methods and process analytical tools helped to understand the novel disk-jet technology and to systematically develop models of coating process outcomes such as process efficiency and the extent of curing during the coating process.
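The FMEA screening step described above is commonly summarized with a risk priority number, RPN = severity × occurrence × detectability, and the highest-ranked modes feed the DoE. A minimal sketch, with hypothetical failure modes and scores (not taken from the study):

```python
# Hypothetical failure modes for a fluid-bed coating process; the scores
# (severity, occurrence, detectability, each 1-10) are invented for illustration.
failure_modes = {
    "atomization pressure drift": (7, 5, 4),
    "inlet air volume too low":   (8, 3, 3),
    "incomplete film curing":     (9, 4, 6),
    "nozzle blockage":            (6, 2, 2),
}

# Classic FMEA risk priority number: RPN = S x O x D.
rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}

# Rank modes from highest to lowest risk; high-RPN modes go into the DoE.
for mode, score in sorted(rpn.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:4d}  {mode}")
```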
Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software
NASA Astrophysics Data System (ADS)
Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.
2017-12-01
Nowadays ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detail design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance their safety and capacity, as those ships are exposed to a high risk of structural damage during a voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold. One is a rule-based methodology and the other is a more conventional software-based analysis. The rule-based analysis is done by DNV-GL's software POSEIDON and the conventional package-based analysis is done by the ANSYS structural module. Both methods have been applied to analyze some of the mechanical properties of the model, such as total deformation, stress and strain distributions, von Mises stress, and fatigue, following different design bases and approaches, to provide guidance for further improvements in ship structural design.
Assessing ergonomic risks of software: Development of the SEAT.
Peres, S Camille; Mehta, Ranjana K; Ritchey, Paul
2017-03-01
Software utilizing interaction designs that require extensive dragging or clicking of icons may increase users' risks for upper extremity cumulative trauma disorders. The purpose of this research is to develop a Self-report Ergonomic Assessment Tool (SEAT) for assessing the risks of software interaction designs and facilitate mitigation of those risks. A 28-item self-report measure was developed by combining and modifying items from existing industrial ergonomic tools. Data were collected from 166 participants after they completed four different tasks that varied by method of input (touch or keyboard and mouse) and type of task (selecting or typing). Principal component analysis found distinct factors associated with stress (i.e., demands) and strain (i.e., response). Repeated measures analyses of variance showed that participants could discriminate the different strain induced by the input methods and tasks. However, participants' ability to discriminate between the stressors associated with that strain was mixed. Further validation of the SEAT is necessary but these results indicate that the SEAT may be a viable method of assessing ergonomics risks presented by software design. Copyright © 2016 Elsevier Ltd. All rights reserved.
DiMase, Daniel; Collier, Zachary A; Carlson, Jinae; Gray, Robin B; Linkov, Igor
2016-10-01
Within the microelectronics industry, there is a growing concern regarding the introduction of counterfeit electronic parts into the supply chain. Even though this problem is widespread, there have been limited attempts to implement risk-based approaches for testing and supply chain management. Supply chain risk management tends to focus on the highly visible disruptions of the supply chain instead of the covert entrance of counterfeits; thus counterfeit risk is difficult to mitigate. This article provides an overview of the complexities of the electronics supply chain, and highlights some gaps in risk assessment practices. In particular, this article calls for enhanced traceability capabilities to track and trace parts at risk through various stages of the supply chain. A focus on risk-informed decision making is needed, through strategies including prioritization of high-risk parts, moving beyond certificates of conformance, incentivizing best supply chain management practices, adopting industry standards, and designing and managing for supply chain resilience. © 2016 Society for Risk Analysis.
Zhang, Meng-Xi; Pan, Guo-Tao; Guo, Jian-Fen; Li, Bing-Yan; Qin, Li-Qiang; Zhang, Zeng-Li
2015-01-01
The results investigating the relationship between vitamin D levels and gestational diabetes mellitus (GDM) are inconsistent. Thus, we focused on evaluating the association of vitamin D deficiency with GDM by conducting a meta-analysis of observational studies. A systematic literature search was conducted via PubMed, MEDLINE, and the Cochrane library to identify eligible studies before August 2015. The meta-analysis of 20 studies including 9209 participants showed that women with vitamin D deficiency experienced a significantly increased risk for developing GDM (odds ratio (OR) = 1.53; 95% confidence intervals (CI), 1.33, 1.75) with little heterogeneity (I² = 16.20%, p = 0.252). A noteworthy decrease of 4.93 nmol/L (95% CI, −6.73, −3.14) in serum 25(OH)D was demonstrated in the participants with GDM, and moderate heterogeneity was observed (I² = 61.40%, p = 0.001). Subgroup analysis by study design showed that there were obvious heterogeneities in nested case–control studies (I² > 52.5%, p < 0.07). Sensitivity analysis showed that exclusion of any single study did not materially alter the overall combined effect. In summary, the evidence from this meta-analysis indicates a consistent association between vitamin D deficiency and an increased risk of GDM. However, well-designed randomized controlled trials are needed to elicit the clear effect of vitamin D supplementation on prevention of GDM. PMID:26437429
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; DeHaye, Michael; DeLessio, Steven
2011-01-01
The LOX-Hydrogen J-2X Rocket Engine, which is proposed for use as an upper-stage engine for numerous earth-to-orbit and heavy lift launch vehicle architectures, is presently in the design phase and will move shortly to the initial development test phase. Analysis of the design has revealed numerous potential resonance issues with hardware in the turbomachinery turbine-side flow-path. The analysis of the fuel pump turbine blades requires particular care because resonant failure of the blades, which are rotating in excess of 30,000 revolutions per minute (RPM), could be catastrophic for the engine and the entire launch vehicle. This paper describes a series of probabilistic analyses performed to assess the risk of failure of the turbine blades due to resonant vibration during past and present test series. Some significant results are that the probability of failure during a single complete engine hot-fire test is low (1%) because of the small likelihood of resonance, but that the probability increases to around 30% for a more focused turbomachinery-only test because all speeds will be ramped through and there is a greater likelihood of dwelling at more speeds. These risk calculations have been invaluable for use by program management in deciding if risk-reduction methods such as dampers are necessary immediately or if the test can be performed before the risk-reduction hardware is ready.
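A probabilistic resonance-risk calculation of this kind can be sketched as a Monte Carlo simulation. Everything below (distributions, speeds, margins) is an invented toy model, not the J-2X analysis; it only illustrates why ramping through all speeds raises the probability of failure relative to dwelling at a few discrete speeds.

```python
import random

random.seed(42)

# Toy model: a blade mode fails if the test dwells near its (uncertain)
# resonant speed AND the blade's stress capacity is below the resonant demand.
def run_trial(speeds_visited):
    resonant_speed = random.gauss(31000, 500)   # RPM, assumed uncertainty
    capacity = random.gauss(1.5, 0.2)           # stress margin factor, assumed
    demand = random.gauss(1.0, 0.15)            # resonant stress factor, assumed
    near_resonance = any(abs(s - resonant_speed) < 200 for s in speeds_visited)
    return near_resonance and capacity < demand

def prob_failure(speeds_visited, n=20_000):
    return sum(run_trial(speeds_visited) for _ in range(n)) / n

# An engine hot-fire test dwells at a few discrete speeds; a turbomachinery
# test ramps through the whole range, so hitting resonance is far more likely.
engine_test = [27000, 30000, 33000]
turbo_test = list(range(25000, 35001, 100))

print(f"engine test P(fail) ~ {prob_failure(engine_test):.4f}")
print(f"turbo test  P(fail) ~ {prob_failure(turbo_test):.4f}")
```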
Is adaptation or transformation needed? Active nanomaterials and risk analysis
NASA Astrophysics Data System (ADS)
Kuzma, Jennifer; Roberts, John Patrick
2016-07-01
Nanotechnology has been a key area of funding and policy for the United States and globally for the past two decades. Since nanotechnology research and development became a focus and nanoproducts began to permeate the market, scholars and scientists have been concerned about how to assess the risks that they may pose to human health and the environment. The newest generation of nanomaterials includes biomolecules that can respond to and influence their environments, and there is a need to explore whether and how existing risk-analysis frameworks are challenged by such novelty. To fill this niche, we used a modified approach of upstream oversight assessment (UOA), a subset of anticipatory governance. We first selected case studies of "active nanomaterials" that are early in research and development and designed for use in multiple sectors, and then considered them under several key risk-analysis frameworks. We found two ways in which the cases challenge the frameworks. The first category relates to how to assess risk under a narrow framing of the term (direct health and environmental harm), and the second involves the definition of what constitutes a "risk" worthy of assessment and consideration in decision making. In light of these challenges, we propose some changes for risk analysis in the face of active nanomaterials in order to improve risk governance.
Malicki, Julian; Bly, Ritva; Bulot, Mireille; Godet, Jean-Luc; Jahnen, Andreas; Krengli, Marco; Maingon, Philippe; Prieto Martin, Carlos; Przybylska, Kamila; Skrobała, Agnieszka; Valero, Marc; Jarvinen, Hannu
2017-04-01
To describe the current status of implementation of European directives for risk management in radiotherapy and to assess variability in risk management in the following areas: (1) in-country regulatory framework; (2) proactive risk assessment; (3) reactive analysis of events; and (4) reporting and learning systems. The original data were collected as part of the ACCIRAD project through two online surveys. Risk assessment criteria are closely associated with quality assurance programs. Only 9/32 responding countries (28%) with national regulations reported clear "requirements" for proactive risk assessment and/or reactive risk analysis, with wide variability in assessment methods. Reporting of adverse error events is mandatory in most (70%) but not all surveyed countries. Most European countries have taken steps to implement European directives designed to reduce the probability and magnitude of accidents in radiotherapy. Variability between countries is substantial in terms of legal frameworks, tools used to conduct proactive risk assessment and reactive analysis of events, and in the reporting and learning systems utilized. These findings underscore the need for greater harmonisation in common terminology, classification and reporting practices across Europe to improve patient safety and to enable more reliable inter-country comparisons. Copyright © 2017 Elsevier B.V. All rights reserved.
TU-AB-BRD-03: Fault Tree Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunscombe, P.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk-based quality management program.
Learning Objectives: (1) learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
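Under an assumption of independent basic events, fault tree gate probabilities combine as AND = product of the inputs and OR = complement of the product of complements. A minimal sketch with a made-up radiotherapy tree (the events and numbers below are illustrative, not Task Group 100's):

```python
from functools import reduce

# Gate evaluation under an independence assumption.
def p_and(*ps):
    return reduce(lambda a, b: a * b, ps, 1.0)

def p_or(*ps):
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), ps, 1.0)

# Basic events (assumed annual probabilities, invented for illustration).
wrong_plan_exported = 0.002
plan_check_missed = 0.05     # independent plan check fails to catch it
setup_error = 0.004
imaging_not_reviewed = 0.10

# Top event: mistreatment occurs if an erroneous plan slips past its check,
# OR a setup error slips past image review.
top = p_or(
    p_and(wrong_plan_exported, plan_check_missed),
    p_and(setup_error, imaging_not_reviewed),
)
print(f"P(top event) = {top:.6f}")
```

The AND gates show why independent checks are so effective: each multiplies the path probability down, and the OR gate then sums (approximately) the surviving paths.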
TU-AB-BRD-02: Failure Modes and Effects Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huq, M.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk-based quality management program.
Learning Objectives: (1) learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
NASA Technical Reports Server (NTRS)
Jones, Harry W.; Dillon-Merrill, Robin L.; Thomas, Gretchen A.
2003-01-01
The Advanced Integration Matrix (AIM) Project will study and solve systems-level integration issues for exploration missions beyond Low Earth Orbit (LEO), through the design and development of a ground-based facility for developing revolutionary integrated systems for joint human-robotic missions. This paper describes a Probabilistic Risk Analysis (PRA) of human space missions that was developed to help define the direction and priorities for AIM. Risk analysis is required for all major NASA programs and has been used for shuttle, station, and Mars lander programs. It is a prescribed part of early planning and is necessary during concept definition, even before mission scenarios and system designs exist. PRA can begin when little failure data are available, and be continually updated and refined as detail becomes available. PRA provides a basis for examining tradeoffs among safety, reliability, performance, and cost. The objective of AIM's PRA is to indicate how risk can be managed and future human space missions enabled by the AIM Project. Many critical events can cause injuries and fatalities to the crew without causing loss of vehicle or mission. Some critical systems are beyond AIM's scope, such as propulsion and guidance. Many failure-causing events can be mitigated by conducting operational tests in AIM, such as testing equipment and evaluating operational procedures, especially in the areas of communications and computers, autonomous operations, life support, thermal design, EVA and rover activities, physiological factors including habitation, medical equipment, and food, and multifunctional tools and repairable systems. AIM is well suited to test and demonstrate the habitat, life support, crew operations, and human interface. Because these account for significant crew, systems performance, and science risks, AIM will help reduce mission risk, and missions beyond LEO are far enough in the future that AIM can have significant impact.
NASA Astrophysics Data System (ADS)
Kempf, Scott; Schäfer, Frank K.; Cardone, Tiziana; Ferreira, Ivo; Gerené, Sam; Destefanis, Roberto; Grassi, Lilith
2016-12-01
During recent years, the state-of-the-art risk assessment of the threat posed to spacecraft by micrometeoroids and space debris has been expanded to the analysis of failure modes of internal spacecraft components. This method can now be used to perform risk analyses for satellites to assess various failure levels - from failure of specific sub-systems to catastrophic break-up. This new assessment methodology is based on triple-wall ballistic limit equations (BLEs), specifically the Schäfer-Ryan-Lambert (SRL) BLE, which is applicable for describing failure threshold levels for satellite components following a hypervelocity impact. The methodology is implemented in the form of the software tool Particle Impact Risk and vulnerability Analysis Tool (PIRAT). During a recent European Space Agency (ESA) funded study, the PIRAT functionality was expanded in order to provide an interface to ESA's Concurrent Design Facility (CDF). The additions include a geometry importer and an OCDT (Open Concurrent Design Tool) interface. The new interface provides both the expanded geometrical flexibility, which is provided by external computer aided design (CAD) modelling, and an ease of import of existing data without the need for extensive preparation of the model. The reduced effort required to perform vulnerability analyses makes it feasible for application during the early design phase, at which point modifications to satellite design can be undertaken with relatively little extra effort. The integration of PIRAT in the CDF represents the first time that vulnerability analyses can be performed in-session in ESA's CDF, and the first time that comprehensive vulnerability studies can be applied cost-effectively in the early design phase in general.
Cost and accuracy of advanced breeding trial designs in apple
Harshman, Julia M; Evans, Kate M; Hardner, Craig M
2016-01-01
Trialing advanced candidates in tree fruit crops is expensive due to the long-term nature of the planting and labor-intensive evaluations required to make selection decisions. How closely the trait evaluations approximate the true trait value needs balancing with the cost of the program. Field trial designs for advanced apple candidates with reduced numbers of locations, years, and harvests per year were modeled to investigate the effect on cost and accuracy in an operational breeding program. The aim was to find designs that would allow evaluation of the most additional candidates while sacrificing the least accuracy. Critical percentage difference, response to selection, and correlated response were used to examine changes in accuracy of trait evaluations. For the quality traits evaluated, accuracy and response to selection were not substantially reduced for most trial designs. Risk management influences the decision to change trial design, and some designs had greater risk associated with them. Balancing cost and accuracy with risk yields valuable insight into advanced breeding trial design. The methods outlined in this analysis would be well suited to other horticultural crop breeding programs. PMID:27019717
The impact of cigarette pack design, descriptors, and warning labels on risk perception in the U.S.
Bansal-Travers, Maansi; Hammond, David; Smith, Philip; Cummings, K Michael
2011-06-01
In the U.S., limited evidence exists on the impact of colors and brand imagery used in cigarette pack design. This study examined the impact of pack design, product descriptors, and health warnings on risk perception and brand appeal. A cross-sectional mall-intercept study was conducted with 197 adult smokers and 200 nonsmokers in Buffalo NY from June to July 2009 (data analysis from July 2009 to December 2010). Participants were shown 12 sets of packs randomly; each set varied by a particular design feature (color, descriptor) or warning label style (text versus graphic, size, attribution, message framing). Packs were rated on criteria including risk perceptions, quit motivation, and purchase interest. Participants selected larger, pictorial, and loss-framed warning labels as more likely to attract attention, encourage thoughts about health risks, motivate quitting, and be most effective. Participants were more likely to select packs with lighter color shading and descriptors such as light, silver, and smooth as delivering less tar, smoother taste, and lower health risk, compared to darker-shaded or full-flavor packs. Additionally, participants were more likely to select the branded compared to plain white pack when asked which delivered the most tar, smoothest taste, was more attractive, appealed to youth aged <18 years, and contained cigarettes of better quality. The findings support larger, graphic health warnings that convey loss-framed messages as most effective in communicating health risks to U.S. adults. The results also indicate that color and product descriptors are associated with false beliefs about risks. Plain packaging may reduce many of the erroneous misperceptions of risk communicated through pack design features. Copyright © 2011 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
The Impact of Cigarette Pack Design, Descriptors, and Warning Labels on Risk Perception in the U.S
Bansal-Travers, Maansi; Hammond, David; Smith, Philip; Cummings, K. Michael
2011-01-01
Background In the U.S., limited evidence exists on the impact of colors and brand imagery used in cigarette pack design. Purpose This study examined the impact of pack design, product descriptors, and health warnings on risk perception and brand appeal. Methods A cross-sectional mall-intercept study was conducted with 197 adult smokers and 200 nonsmokers in Buffalo, NY from June to July 2009 (data analysis from July 2009 to December 2010). Participants were shown 12 sets of packs randomly; each set varied by a particular design feature (color, descriptor) or warning label style (text vs graphic, size, attribution, message framing). Packs were rated on criteria including risk perceptions, quit motivation, and purchase interest. Results Participants selected larger, pictorial, and loss-framed warning labels as more likely to attract attention, encourage thoughts about health risks, motivate quitting, and most effective. Participants were more likely to select packs with lighter color shading and descriptors such as light, silver, and smooth as delivering less tar, smoother taste, and lower health risk, compared to darker-shaded or full flavor packs. Additionally, participants were more likely to select the branded compared to plain white pack when asked which delivered the most tar, smoothest taste, was more attractive, appealed to youth aged <18 years, and contained cigarettes of better quality. Conclusions The findings support larger, graphic health warnings that convey loss-framed messages as most effective in communicating health risks to U.S. adults. The results also indicate that color and product descriptors are associated with false beliefs about risks. Plain packaging may reduce many of the erroneous misperceptions of risk communicated through pack design features. PMID:21565661
Ren, Chong; McGrath, Colman; Yang, Yanqi
2015-09-01
To assess the effectiveness of diode low-level laser therapy (LLLT) for orthodontic pain control, a systematic and extensive electronic search for randomised controlled trials (RCTs) investigating the effects of diode LLLT on orthodontic pain prior to November 2014 was performed using the Cochrane Library (Issue 9, 2014), PubMed (1997), EMBASE (1947) and Web of Science (1956). The Cochrane tool for risk of bias evaluation was used to assess the bias risk in the chosen data. A meta-analysis was conducted using RevMan 5.3. Of the 186 results, 14 RCTs, with a total of 659 participants from 11 countries, were included. Except for three studies assessed as having a 'moderate risk of bias', the RCTs were rated as having a 'high risk of bias'. The methodological weaknesses were mainly due to 'blinding' and 'allocation concealment'. The meta-analysis showed that diode LLLT significantly reduced orthodontic pain by 39 % in comparison with placebo groups (P = 0.02). Diode LLLT was shown to significantly reduce the maximum pain intensity among parallel-design studies (P = 0.003 versus placebo groups; P = 0.000 versus control groups). However, no significant effects were shown for split-mouth-design studies (P = 0.38 versus placebo groups). It was concluded that the use of diode LLLT for orthodontic pain appears promising. However, due to methodological weaknesses, there was insufficient evidence to support or refute LLLT's effectiveness. RCTs with better designs and appropriate sample power are required to provide stronger evidence for diode LLLT's clinical applications.
Truck accidents at freeway ramps : data analysis and high-risk site identification
DOT National Transportation Integrated Search
1998-01-01
To examine the relationship of ramp design to truck accident rates, this paper presents an analysis of truck accidents in Washington State, plus a comparison to limited data from Colorado and California. The authors group freeway truck accidents by r...
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Steve A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Chris J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.
2008-08-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Christopher J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.
2010-06-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
Risk-based maintenance of ethylene oxide production facilities.
Khan, Faisal I; Haddara, Mahmoud R
2004-05-20
This paper discusses a methodology for the design of an optimum inspection and maintenance program. The methodology, called risk-based maintenance (RBM), is based on integrating a reliability approach and a risk assessment strategy to obtain an optimum maintenance schedule. First, the likely equipment failure scenarios are formulated. Out of the many likely failure scenarios, the most probable ones are subjected to a detailed study. Detailed consequence analysis is done for the selected scenarios. Subsequently, these failure scenarios are subjected to a fault tree analysis to determine their probabilities. Finally, risk is computed by combining the results of the consequence and the probability analyses. The calculated risk is compared against known acceptable criteria. The frequencies of the maintenance tasks are obtained by minimizing the estimated risk. A case study involving an ethylene oxide production facility is presented. Of the five most hazardous units considered, the pipeline used for the transportation of ethylene is found to have the highest risk. Using available failure data and a lognormal reliability distribution function, human health risk factors are calculated. Both societal risk factors and individual risk factors exceeded the acceptable risk criteria. To determine an optimal maintenance interval, a reverse fault tree analysis was used. The maintenance interval was determined such that the original high risk is brought down to an acceptable level. A sensitivity analysis is also undertaken to study the impact of changing the distribution of the reliability model, as well as the error in the distribution parameters, on the maintenance interval.
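The RBM loop described in this abstract can be sketched in a few lines: combine a fault-tree top-event probability with a consequence estimate, then shorten the maintenance interval until the risk falls below the acceptable criterion. The exponential failure model, the OR-gate fault tree, and all numbers below are hypothetical illustrations, not the paper's actual data or method details.

```python
import math

def top_event_probability(interval_years, failure_rates):
    """OR-gate fault tree: the top event occurs if any basic event occurs.

    Basic-event probabilities come from an assumed exponential failure model
    P = 1 - exp(-lambda * t) evaluated over the maintenance interval.
    """
    p_none = 1.0
    for lam in failure_rates:
        p_event = 1.0 - math.exp(-lam * interval_years)
        p_none *= (1.0 - p_event)
    return 1.0 - p_none

def maintenance_interval(failure_rates, consequence, acceptable_risk):
    """Reverse analysis: longest interval whose risk stays acceptable."""
    interval = 10.0  # start with a long interval (years)
    while interval > 0.01:
        risk = top_event_probability(interval, failure_rates) * consequence
        if risk <= acceptable_risk:
            return interval
        interval -= 0.01
    return interval

# Hypothetical pipeline unit: three failure modes (per-year rates),
# a consequence weight per failure, and an acceptable risk per year.
rates = [0.002, 0.005, 0.001]
interval = maintenance_interval(rates, consequence=2.0, acceptable_risk=1e-3)
print(f"suggested inspection interval: {interval:.2f} years")
```

The reverse step mirrors the paper's "reverse fault tree analysis": instead of computing risk from a fixed schedule, the schedule is solved for from a risk target.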
Quality-by-design approach for the development of telmisartan potassium tablets.
Oh, Ga-Hui; Park, Jin-Hyun; Shin, Hye-Won; Kim, Joo-Eun; Park, Young-Joon
2018-05-01
A quality-by-design approach was adopted to develop telmisartan potassium (TP) tablets that were bioequivalent with the commercially available Micardis® (telmisartan free base) tablets. The dissolution pattern and impurity profile of TP tablets differed from those of Micardis® tablets because telmisartan free base is poorly soluble in water. After identifying the quality target product profile and critical quality attributes (CQAs), drug dissolution and impurities were predicted to be high-risk CQAs. To determine the exact range and cause of the risks, we used the risk assessment (RA) tools preliminary hazard analysis and failure mode and effect analysis to determine the parameters affecting drug dissolution, impurities, and formulation. The range of the design space was optimized using a face-centered central composite design among the design of experiments (DOE) methods. The binder, disintegrant, and kneading time in the wet granulation were identified as X values affecting the Y values (disintegration, hardness, friability, dissolution, and impurities). After determining the design space with the desired Y values, the TP tablets were formulated and their dissolution pattern was compared with that of the reference tablet. The TP tablet formulated using the design space showed dissolution similar to that of Micardis® tablets at pH 7.5. The TP tablet developed with the QbD approach was bioequivalent to Micardis® tablets in beagle dogs.
Low-thrust mission risk analysis, with application to a 1980 rendezvous with the comet Encke
NASA Technical Reports Server (NTRS)
Yen, C. L.; Smith, D. B.
1973-01-01
A computerized failure process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust subsystem burn operation, the system failure processes, and the retargeting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to the comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that system component failure rates are the limiting factors in attaining a high mission reliability. It is also shown that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.
Dietary zinc and iron intake and risk of depression: A meta-analysis.
Li, Zongyao; Li, Bingrong; Song, Xingxing; Zhang, Dongfeng
2017-05-01
The associations between dietary zinc and iron intake and risk of depression remain controversial. Thus, we carried out a meta-analysis to evaluate these associations. A systematic search was performed in PubMed, Embase, Web of Science, Chinese National Knowledge Infrastructure (CNKI) and Wanfang databases for relevant studies up to January 2017. Pooled relative risks (RRs) with 95% confidence intervals (CIs) were calculated using a random effects model. A total of 9 studies for dietary zinc intake and 3 studies for dietary iron intake were finally included in present meta-analysis. The pooled RRs with 95% CIs of depression for the highest versus lowest dietary zinc and iron intake were 0.67 (95% CI: 0.58-0.76) and 0.57 (95% CI: 0.34-0.95), respectively. In subgroup analysis by study design, the inverse association between dietary zinc intake and risk of depression remained significant in the cohort studies and cross-sectional studies. The pooled RRs (95% CIs) for depression did not substantially change in the influence analysis and subgroup analysis by adjustment for body mass index (BMI). The present meta-analysis indicates inverse associations between dietary zinc and iron intake and risk of depression. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
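The pooled relative risks with 95% CIs quoted in this abstract come from a random-effects model. A minimal sketch of that calculation, using the standard DerSimonian-Laird estimator, is below; the input RRs and confidence intervals are invented for illustration, not the included studies' actual data.

```python
import math

def pool_random_effects(rrs, ci_lows, ci_highs):
    """Pool log-RRs with the DerSimonian-Laird between-study variance."""
    log_rr = [math.log(r) for r in rrs]
    # Standard error recovered from a 95% CI: (ln(hi) - ln(lo)) / (2 * 1.96)
    se = [(math.log(h) - math.log(l)) / (2 * 1.96)
          for l, h in zip(ci_lows, ci_highs)]
    w_fixed = [1.0 / s**2 for s in se]
    mean_fixed = sum(w * y for w, y in zip(w_fixed, log_rr)) / sum(w_fixed)
    # Cochran's Q and the DL tau-squared (between-study variance)
    q = sum(w * (y - mean_fixed) ** 2 for w, y in zip(w_fixed, log_rr))
    df = len(rrs) - 1
    c = sum(w_fixed) - sum(w**2 for w in w_fixed) / sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)
    # Random-effects weights inflate each study's variance by tau-squared.
    w_rand = [1.0 / (s**2 + tau2) for s in se]
    mean = sum(w * y for w, y in zip(w_rand, log_rr)) / sum(w_rand)
    se_pooled = math.sqrt(1.0 / sum(w_rand))
    return (math.exp(mean),
            math.exp(mean - 1.96 * se_pooled),
            math.exp(mean + 1.96 * se_pooled))

# Three hypothetical studies of zinc intake (highest vs lowest category).
rr, lo, hi = pool_random_effects(
    rrs=[0.70, 0.60, 0.75],
    ci_lows=[0.55, 0.45, 0.50],
    ci_highs=[0.89, 0.80, 1.12])
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

When heterogeneity is low (Q below its degrees of freedom), tau-squared truncates to zero and the random-effects estimate collapses to the fixed-effect one, which is the expected behavior of this estimator.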
Practical Application of PRA as an Integrated Design Tool for Space Systems
NASA Technical Reports Server (NTRS)
Kalia, Prince; Shi, Ying; Pair, Robin; Quaney, Virginia; Uhlenbrock, John
2013-01-01
This paper presents the application of the first comprehensive Probabilistic Risk Assessment (PRA) during the design phase of a joint NASA/NOAA weather satellite program, the Geostationary Operational Environmental Satellite Series R (GOES-R). GOES-R is the next-generation weather satellite, intended primarily to improve understanding of the weather and to help save human lives. PRA has been used at NASA for Human Space Flight for many years. PRA was initially adopted and implemented in the operational phase of manned space flight programs and more recently for the next generation of human space systems. Since its first use at NASA, PRA has become recognized throughout the Agency as a method of assessing complex mission risks as part of an overall approach to assuring safety and mission success throughout project lifecycles. PRA is now included as a requirement during the design phase of both NASA's next-generation manned space vehicles and high-priority robotic missions. The influence of PRA on GOES-R design and operations concepts is discussed in detail. The GOES-R PRA is unique at NASA for its early implementation. It also represents a pioneering effort to integrate risks from both the Spacecraft (SC) and the Ground Segment (GS) to fully assess the probability of achieving mission objectives. PRA analysts were actively involved in system engineering and design engineering to ensure that a comprehensive set of technical risks was correctly identified and properly understood from a design and operations perspective. The analysis included an assessment of SC hardware and software, the SC fault management system, GS hardware and software, common cause failures, human error, natural hazards, solar weather, and infrastructure (such as network and telecommunications failures, and fire). PRA findings directly resulted in design changes to reduce SC risk from micro-meteoroids. PRA results also led to design changes in several SC subsystems, e.g.,
propulsion, guidance, navigation and control (GNC), communications, mechanisms, and command and data handling (C&DH). The fault tree approach assisted in the development of the fault management system design. Human error analysis, which examined human response to failure, indicated areas where automation could reduce the overall probability of gaps in operation by half. In addition, the PRA brought to light many potential root causes of system disruptions, including earthquakes, inclement weather, solar storms, blackouts and other extreme conditions not considered in the typical reliability and availability analyses. Ultimately the PRA served to identify potential failures that, when mitigated, resulted in a more robust design, as well as to influence the program's concept of operations. The early and active integration of PRA with system and design engineering provided a well-managed approach for risk assessment that increased reliability and availability, optimized lifecycle costs, and unified the SC and GS developments.
NASA Astrophysics Data System (ADS)
Massmann, Joel; Freeze, R. Allan
1987-02-01
The risk-cost-benefit analysis developed in the companion paper (J. Massmann and R. A. Freeze, this issue) is here applied to (1) an assessment of the relative worth of containment-construction activities, site-exploration activities, and monitoring activities as components of a design strategy for the owner/operator of a waste management facility; (2) an assessment of alternative policy options available to a regulatory agency; and (3) a case history. Sensitivity analyses designed to address the first issue show that the allocation of resources by the owner/operator is sensitive to the stochastic parameters used to describe the hydraulic conductivity field at a site. For the cases analyzed, the installation of a dense monitoring network is of less value to the owner/operator than a more conservative containment design. Sensitivity analyses designed to address the second issue suggest that from a regulatory perspective, design standards should be more effective than performance standards in reducing risk, and design specifications on the containment structure should be more effective than those on the monitoring network. Performance bonds posted before construction have a greater potential to influence design than prospective penalties to be imposed at the time of failure. Siting on low-conductivity deposits is a more effective method of risk reduction than any form of regulatory influence. Results of the case history indicate that the methodology can be successfully applied at field sites.
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that 49-70% of the 154 hazardous conditions could be caused by software, or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
Ergonomic assessment for the task of repairing computers in a manufacturing company: A case study.
Maldonado-Macías, Aidé; Realyvásquez, Arturo; Hernández, Juan Luis; García-Alcaraz, Jorge
2015-01-01
Manufacturing industry workers who repair computers may be exposed to ergonomic risk factors. This project analyzes the tasks involved in the computer repair process to (1) find the risk level for musculoskeletal disorders (MSDs) and (2) propose ergonomic interventions to address any ergonomic issues. Work procedures and main body postures were video recorded and analyzed using task analysis, the Rapid Entire Body Assessment (REBA) postural method, and biomechanical analysis. High risk for MSDs was found on every subtask using REBA. Although biomechanical analysis found an acceptable mass center displacement during tasks, a hazardous level of compression on the lower back during computer's transportation was detected. This assessment found ergonomic risks mainly in the trunk, arm/forearm, and legs; the neck and hand/wrist were also compromised. Opportunities for ergonomic analyses and interventions in the design and execution of computer repair tasks are discussed.
NASA Technical Reports Server (NTRS)
Ryan, R. S.; Bullock, T.; Holland, W. B.; Kross, D. A.; Kiefling, L. A.
1981-01-01
The achievement of an optimized design from the system standpoint under the low cost, high risk constraints of the present day environment was analyzed. Space Shuttle illustrates the requirement for an analysis approach that considers all major disciplines (coupling between structures control, propulsion, thermal, aeroelastic, and performance), simultaneously. The Space Shuttle and certain payloads, Space Telescope and Spacelab, are examined. The requirements for system analysis approaches and criteria, including dynamic modeling requirements, test requirements, control requirements, and the resulting design verification approaches are illustrated. A survey of the problem, potential approaches available as solutions, implications for future systems, and projected technology development areas are addressed.
Dietary fiber intake reduces risk for Barrett's esophagus and esophageal cancer.
Sun, Lingli; Zhang, Zhizhong; Xu, Jian; Xu, Gelin; Liu, Xinfeng
2017-09-02
Observational studies suggest an association between dietary fiber intake and risk of Barrett's esophagus and esophageal cancer. However, the results are inconsistent. To conduct a meta-analysis of observational studies to assess this association. All eligible studies were identified by electronic searches in PubMed and Embase through February 2015. Dose-response, subgroup, sensitivity, and publication bias analyses were performed. A total of 15 studies involving 16,885 subjects were included in the meta-analysis. The pooled odds ratio for the highest compared with the lowest dietary fiber intake was 0.52 (95% CI, 0.43-0.64). Stratified analyses for tumor subtype, study design, geographic location, fiber type, publication year, total sample size, and quality score yielded consistent results. Dose-response analysis indicated that a 10-g/d increment in dietary fiber intake was associated with a 31% reduction in Barrett's esophagus and esophageal cancer risk. Sensitivity analysis restricted to studies with control for conventional risk factors produced similar results, and omission of any single study had little effect on the overall risk estimate. Our findings indicate that dietary fiber intake is inversely associated with risk of Barrett's esophagus and esophageal cancer. Further large prospective studies are warranted.
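The dose-response figure quoted in this abstract (a 10-g/d increment associated with a 31% risk reduction) corresponds to a log-linear model OR(d) = exp(beta * d). The sketch below shows the conversion between a per-unit log-odds slope and the OR for an arbitrary increment; the slope is back-derived from the abstract's 31% figure, and everything else is illustrative.

```python
import math

def or_for_increment(beta_per_unit, increment):
    """Odds ratio for a given exposure increment under a log-linear model."""
    return math.exp(beta_per_unit * increment)

# Back out the per-gram slope implied by OR = 0.69 per 10 g/d
# (a 31% reduction, as quoted in the abstract).
beta = math.log(0.69) / 10.0

print(or_for_increment(beta, 10))  # the 10-g/d increment: OR = 0.69
print(or_for_increment(beta, 20))  # a 20-g/d increment compounds the effect
```

Note the multiplicative structure: under this model a 20-g/d increment yields 0.69 squared, not twice the 31% reduction, which is why dose-response results are usually reported per fixed increment.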
NASA Astrophysics Data System (ADS)
Lopes, D. F.; Oliveira, M. D.; Costa, C. A. Bana e.
2015-05-01
Risk matrices (RMs) are commonly used to evaluate health and safety risks. Nonetheless, they violate some theoretical principles that compromise their feasibility and use. This study describes how multiple criteria decision analysis methods have been used to improve the design and the deployment of RMs to evaluate health and safety risks at the Occupational Health and Safety Unit (OHSU) of the Regional Health Administration of Lisbon and Tagus Valley. ‘Value risk-matrices’ (VRMs) are built with the MACBETH approach in four modelling steps: a) structuring risk impacts, involving the construction of descriptors of impact that link risk events with health impacts and are informed by scientific evidence; b) generating a value measurement scale of risk impacts, by applying the MACBETH-Choquet procedure; c) building a system for eliciting subjective probabilities that makes use of a numerical probability scale that was constructed with MACBETH qualitative judgments on likelihood; d) and defining a classification colouring scheme for the VRM. A VRM built with OHSU members was implemented in a decision support system which will be used by OHSU members to evaluate health and safety risks and to identify risk mitigation actions.
Citrus fruit intake and bladder cancer risk: a meta-analysis of observational studies.
Liang, Sudong; Lv, Gaofei; Chen, Weikai; Jiang, Jianxin; Wang, Jingqun
2014-11-01
Epidemiological studies have investigated the association between citrus fruit and bladder cancer risk; however, the results are inconsistent. To assess these issues, we conducted a meta-analysis of currently available studies. We identified relevant articles by searching the MEDLINE and EMBASE databases. We calculated the summary relative risk (RR) with 95% confidence interval (95% CI) using a random effect model. We included eight case-control studies and six cohort studies in the meta-analysis. There was a significant inverse association between citrus fruit intake and bladder cancer risk in all pooled studies (RR: 0.85; 95% CI, 0.76-0.94) and case-control studies (RR: 0.77; 95% CI, 0.64-0.92), but not in the cohort studies (RR: 0.96; 95% CI, 0.87-1.07). Our results suggest that citrus fruit intake is related to decreased bladder cancer risk. Subsequent well-designed, large prospective studies are needed to obtain better understanding of this relationship.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Komelasky, M. C.
1980-03-01
Lowry and Hoffman Associates Inc. (LHA) performed for ORI an analysis of the shipbuilding requirements for constructing an OTEC plant and of the available shipyard assets which could fulfill these requirements. In addition, several shipyards were queried concerning their attitudes towards OTEC. In assessing the shipbuilding requirements for an OTEC plant, four different platform configurations were studied and four different designs of the cold water pipe (CWP) were examined. The platforms were: a concrete ship design proposed by Lockheed; concrete spar designs with internal heat exchangers (IHE) (Rosenblatt) and external heat exchangers (XHE) (Lockheed); and a steel ship design proposed by Gibbs and Cox. The types of materials examined for CWP construction were: steel, fiber reinforced plastic (FRP), elastomer, and concrete. The report is organized into three major discussion areas. All the construction requirements are synthesized for the four platforms and CWPs, and general comments are made concerning their availability in the US. Specific shipbuilders' facilities are reviewed for their applicability to building an OTEC plant, and an assessment of the shipyards' general interest in the OTEC program is presented, providing insight into their near-term commercial outlook. The method of determining this interest will depend largely on a risk analysis of the OTEC system. Also included are factors which may comprise this analysis, and a methodology to ascertain the risk. In the appendices, various shipyard specifications are presented, shipyard assessment matrices are given, graphs of various shipyard economic outlooks are provided, and definitions of the risk factors are listed. (WHK)
Pineles, Beth L.; Park, Edward; Samet, Jonathan M.
2014-01-01
We conducted a systematic review and meta-analysis to characterize the relationship between smoking and miscarriage. We searched the PubMed database (1956–August 31, 2011) using keywords and conducted manual reference searches of included articles and reports of the US Surgeon General. The full text of 1,706 articles was reviewed, and 98 articles that examined the association between active or passive smoking and miscarriage were included in the meta-analysis. Data were abstracted by 2 reviewers. Any active smoking was associated with increased risk of miscarriage (summary relative risk ratio = 1.23, 95% confidence interval (CI): 1.16, 1.30; n = 50 studies), and this risk was greater when the smoking exposure was specifically defined as during the pregnancy in which miscarriage risk was measured (summary relative risk ratio = 1.32, 95% CI: 1.21, 1.44; n = 25 studies). The risk of miscarriage increased with the amount smoked (1% increase in relative risk per cigarette smoked per day). Secondhand smoke exposure during pregnancy increased the risk of miscarriage by 11% (95% CI: 0.95, 1.31; n = 17 studies). Biases in study publication, design, and analysis did not significantly affect the results. This finding strengthens the evidence that women should not smoke while pregnant, and all women of reproductive age should be warned that smoking increases the risk of miscarriage. PMID:24518810
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
A Meta-Analysis of the RTI Literature for Children at Risk for Reading Disabilities
ERIC Educational Resources Information Center
Tran, Loan; Sanchez, Tori; Arellano, Brenda; Swanson, H. Lee
2011-01-01
This article synthesizes the literature comparing at-risk children designated as responders and low responders to interventions in reading. The central question addressed in this review is whether individual differences in reading-related skills at pretest predict responders at posttest across a variety of interventions and sets of criteria for…
Risk Communication about Nuclear Power in Korea: One-Year Descriptive Analysis on Twitter
ERIC Educational Resources Information Center
Kim, Minkee
2013-01-01
Over the last three decades, public understanding of science (PUS) has been one of the foremost research topics in the Korean society where numerous social scientific conflicts have taken place. As a lead channel of risk communication, Twitter has been studied in experimental research designs or among target user groups, leaving the measurement of…
Decomposition-Based Failure Mode Identification Method for Risk-Free Design of Large Systems
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; Stone, Robert B.; Roberts, Rory A.; Clancy, Daniel (Technical Monitor)
2002-01-01
When designing products, it is crucial to assure failure- and risk-free operation in the intended operating environment. Failures are typically studied and eliminated as much as possible during the early stages of design. The few failures that go undetected result in unacceptable damage and losses in high-risk applications where public safety is of concern. Published NASA and NTSB accident reports point to a variety of components identified as sources of failures in the reported cases. In previous work, data from these reports were processed and placed in matrix form for all the system components and failure modes encountered, and then manipulated using matrix methods to determine similarities between the different components and failure modes. In this paper, these matrices are represented in the form of a linear combination of failure modes, mathematically formed using Principal Components Analysis (PCA) decomposition. The PCA decomposition results in a low-dimensionality representation of all failure modes and components of interest, represented in a transformed coordinate system. Such a representation opens the way for efficient pattern analysis and prediction of the failure modes with the highest potential risks on the final product, rather than making decisions based on the large space of component and failure mode data. The mathematics of the proposed method are explained first using a simple example problem. The method is then applied to component failure data gathered from helicopter accident reports to demonstrate its potential.
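The PCA step this abstract describes can be sketched as an SVD of a centred component-by-failure-mode matrix: each component is then represented by its scores on a few "principal failure modes", where similar failure behaviour clusters together. The incidence matrix and component labels below are a toy example, not data from the NASA/NTSB reports.

```python
import numpy as np

# Rows: components; columns: failure modes (counts from accident reports).
# Labels are hypothetical examples.
incidence = np.array([
    [4, 0, 1, 0],   # e.g. gearbox
    [3, 1, 0, 0],   # e.g. drive shaft
    [0, 5, 2, 1],   # e.g. hydraulic pump
    [0, 4, 3, 1],   # e.g. actuator
], dtype=float)

# Centre each failure-mode column, then PCA via SVD of the centred matrix.
centred = incidence - incidence.mean(axis=0)
u, s, vt = np.linalg.svd(centred, full_matrices=False)

# Keep the top 2 principal components: the scores place each component in a
# low-dimensional plane instead of the full failure-mode space.
scores = u[:, :2] * s[:2]
explained = (s**2 / (s**2).sum())[:2]

print("2-D component scores:\n", scores)
print("variance explained by 2 components:", explained.sum())
```

With real report data, the same decomposition supports the pattern analysis the paper aims at: components with similar score vectors share failure behaviour even if they never co-occur in a single report.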
Trajectory Design Enhancements to Mitigate Risk for the Transiting Exoplanet Survey Satellite (TESS)
NASA Technical Reports Server (NTRS)
Dichmann, Donald; Parker, Joel; Nickel, Craig; Lutz, Stephen
2016-01-01
The Transiting Exoplanet Survey Satellite (TESS) will employ a highly eccentric Earth orbit, in 2:1 lunar resonance, which will be reached with a lunar flyby preceded by 3.5 phasing loops. The TESS mission has limited propellant and several constraints on the science orbit and on the phasing loops. Based on analysis and simulation, we have designed the phasing loops to reduce delta-V (DV) and to mitigate risk due to maneuver execution errors. We have automated the trajectory design process and use distributed processing to generate optimal nominal trajectories, to check constraint satisfaction, and finally to model the effects of maneuver errors to identify the trajectories that best meet the mission requirements.
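The maneuver-error modeling this abstract mentions can be illustrated with a simple Monte Carlo: perturb each burn's magnitude, charge a cleanup burn for the accumulated error, and examine the spread of total DV. The burn list, the 1% proportional error model, and the cleanup assumption below are all invented for illustration; the real TESS analysis uses full trajectory propagation, not a scalar DV budget.

```python
import random
import statistics

def total_dv(burns_mps, sigma_frac, rng):
    """Sum of burns, each perturbed by a proportional execution error,
    plus a cleanup burn assumed proportional to the accumulated error."""
    executed = [b * (1.0 + rng.gauss(0.0, sigma_frac)) for b in burns_mps]
    cleanup = sum(abs(e - b) for e, b in zip(executed, burns_mps))
    return sum(executed) + cleanup

rng = random.Random(42)
burns = [25.0, 12.0, 8.0, 3.0]   # phasing-loop burns, m/s (hypothetical)
samples = [total_dv(burns, 0.01, rng) for _ in range(5000)]

mean = statistics.fmean(samples)
p99 = sorted(samples)[int(0.99 * len(samples))]
print(f"mean DV {mean:.2f} m/s, 99th percentile {p99:.2f} m/s")
```

A candidate trajectory whose high-percentile DV exceeds the propellant budget would be rejected in favour of one more tolerant of execution errors, which is the risk-mitigation criterion the paper applies to the phasing-loop design.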
Fahmy, Raafat; Kona, Ravikanth; Dandu, Ramesh; Xie, Walter; Claycamp, Gregg; Hoag, Stephen W
2012-12-01
As outlined in the ICH Q8(R2) guidance, identifying the critical quality attributes (CQAs) is a crucial part of dosage form development; however, the number of possible formulation and processing factors that could influence the manufacturing of a pharmaceutical dosage form is enormous, obviating formal study of all possible parameters and their interactions. Thus, the objective of this study is to examine how quality risk management can be used to prioritize the number of experiments needed to identify the CQAs, while still maintaining an acceptable product risk profile. To conduct the study, immediate-release ciprofloxacin tablets manufactured via roller compaction were used as a prototype system. Granules were manufactured using an Alexanderwerk WP120 roller compactor and tablets were compressed on a Stokes B2 tablet press. In the early stages of development, prior knowledge was systematically incorporated into the risk assessment using failure mode and effect analysis (FMEA). The factors identified using FMEA were then followed by a quantitative assessment using a Plackett-Burman screening design. Results show that by using prior experience, literature data, and preformulation data, the number of experiments could be reduced to an acceptable level, and that the use of FMEA and screening designs such as the Plackett-Burman can rationally guide the process of reducing the number of experiments to a manageable level.
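The FMEA prioritization step described in this abstract is commonly computed as a risk priority number, RPN = severity x occurrence x detectability, with only the top-ranked factors carried into the screening design. The sketch below shows that ranking; the factor names, scores, and the RPN cutoff of 100 are hypothetical, not the study's actual values.

```python
# Each factor gets severity, occurrence, detectability scores (1-10 scale).
# All names and scores below are invented for illustration.
failure_modes = {
    "roll pressure":     (7, 6, 4),
    "roll gap":          (6, 5, 4),
    "mill screen size":  (5, 4, 3),
    "lubricant level":   (8, 3, 5),
    "compression force": (7, 5, 3),
}

# Risk priority number: RPN = S x O x D.
rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
ranked = sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)

# Carry only the highest-RPN factors into the screening design
# (cutoff of 100 chosen arbitrarily for this sketch).
screening_factors = [name for name, score in ranked if score >= 100]

for name, score in ranked:
    print(f"{name:18s} RPN = {score}")
print("factors for Plackett-Burman screening:", screening_factors)
```

This is exactly the experiment-reduction mechanism the study exploits: low-RPN factors are excluded up front, so the Plackett-Burman design only has to screen the factors that prior knowledge flags as risky.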
NASA Astrophysics Data System (ADS)
Li, Leihong
A modular structural design methodology for composite blades is developed. This design method can be used to design composite rotor blades with sophisticated geometric cross-sections. The method hierarchically decomposes the highly coupled interdisciplinary rotor analysis into global and local levels. At the global level, aeroelastic response analysis and rotor trim are conducted based on multi-body dynamic models. At the local level, variational asymptotic beam sectional analysis methods are used to obtain the equivalent one-dimensional beam properties. Compared with the traditional design methodology, the proposed method is more efficient and accurate. The proposed method is then used to study three design problems that have not been investigated before. The first is the addition of manufacturing constraints to the design optimization. The introduction of manufacturing constraints complicates the optimization process; however, a design with manufacturing constraints benefits the manufacturing process and reduces the risk of violating major performance constraints. Next, a new design procedure for structural design against fatigue failure is proposed. This procedure combines fatigue analysis with the optimization process. The durability or fatigue analysis employs a strength-based model, and the design is subject to stiffness, frequency, and durability constraints. Finally, the impacts of manufacturing uncertainty on rotor blade aeroelastic behavior are investigated, and a probabilistic design method is proposed to control the impacts of uncertainty on blade structural performance. The uncertainty factors include dimensions, shapes, material properties, and service loads.
ERIC Educational Resources Information Center
Burkholder, Gary J.; Harlow, Lisa L.
2003-01-01
Tested a model of HIV behavior risk, using a fully cross-lagged, longitudinal design to illustrate the analysis of larger structural equation models. Data from 527 women who completed a survey at three time points show excellent fit of the model to the data. (SLD)
49 CFR Appendix B to Part 236 - Risk Assessment Criteria
Code of Federal Regulations, 2010 CFR
2010-10-01
... results of the application of safety design principles as noted in Appendix C to this part. The MTTHE is... fault/failure analysis must be based on the assessment of the design and implementation of all safety... associated device drivers, as well as historical performance data, analytical methods and experimental safety...
49 CFR Appendix B to Part 236 - Risk Assessment Criteria
Code of Federal Regulations, 2013 CFR
2013-10-01
... results of the application of safety design principles as noted in Appendix C to this part. The MTTHE is... fault/failure analysis must be based on the assessment of the design and implementation of all safety... associated device drivers, as well as historical performance data, analytical methods and experimental safety...
Transient risk factors for acute traumatic hand injuries: a case‐crossover study in Hong Kong
Chow, C Y; Lee, H; Lau, J; Yu, I T S
2007-01-01
Objectives: To identify remediable transient risk factors for occupational hand injuries in Hong Kong in order to guide the development of prevention strategies. Methods: A case-crossover study design was adopted. Study subjects were workers with acute hand injuries presenting to the government Occupational Medicine Unit for compensation claims within 90 days of the date of injury. Detailed information on exposures to specific transient factors during the 60 minutes prior to the occurrence of the injury, during the same time interval on the day prior to the injury, and on the usual exposure during the past work-month was obtained through telephone interviews. Both the matched-pair interval approach and the usual frequency approach were adopted to assess the associations between transient exposures in the workplace and the short-term risk of sustaining a hand injury. Results: A total of 196 injured workers were interviewed. The results of the matched-pair interval analysis agreed well with those obtained using the usual frequency analysis. Seven significant transient risk factors were identified: using malfunctioning equipment/materials, using a different work method, performing an unusual work task, working overtime, feeling ill, being distracted, and rushing, with odds ratios ranging from 10.5 to 26.0 in the matched-pair interval analysis and relative risks ranging from 8.0 to 28.3 with the usual frequency analysis. Wearing gloves was found to have a nonsignificant protective effect on the occurrence of hand injury in both analyses. Conclusions: Using the case-crossover study design for acute occupational hand injuries, seven transient risk factors that were mostly modifiable were identified. Workers and their employers should increase their awareness of these risk factors, and efforts should be made to avoid exposure to these factors by means of engineering and administrative controls supplemented by safety education and training.
PMID:16973734
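The matched-pair interval approach described in this record pairs each worker's hazard window with a matched control window; a minimal sketch of the conditional odds ratio such a design yields (the counts below are illustrative, not the study's data):

```python
def matched_pair_or(exposed_hazard_only: int, exposed_control_only: int) -> float:
    """Conditional odds ratio for 1:1 matched intervals (McNemar-type):
    discordant pairs exposed only in the hazard window divided by
    discordant pairs exposed only in the control window."""
    if exposed_control_only == 0:
        raise ValueError("OR undefined: no pairs exposed only in the control window")
    return exposed_hazard_only / exposed_control_only

# Illustrative counts (not the study's data): 42 workers exposed to a
# transient factor only in the 60 min before injury, 4 exposed only in
# the matched interval on the previous day.
print(matched_pair_or(42, 4))  # 10.5
```

Concordant pairs (exposed or unexposed in both windows) carry no information in this estimator, which is why only the discordant counts appear.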
Association between vasectomy and risk of testicular cancer: A systematic review and meta-analysis.
Duan, Haifeng; Deng, Tuo; Chen, Yiwen; Zhao, Zhijian; Wen, Yaoan; Chen, Yeda; Li, Xiaohang; Zeng, Guohua
2018-01-01
A number of researchers have reported that vasectomy is a risk factor for testicular cancer; however, this conclusion is inconsistent with a number of other published articles. Hence, we conducted this meta-analysis to assess whether vasectomy increases the risk of testicular cancer. We identified all related studies by searching the PubMed, Embase, and Cochrane Library databases from January 01, 1980 to June 01, 2017. The Newcastle-Ottawa Scale (NOS) checklist was used to assess all included non-randomized studies. Summary odds ratios (ORs) and 95% confidence intervals (CIs) were used to assess the difference in outcomes between case and control groups. Subgroup analyses were performed according to study design and country. A total of eight studies (2176 testicular cancer patients) were included in this systematic review and meta-analysis. Six articles were case-control studies, and two were cohort studies. The pooled estimate of the OR was 1.10 (95% CI: 0.93-1.30) based on the eight studies in a fixed effects model. Two subgroup analyses were performed according to study design and country; the results were consistent with the overall findings. Publication bias was assessed with Begg's test and Egger's test; neither indicated significant bias (both p values > 0.05). Our meta-analysis suggested that there was no association between vasectomy and the development of testicular cancer. More high-quality studies are warranted to further explore the association between vasectomy and risk of testicular cancer.
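The pooled OR in a fixed effects model is typically an inverse-variance average of the study-level log odds ratios; a minimal sketch, assuming each study reports an OR with a 95% CI (the example inputs are hypothetical, not the eight included studies):

```python
import math

def pooled_or_fixed(ors, cis):
    """Fixed-effect (inverse-variance) pooling of odds ratios.
    ors: study ORs; cis: matching (lower, upper) 95% CIs.
    The SE of each log OR is recovered from the CI width."""
    weights, weighted_logs = [], []
    for or_, (lo, hi) in zip(ors, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2
        weights.append(w)
        weighted_logs.append(w * math.log(or_))
    log_pooled = sum(weighted_logs) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(log_pooled),
            math.exp(log_pooled - 1.96 * se_pooled),
            math.exp(log_pooled + 1.96 * se_pooled))

# Hypothetical inputs (two made-up studies, not those in the review):
print(pooled_or_fixed([1.20, 0.95], [(0.90, 1.60), (0.80, 1.13)]))
```

Larger studies have narrower CIs, hence smaller SEs and larger weights, which is the sense in which the pooled estimate is dominated by the most precise studies.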
Analysis of wheel rim - Material and manufacturing aspects
NASA Astrophysics Data System (ADS)
Misra, Sheelam; Singh, Abhiraaj; James, Eldhose
2018-05-01
The tire in an automobile is supported by the rim of the wheel, and the rim's shape and dimensions should be adjusted to accommodate a specified tire. In this study, a car wheel rim belonging to the disc wheel category is considered. Design is an important industrial operation used to define and specify the quality of the product. Design and modelling reduce the risk of damage involved in the manufacturing process. The design of this wheel rim is performed in modelling software. After designing the model, it is imported into analysis software, which is used to calculate the different forces, stresses, torques, and pressures acting upon the rim of the wheel, reducing the time spent on manual calculations. The analysis considers two different materials, namely structural steel and aluminium. Both materials are analyzed and their performance is noted.
NASA Space Radiation Risk Project: Overview and Recent Results
NASA Technical Reports Server (NTRS)
Blattnig, Steve R.; Chappell, Lori J.; George, Kerry A.; Hada, Megumi; Hu, Shaowen; Kidane, Yared H.; Kim, Myung-Hee Y.; Kovyrshina, Tatiana; Norman, Ryan B.; Nounu, Hatem N.;
2015-01-01
The NASA Space Radiation Risk project is responsible for integrating new experimental and computational results into models that predict the risk of cancer and acute radiation syndrome (ARS) for use in mission planning and systems design, as well as current space operations. The project has several parallel efforts focused on improving NASA's radiation risk projection capability in both the near and long term. This presentation will give an overview, with selected results from these efforts, including the following topics: verification, validation, and streamlining the transition of models into decision making; relative biological effectiveness and dose rate effect estimation using a combination of stochastic track structure simulations, DNA damage model calculations, and experimental data; ARS model improvements; pathway analysis from gene expression data sets; and solar particle event probabilistic exposure calculation, including correlated uncertainties, for use in design optimization.
Capability maturity models for offshore organisational management.
Strutt, J E; Sharp, J V; Terry, E; Miles, R
2006-12-01
The goal-setting regime imposed by the UK safety regulator has important implications for an organisation's ability to manage health and safety related risks. Existing approaches to safety assurance based on risk analysis and formal safety assessments are increasingly considered unlikely to create the step-change improvement in safety to which the offshore industry aspires, and alternative approaches are being considered. One approach, which addresses the important issue of organisational behaviour and which can be applied at a very early stage of design, is the capability maturity model (CMM). The paper describes the development of a design safety capability maturity model, outlining the key processes considered necessary for safety achievement, the definition of maturity levels, and scoring methods. The paper discusses how CMM relates to regulatory mechanisms and risk-based decision making, together with the potential of applying CMM to environmental risk management.
Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis
NASA Technical Reports Server (NTRS)
Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.
2016-01-01
Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this approach produces an entire probability density function representing the range of possible Pc values given the input uncertainties, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from exercising this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from, or upgraded to, a serious risk designation on the basis of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
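A schematic of how such a Pc probability density function can be generated is Monte Carlo sampling over the input uncertainties. The 2-D small-radius Pc approximation, the lognormal covariance scale factor, and all numbers below are illustrative assumptions, not CARA's operational algorithm:

```python
import math
import random
import statistics

def pc_small_radius(miss_x, miss_y, sx, sy, hbr):
    """Crude 2-D collision probability for hbr much smaller than the
    position sigmas: Gaussian density at the miss point times the
    hard-body area (an illustrative stand-in for the full Pc integral)."""
    dens = math.exp(-0.5 * ((miss_x / sx) ** 2 + (miss_y / sy) ** 2))
    return (hbr ** 2 / (2.0 * sx * sy)) * dens

def pc_distribution(miss_x, miss_y, sx, sy, hbr, n=10000, seed=1):
    """Propagate input uncertainty into a distribution of Pc values:
    a lognormal scale factor on the covariance and a uniform perturbation
    of the hard-body radius (both magnitudes are assumed)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        k = rng.lognormvariate(0.0, 0.3)   # covariance scale uncertainty
        r = hbr * rng.uniform(0.8, 1.2)    # hard-body radius uncertainty
        samples.append(pc_small_radius(miss_x, miss_y, sx * k, sy * k, r))
    return samples

pcs = sorted(pc_distribution(200.0, 100.0, 300.0, 150.0, 10.0))
print(statistics.median(pcs))      # a point summary of the Pc PDF
print(pcs[int(0.95 * len(pcs))])   # 95th-percentile Pc for risk posture
```

Percentiles of the resulting sample can then be compared against a risk threshold, which is how a single-valued Pc criterion generalizes to a distribution.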
Cardiovascular Risk and Its Associated Factors in Health Care Workers in Colombia: A Study Protocol.
Gamboa Delgado, Edna M; Rojas Sánchez, Lyda Z; Bermon Angarita, Anderson; Rangel Díaz, Yully Andrea; Jaraba Suárez, Silvia J; Serrano Díaz, Norma C; Vega Fernández, Evaristo
2015-07-30
Cardiovascular diseases are the leading cause of mortality worldwide and, for this reason, a public health problem. In Colombia, cardiovascular diseases are the main cause of mortality, with a death rate of 152 deaths per 100,000 population. An estimated 80% of these cardiovascular events are considered avoidable. The objective of the study is to determine the prevalence of cardiovascular risk and its associated factors among the institution's workers in order to design and implement interventions in the work environment that may achieve a decrease in that risk. An analytical cross-sectional study was designed to determine cardiovascular risk and its associated factors among workers of a high-complexity health care institution. A self-administered survey will be conducted covering sociodemographic aspects, physical activity, diet, alcohol consumption, smoking, level of perceived stress, and personal and family history. In a second appointment, a physical examination will be performed, along with anthropometric measurements and blood pressure determination. Blood samples will also be taken to evaluate total and high-density lipoprotein cholesterol, triglycerides, and fasting blood sugar. A ten-year global risk for cardiovascular disease will be determined using the Framingham score. A descriptive analysis of the population's characteristics and a stratified analysis by sex, age, and occupation will be performed. Bivariate and multivariate analyses will be performed using logistic regression models to evaluate the association between cardiovascular risk and the independent variables. The research protocol was approved by the Scientific and Technical Committee and the Ethics Committee on Research of the Fundación Cardiovascular de Colombia. The protocol has already received funding and the enrollment phase will begin in the coming months.
The results of this study will give the foundation for the design, implementation, and evaluation of a program based on promoting healthy lifestyles, such as performing regular physical activity and healthy food intake in order to avoid and/or control the cardiovascular risk in the workers of a high complexity health care institution.
NASA Technical Reports Server (NTRS)
Deckert, George
2010-01-01
This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions (flood protection structures) using risk analysis methods. The application of a methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.
[Design of a risk matrix to assess sterile formulations at health care facilities].
Martín de Rosales Cabrera, A M; López Cabezas, C; García Salom, P
2014-05-01
To design a matrix for classifying sterile formulations prepared at the hospital into different risk levels. Methods: i) literature search and critical appraisal of the model proposed by the European Resolution CM/Res Ap(2011)1; ii) identification of the risks associated with the preparation process by means of the AMFE (failure mode and effects analysis) methodology; iii) estimation of the severity associated with the risks detected. After initially trying a model with numeric scoring, the classification matrix was changed to an alphabetical classification, grading each criterion from A to D. Each preparation assessed is given a 6-letter combination with three possible risk levels: low, intermediate, and high. This model was easier for risk assignment, and more reproducible. The final model analyzes 6 criteria: formulation process, administration route, the drug's safety profile, amount prepared, distribution, and susceptibility to microbiological contamination. The risk level obtained conditions the requirements of the formulation area, validity period, and storage conditions. The matrix model proposed may help health care institutions to better assess the risk of the sterile formulations they prepare, and provides information about the acceptable validity period according to storage conditions and the manufacturing area. Its use will increase the safety level of this procedure as well as help in resource planning and distribution. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
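One way a 6-letter grading could map to the three risk levels is sketched below; the aggregation rule and thresholds are invented for illustration, since the abstract does not give the paper's exact decision rules:

```python
def risk_level(grades: str) -> str:
    """grades: one A-D letter per criterion (6 criteria), A least severe.
    The aggregation rule below is an illustrative assumption only."""
    if len(grades) != 6 or not set(grades) <= set("ABCD"):
        raise ValueError("expected a 6-letter combination of A-D grades")
    worst = max(grades)  # alphabetical order matches severity order here
    if worst == "D" or grades.count("C") >= 3:
        return "high"
    if worst in "BC":
        return "intermediate"
    return "low"

print(risk_level("AABBCA"))  # intermediate
```

The appeal of a letter-combination scheme over a numeric score is visible even in this toy rule: the combination preserves which criterion drove the classification.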
Failure mode and effects analysis: a comparison of two common risk prioritisation methods.
McElroy, Lisa M; Khorzad, Rebeca; Nannicelli, Anna P; Brown, Alexandra R; Ladner, Daniela P; Holl, Jane L
2016-05-01
Failure mode and effects analysis (FMEA) is a method of risk assessment increasingly used in healthcare over the past decade. The traditional method, however, can require substantial time and training resources. The goal of this study is to compare a simplified scoring method with the traditional scoring method to determine the degree of congruence in identifying high-risk failures. An FMEA of the operating room (OR) to intensive care unit (ICU) handoff was conducted. Failures were scored and ranked using both the traditional risk priority number (RPN) and criticality-based methods and a simplified method, which designates failures as 'high', 'medium' or 'low' risk. The degree of congruence was determined by first identifying those failures determined to be critical by the traditional method (RPN≥300), and then calculating the per cent congruence with those failures designated critical by the simplified method ('high' risk). In total, 79 process failures among 37 individual steps in the OR to ICU handoff process were identified. The traditional method yielded Criticality Indices (CIs) ranging from 18 to 72 and RPNs ranging from 80 to 504. The simplified method ranked 11 failures as 'low' risk, 30 as 'medium' and 22 as 'high'. The traditional method yielded 24 failures with an RPN ≥300, of which 22 were identified as 'high' risk by the simplified method (92% agreement). The top 20% of CIs (≥60) included 12 failures, of which six were designated 'high' risk by the simplified method (50% agreement). These results suggest that the simplified method of scoring and ranking failures identified by an FMEA can be a useful tool for healthcare organisations with limited access to FMEA expertise. However, the simplified method does not offer the same degree of discrimination in the ranking of failures as the traditional method. Published by the BMJ Publishing Group Limited.
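The traditional and simplified scoring schemes compared in this study can be sketched as follows; the severity/occurrence/detectability values and the simplified method's cut-offs are illustrative assumptions, not the paper's rubric:

```python
def rpn(severity: int, occurrence: int, detectability: int) -> int:
    """Traditional FMEA risk priority number: S x O x D, each scored 1-10."""
    return severity * occurrence * detectability

def criticality(severity: int, occurrence: int) -> int:
    """Criticality index: severity x occurrence (detection is ignored)."""
    return severity * occurrence

def simplified(severity: int, occurrence: int) -> str:
    """Three-level designation; the cut-offs here are illustrative only."""
    score = severity * occurrence
    if score >= 48:
        return "high"
    if score >= 20:
        return "medium"
    return "low"

# Hypothetical failure from an OR-to-ICU handoff FMEA (not from the paper):
s, o, d = 9, 7, 8
print(rpn(s, o, d))        # 504 -- critical under the RPN >= 300 rule
print(criticality(s, o))   # 63
print(simplified(s, o))    # high
```

The comparison in the paper amounts to checking how often failures above the RPN or CI threshold land in the 'high' bucket of the coarser scheme.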
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program.
Learning Objectives: Learn how to design a process map for a radiotherapy process; learn how to perform failure modes and effects analysis for a given process; learn what fault trees are all about; learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palta, J.
2015-06-15
TU-AB-BRD-04: Development of Quality Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomadsen, B.
2015-06-15
Germany wide seasonal flood risk analysis for agricultural crops
NASA Astrophysics Data System (ADS)
Klaus, Stefan; Kreibich, Heidi; Kuhlmann, Bernd; Merz, Bruno; Schröter, Kai
2016-04-01
In recent years, large-scale flood risk analysis and mapping has gained attention. Regional to national risk assessments are needed, for example, for national risk policy developments, for large-scale disaster management planning, and in the (re-)insurance industry. Despite increasing requests for comprehensive risk assessments, some sectors have not received much scientific attention; one of these is the agricultural sector. In contrast to other sectors, agricultural crop losses depend strongly on the season. Flood probability also shows seasonal variation. Thus, the temporal superposition of high flood susceptibility of crops and high flood probability plays an important role in agricultural flood risk. To investigate this interrelation and provide a large-scale overview of agricultural flood risk in Germany, an agricultural crop loss model is used for crop susceptibility analyses, and Germany-wide seasonal flood-frequency analyses are undertaken to derive seasonal flood patterns. As a result, a Germany-wide map of agricultural flood risk is presented, along with the crop type most at risk in a specific region. The risk maps may provide guidance for federal state-wide coordinated designation of retention areas.
Lin, Chun-Li; Chang, Yen-Hsiang; Hsieh, Shih-Kai; Chang, Wen-Jen
2013-03-01
This study evaluated the risk of failure for an endodontically treated premolar with cracks of different depths extending toward the pulp chamber, restored using 3 different computer-aided design/computer-aided manufacturing (CAD/CAM) ceramic restoration configurations. Three 3-dimensional finite element models with CAD/CAM ceramic onlay, endocrown, and conventional crown restorations were constructed to perform simulations. The Weibull function was incorporated with finite element analysis to calculate the long-term failure probability relative to different load conditions. The results indicated that the stress values on the enamel, dentin, and luting cement for endocrown restorations exhibited the lowest values relative to the other 2 restoration methods. Weibull analysis revealed that the overall failure probabilities in a shallowly cracked premolar were 27%, 2%, and 1% for the onlay, endocrown, and conventional crown restorations, respectively, under normal occlusal conditions. The corresponding values were 70%, 10%, and 2% for the deeply cracked premolar. This numeric investigation suggests that the endocrown provides sufficient fracture resistance only in a shallowly cracked premolar with endodontic treatment. The conventional crown treatment can immobilize the premolar for different crack depths with lower failure risk. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
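Coupling the Weibull function with finite element stresses can be sketched with the two-parameter form P_f = 1 - exp(-(stress/sigma0)^m); the characteristic strength, modulus, and stress values below are assumed for illustration, not taken from the paper:

```python
import math

def weibull_failure_probability(stress: float, sigma0: float, m: float) -> float:
    """Two-parameter Weibull failure model: P_f = 1 - exp(-(stress/sigma0)^m).
    sigma0 is the characteristic strength; m is the Weibull modulus."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

# Assumed ceramic parameters and FE peak stresses (MPa); none are from
# the paper, which reports only the resulting failure probabilities.
for name, stress in [("onlay", 95.0), ("endocrown", 55.0), ("crown", 40.0)]:
    print(name, round(weibull_failure_probability(stress, 120.0, 8.0), 3))
```

Because m appears as an exponent, modest differences in peak stress translate into large differences in failure probability, which is consistent with the wide spread (1% to 70%) the study reports.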
Structural Element Testing in Support of the Design of the NASA Composite Crew Module
NASA Technical Reports Server (NTRS)
Kellas, Sotiris; Jackson, Wade C.; Thesken, John C.; Schleicher, Eric; Wagner, Perry; Kirsch, Michael T.
2012-01-01
In January 2007, the NASA Administrator and Associate Administrator for the Exploration Systems Mission Directorate chartered the NASA Engineering and Safety Center (NESC) to design, build, and test a full-scale Composite Crew Module (CCM). For the design and manufacturing of the CCM, the team adopted the building block approach, in which design and manufacturing risks were mitigated through manufacturing trials and structural testing at various levels of complexity. Following NASA's Structural Design Verification Requirements, a further objective was the verification of design analysis methods and the provision of design data for critical structural features. Test articles increasing in complexity, from basic material characterization coupons through structural feature elements and large structural components to full-scale structures, were evaluated. This paper discusses only four element tests, three of which include joints and one of which includes a tapering honeycomb core detail. For each test series, specimen details, instrumentation, test results, a brief analysis description, test-analysis correlation, and conclusions are included.
Liu, Huan; Wang, Xing-Chun; Hu, Guang-Hui; Guo, Zhui-Feng; Lai, Peng; Xu, Liang; Huang, Tian-Bao; Xu, Yun-Fei
2015-11-01
This meta-analysis was conducted to assess the association between fruit and vegetable intake and bladder cancer risk. Eligible studies published up to August 2014 were retrieved both through a computer search of PubMed, Embase and the Cochrane library and through a manual review of references. The summary relative risks with 95% confidence intervals (CIs) for the highest versus the lowest intakes of fruits and vegetables were calculated with random-effects models. Heterogeneity and publication bias were also evaluated. Potential sources of heterogeneity were detected with metaregression. Subgroup analyses and sensitivity analyses were also performed. A total of 27 studies (12 cohort and 15 case-control studies) were included in this meta-analysis. The summary relative risks for the highest versus lowest were 0.84 (95% CI: 0.72-0.96) for vegetable intake and 0.81 (95% CI: 0.73-0.89) for fruit intake. The dose-response analysis showed that the risk of bladder cancer decreased by 8% (relative risk=0.92; 95% CI: 0.87-0.97) and 9% (relative risk=0.91; 95% CI: 0.83-0.99) for every 200 g/day increment in vegetable and fruit consumption, respectively. Sensitivity analysis confirmed the stability of the results. Our findings suggest that intake of vegetables and fruits may significantly reduce the risk of bladder cancer. Further well-designed prospective studies are warranted to confirm these findings.
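The per-200 g/day dose-response estimates can be extrapolated under the log-linear assumption such analyses imply; a minimal sketch using the reported vegetable estimate:

```python
def rr_at_increment(rr_per_200g: float, grams_per_day: float) -> float:
    """Scale a per-200 g/day relative risk to another daily increment,
    assuming a log-linear dose-response."""
    return rr_per_200g ** (grams_per_day / 200.0)

# Reported estimate: RR = 0.92 per 200 g/day of vegetables.
print(round(rr_at_increment(0.92, 400.0), 3))  # 0.846 at +400 g/day
```

The log-linear assumption means each additional 200 g/day multiplies the risk by the same factor; it is an extrapolation convenience, not a claim the abstract makes beyond the per-increment estimate.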
Hypertension-misattributed kidney disease in African Americans.
Skorecki, Karl L; Wasser, Walter G
2013-01-01
Lipkowitz et al. extend the African American Study of Kidney Disease and Hypertension to the level of genetic epidemiology, in a case-control study design. Analysis of genotypes at the APOL1 kidney disease risk region supports a paradigm shift in which genetic risk is proximate to both kidney disease and hypertension. The findings mandate urgency in clarifying mechanisms whereby APOL1 region risk variants interact with environmental triggers to cause progressive kidney disease accompanied by dangerous hypertension.
A statistical model of operational impacts on the framework of the bridge crane
NASA Astrophysics Data System (ADS)
Antsev, V. Yu; Tolokonnikov, A. S.; Gorynin, A. D.; Reutov, A. A.
2017-02-01
The technical regulations of the Customs Union demand implementation of risk analysis for bridge crane operation at the design stage. A statistical model has been developed for performing randomized risk calculations, allowing possible operational influences on the bridge crane metal structure to be modeled in their various combinations. The statistical model has been implemented in a software product for automated calculation of the risk of bridge crane failure.
Yang, Kathleen Y; Caughey, Aaron B; Little, Sarah E; Cheung, Michael K; Chen, Lee-May
2011-09-01
Women at risk for Lynch Syndrome/HNPCC have an increased lifetime risk of endometrial and ovarian cancer. This study investigates the cost-effectiveness of prophylactic surgery versus surveillance in women with Lynch Syndrome. A decision analytic model was designed incorporating key clinical decisions and existing probabilities, costs, and outcomes from the literature. Clinical forum where risk-reducing surgery and surveillance were considered. A theoretical population of women with Lynch Syndrome at age 30 was used for the analysis. A decision analytic model was designed comparing the health outcomes of prophylactic hysterectomy with bilateral salpingo-oophorectomy at age 30 versus annual gynecologic screening versus annual gynecologic exam. The literature was searched for probabilities of different health outcomes, results of screening modalities, and costs of cancer diagnosis and treatment. Cost-effectiveness expressed in dollars per discounted life-years. Risk-reducing surgery is the least expensive option, costing $23,422 per patient for 25.71 quality-adjusted life-years (QALYs). Annual screening costs $68,392 for 25.17 QALYs; and annual examination without screening costs $100,484 for 24.60 QALYs. Further, because risk-reducing surgery leads to both the lowest costs and the highest number of QALYs, it is a dominant strategy. Risk-reducing surgery is the most cost-effective option from a societal healthcare cost perspective.
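The dominance claim above (lowest cost and highest QALYs) can be checked mechanically. The cost and QALY figures below are those quoted in the abstract; the helper function is a hypothetical sketch of a standard dominance screen:

```python
# (cost in dollars, quality-adjusted life-years) per strategy, from the abstract.
strategies = {
    "risk-reducing surgery": (23422, 25.71),
    "annual screening":      (68392, 25.17),
    "annual exam only":      (100484, 24.60),
}

def dominant(strategies):
    """Keep strategies not dominated: no other option is at least as cheap
    AND at least as effective (with a strict improvement somewhere)."""
    keep = {}
    for name, (cost, qaly) in strategies.items():
        dominated = any(
            c <= cost and q >= qaly and (c, q) != (cost, qaly)
            for c, q in strategies.values()
        )
        if not dominated:
            keep[name] = (cost, qaly)
    return keep

print(dominant(strategies))  # only the dominant strategy survives
```

With these inputs, surgery is both the cheapest and the most effective option, so the other two strategies are strictly dominated and no incremental cost-effectiveness ratio needs to be computed.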
Post mitigation impact risk analysis for asteroid deflection demonstration missions
NASA Astrophysics Data System (ADS)
Eggl, Siegfried; Hestroffer, Daniel; Thuillot, William; Bancelin, David; Cano, Juan L.; Cichocki, Filippo
2015-08-01
Even though humanity believes it has the capability to avert potentially disastrous asteroid impacts, only the realization of mitigation demonstration missions can validate this claim. Such a deflection demonstration attempt has to be cost effective, easy to validate, and safe in the sense that harmless asteroids must not be turned into potentially hazardous objects. Uncertainties in an asteroid's orbital and physical parameters, as well as those additionally introduced during a mitigation attempt, necessitate an in-depth analysis of deflection mission designs in order to dispel planetary safety concerns. We present a post-mitigation impact risk analysis of a list of potential kinetic-impactor-based deflection demonstration missions proposed in the framework of the NEOShield project. Our results confirm that mitigation-induced uncertainties have a significant influence on the deflection outcome and cannot be neglected in post-deflection impact risk studies. We show, furthermore, that deflection missions have to be assessed on an individual basis in order to ensure that asteroids are not inadvertently transported closer to the Earth at a later date. Finally, we present viable targets and mission designs for a kinetic impactor test to be launched between the years 2025 and 2032.
Analysis and Design of Complex Network Environments
2012-03-01
…and J. Lowe, "The myths and facts behind cyber security risks for industrial control systems," in Proceedings of the VDE Kongress… questions about 1) how to model them, 2) the design of experiments necessary to discover their structure (and thus adapt system inputs to optimize the)… theoretical work that clarifies fundamental limitations of complex networks with network engineering and systems biology to implement specific designs and…
NASA Technical Reports Server (NTRS)
Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.
2010-01-01
Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project.
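The Monte Carlo step of such a PDA can be sketched with a toy load-versus-capacity failure model. The Gaussian distributions below are made up for illustration and are not the actual Ares I physics models:

```python
import random

random.seed(0)  # reproducible illustration

def failure_probability(n=100_000):
    """Toy probabilistic design analysis: each trial samples the driving
    parameters from their distributions and counts a failure whenever the
    applied load exceeds the structural capacity."""
    failures = 0
    for _ in range(n):
        load = random.gauss(100.0, 15.0)      # driving parameter 1 (assumed)
        capacity = random.gauss(150.0, 20.0)  # driving parameter 2 (assumed)
        if load > capacity:
            failures += 1
    return failures / n

p = failure_probability()
print(f"estimated failure probability: {p:.4f}")
```

Here load minus capacity is normally distributed with mean -50 and standard deviation 25, so the analytic failure probability is about 0.023; the simulated estimate should land near that value, and sensitivity studies would rerun the loop while perturbing each input distribution in turn.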
Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors
NASA Astrophysics Data System (ADS)
Gheorghiu, A.-D.; Ozunu, A.
2012-04-01
The concern for the protection of critical infrastructure has been rapidly growing in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage and transmission by pipelines facilities, are considered European critical infrastructure sectors, as per Annex I of the Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above mentioned Directive. In the field of risk assessment, there are a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou, M. et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. The proposed systematic risk assessment methodology includes three basic steps: the first step (preliminary analysis) includes the identification of hazards (including possible natural hazards) for each installation/section within a given site, followed by a criterial analysis and then a detailed analysis step. 
The criterial evaluation is used as a ranking system in order to establish the priorities for the detailed risk assessment. This criterial analysis stage is necessary because the total number of installations and sections on a site can be quite large. As not all installations and sections on a site contribute significantly to the risk of a major accident occurring, it is not efficient to include all installations and sections in the detailed risk assessment, which can be time and resource consuming. The selected installations are then taken into consideration in the detailed risk assessment, which is the third step of the systematic risk assessment methodology. Following this step, conclusions can be drawn related to the overall risk characteristics of the site. The proposed methodology can as such be successfully applied to the assessment of risk related to critical infrastructure elements falling under the energy sector of Critical Infrastructure, mainly the sub-sectors oil and gas. Key words: Systematic risk assessment, criterial analysis, energy sector critical infrastructure elements
Hickey, Graeme L; Blackstone, Eugene H
2016-08-01
Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Niu, Yu-Ming; Deng, Mo-Hong; Chen, Wen; Zeng, Xian-Tao; Luo, Jie
2015-01-01
Conflicting results on the association between MTHFR polymorphism and head and neck cancer (HNC) risk have been reported. We therefore performed a meta-analysis to derive a more precise relationship between the MTHFR C677T polymorphism and HNC risk. Three online databases (PubMed, Embase, and CNKI) were searched for associations between the MTHFR C677T polymorphism and HNC risk. Twenty-three published case-control studies involving 4,955 cases and 8,805 controls were collected. Odds ratios (ORs) with 95% confidence intervals (CIs) were used to evaluate the relationship between the MTHFR C677T polymorphism and HNC risk. Sensitivity analyses, cumulative analyses, and publication bias assessment were conducted to validate the strength of the results. Overall, no significant association between the MTHFR C677T polymorphism and HNC risk was found in this meta-analysis (T versus C: OR = 1.04, 95% CI = 0.92-1.18; TT versus CC: OR = 1.15, 95% CI = 0.90-1.46; CT versus CC: OR = 1.00, 95% CI = 0.85-1.17; CT + TT versus CC: OR = 1.01, 95% CI = 0.87-1.18; TT versus CC + CT: OR = 1.11, 95% CI = 0.98-1.26). In subgroup analyses by HWE, ethnicity, study design, and cancer location, no significant associations were detected in almost all genetic models, except for a few significant risks found in thyroid cancer. This meta-analysis demonstrates that the MTHFR C677T polymorphism may not be a risk factor for the development of HNC.
Factorial analysis of trihalomethanes formation in drinking water.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2010-06-01
Disinfection of drinking water reduces pathogenic infection, but may pose risks to human health through the formation of disinfection byproducts. The effects of different factors on the formation of trihalomethanes were investigated using a statistically designed experimental program, and a predictive model for trihalomethanes formation was developed. Synthetic water samples with different factor levels were produced, and trihalomethanes concentrations were measured. A replicated fractional factorial design with center points was performed, and significant factors were identified through statistical analysis. A second-order trihalomethanes formation model was developed from 92 experiments, and the statistical adequacy was assessed through appropriate diagnostics. This model was validated using additional data from the Drinking Water Surveillance Program database and was applied to the Smiths Falls water supply system in Ontario, Canada. The model predictions were correlated strongly to the measured trihalomethanes, with correlations of 0.95 and 0.91, respectively. The resulting model can assist in analyzing risk-cost tradeoffs in the design and operation of water supply systems.
Zimmermann, Hartmut F; Hentschel, Norbert
2011-01-01
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. 
The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
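The FMEA ranking step described above can be sketched as follows. The failure modes and their 1-to-10 occurrence, severity, and detectability scores below are hypothetical illustrations, not values from the guideline's rating table:

```python
# Hypothetical biopharmaceutical process failure modes:
# (name, occurrence, severity, detectability), each scored on a 1-to-10 scale.
failure_modes = [
    ("bioreactor pH excursion", 4, 8, 3),
    ("filter integrity breach", 2, 9, 6),
    ("buffer mix-up",           3, 7, 2),
]

def rank_by_rpn(modes):
    """Risk priority number = occurrence * severity * detectability;
    higher RPN means higher priority for mitigation."""
    return sorted(((o * s * d, name) for name, o, s, d in modes), reverse=True)

for rpn, name in rank_by_rpn(failure_modes):
    print(f"RPN {rpn:4d}  {name}")
```

Note that a rarely occurring but hard-to-detect failure (the filter breach here) can outrank a more frequent one, which is exactly why detectability is scored alongside occurrence and severity.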
Multi-variant study of obesity risk genes in African Americans: The Jackson Heart Study.
Liu, Shijian; Wilson, James G; Jiang, Fan; Griswold, Michael; Correa, Adolfo; Mei, Hao
2016-11-30
Genome-wide association studies (GWAS) have been successful in identifying obesity risk genes by single-variant association analysis. For this study, we designed a staged analysis strategy and aimed to identify multi-variant effects on obesity risk among candidate genes. Our analyses were focused on 2137 African American participants with body mass index measured in the Jackson Heart Study and 657 common single nucleotide polymorphisms (SNPs) genotyped at 8 GWAS-identified obesity risk genes. The single-variant association test showed that no SNPs reached significance after multiple testing adjustment. The subsequent gene-gene interaction analysis, which was focused on SNPs with unadjusted p-value<0.10, identified 6 significant multi-variant associations. Logistic regression showed that SNPs in these associations did not have significant linear interactions; examination of genetic risk scores showed that 4 multi-variant associations had significant additive effects of risk SNPs; and haplotype association tests showed that all multi-variant associations contained one or several combinations of particular alleles or haplotypes associated with increased obesity risk. Our study showed that obesity risk genes generate multi-variant effects, which can be additive or non-linear interactions, and multi-variant study is an important supplement to existing GWAS for understanding the genetic effects of obesity risk genes. Copyright © 2016 Elsevier B.V. All rights reserved.
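An unweighted genetic risk score of the kind examined above simply counts risk alleles across SNPs per subject. The genotype counts below are entirely hypothetical and serve only to show the mechanics:

```python
# Hypothetical per-subject risk-allele counts (0/1/2) at three illustrative SNPs.
cases    = [[2, 1, 1], [1, 2, 2], [2, 2, 1], [1, 1, 2]]
controls = [[0, 1, 0], [1, 0, 1], [0, 0, 1], [1, 1, 0]]

def genetic_risk_score(genotypes):
    """Unweighted GRS: total risk-allele count per subject."""
    return [sum(g) for g in genotypes]

mean_case = sum(genetic_risk_score(cases)) / len(cases)
mean_control = sum(genetic_risk_score(controls)) / len(controls)
print(f"mean GRS: cases {mean_case:.2f}, controls {mean_control:.2f}")
```

An additive multi-variant effect, as tested in the study, shows up as a higher mean score among cases; a weighted score would multiply each allele count by its per-SNP effect size before summing.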
Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke
2015-12-02
Over the past half century, a surprising number of major pollution incidents occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at a watershed scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds and stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple routes of the "source-pathway-target" in the WTPRA. The previous approach is modified using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are adapted for application to multiple tailings ponds. The study area covers the basin of Guanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Guanting Reservoir and in its two tributary basins (i.e., Qingshui River and Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method.
Yang, Jun; Goddard, Ellen
2011-01-01
Cluster analysis is applied in this study to group Canadian households by two characteristics, their risk perceptions and risk attitudes toward beef. There are some similarities in demographic profiles, meat purchases, and bovine spongiform encephalopathy (BSE) media recall between the cluster that perceives beef to be the most risky and the cluster that has little willingness to accept the risks of eating beef. There are similarities between the medium risk perception cluster and the medium risk attitude cluster, as well as between the cluster that perceives beef to have little risk and the cluster that is most willing to accept the risks of eating beef. Regression analysis shows that risk attitudes have a larger impact on household-level beef purchasing decisions than do risk perceptions for all consumer clusters. This implies that it may be more effective to undertake policies that reduce the risks associated with eating beef, instead of enhancing risk communication to improve risk perceptions. Only for certain clusters with higher willingness to accept the risks of eating beef might enhancing risk communication increase beef consumption significantly. The different role of risk perceptions and risk attitudes in beef consumption needs to be recognized during the design of risk management policies.
Mukharya, Amit; Patel, Paresh U; Shenoy, Dinesh; Chaudhary, Shivang
2013-01-01
Lacidipine (LCDP) is a very low soluble and highly biovariable calcium channel blocker used in the treatment of hypertension. To increase its apparent solubility and to reduce its biovariability, solid dispersion fluid bed processing technology was explored, as it produces highly dispersible granules with a characteristic porous structure that enhances dispersibility, wettability, blend uniformity (by dissolving and spraying a solution of actives), flowability and compressibility of granules for tableting, and reduces variability by uniform drug-binder solution distribution on carrier molecules. The main objective of this quality risk management (QRM) study is to provide a sophisticated "robust and rugged" Fluidized Bed Process (FBP) for the preparation of LCDP tablets with desired quality (stability) and performance (dissolution) by the quality by design (QbD) concept. This study principally focuses on thorough mechanistic understanding of the FBP by which it is developed and scaled up, with knowledge of the critical risks involved in the manufacturing process analyzed by risk assessment tools such as Qualitative Initial Risk-based Matrix Analysis (IRMA) and Quantitative Failure Mode Effective Analysis (FMEA) to identify and rank parameters with potential to have an impact on In Process/Finished Product Critical Quality Attributes (IP/FP CQAs). These Critical Process Parameters (CPPs) were further refined by DoE and MVDA to develop a design space with Real Time Release Testing (RTRT), leading to implementation of a control strategy to achieve consistent finished product quality at lab scale itself and to prevent possible product failure at larger manufacturing scale.
NASA Astrophysics Data System (ADS)
Frits, Andrew P.
In the current Navy environment of undersea weapons development, the engineering aspect of design is decoupled from the development of the tactics with which the weapon is employed. Tactics are developed by intelligence experts, warfighters, and wargamers, while torpedo design is handled by engineers and contractors. This dissertation examines methods by which the conceptual design process of undersea weapon systems, including both torpedo systems and mine counter-measure systems, can be improved. It is shown that by simultaneously designing the torpedo and the tactics with which undersea weapons are used, a more effective overall weapon system can be created. In addition to integrating torpedo tactics with design, the thesis also looks at design methods to account for uncertainty. The uncertainty is attributable to multiple sources, including: lack of detailed analysis tools early in the design process, incomplete knowledge of the operational environments, and uncertainty in the performance of potential technologies. A robust design process is introduced to account for this uncertainty in the analysis and optimization of torpedo systems through the combination of Monte Carlo simulation with response surface methodology and metamodeling techniques. Additionally, various other methods that are appropriate to uncertainty analysis are discussed and analyzed. The thesis also advances a new approach towards examining robustness and risk: the treatment of probability of success (POS) as an independent variable. Examining the cost and performance tradeoffs between high and low probability of success designs, the decision-maker can make better informed decisions as to what designs are most promising and determine the optimal balance of risk, cost, and performance. Finally, the thesis examines the use of non-dimensionalization of parameters for torpedo design. 
The thesis shows that the use of non-dimensional torpedo parameters leads to increased knowledge about the scalability of torpedo systems and to better-performing designs of experiments.
Qin, Ling; Deng, Hui-Yang; Chen, Sheng-Jiang; Wei, Wei
2017-05-01
Previous epidemiologic reports on the association between cigarette smoking and risk of chronic myeloid leukaemia (CML) have remained controversial. A comprehensive meta-analysis was performed to evaluate the potential relationship between smoking and the incidence of CML. Literature published before August 2015 was systematically searched in PubMed, EMBASE and the Cochrane Library. A total of 10 studies (nine case-control studies and one cohort study) met the inclusion criteria of this meta-analysis. Odds ratios (ORs) with 95% confidence intervals (CIs) were calculated to assess the strength of the association between cigarette smoking and risk of CML. Quality assessments were performed with the Newcastle-Ottawa Scale. The I2 index was used to evaluate heterogeneity. Finally, publication bias was assessed through funnel plots and Begg's test. No significant association was observed between ever-smokers and CML when compared with non-smokers (OR = 1.13, 95% CI: 0.99-1.29) or between subgroups stratified by smoking history, gender, geographical region, study design and source of patients. Our results demonstrate that this association was stronger in individuals who smoked <20 cigarettes/day (OR = 1.72, 95% CI: 1.06-2.79) vs. individuals who smoked >20 cigarettes/day (OR = 1.24, 95% CI: 0.55-2.81). Moreover, cumulative smoking of <15, 15-30 and >30 pack-years was associated with ORs of 1.22, 1.32 and 1.39, respectively (P < 0.001, for trend). This meta-analysis suggests that smoking may significantly increase the risk of CML in a dose-dependent manner. However, additional well-designed, prospective cohort studies are required to verify these findings and identify other risk factors associated with CML.
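The dose-dependent trend can be illustrated with a simple least-squares slope of log(OR) against exposure. The odds ratios below are those quoted in the abstract, while the pack-year category midpoints are an assumption made purely for illustration:

```python
import math

# Assumed midpoints for the <15, 15-30 and >30 pack-year categories.
pack_years = [7.5, 22.5, 37.5]
odds_ratios = [1.22, 1.32, 1.39]   # pooled ORs from the abstract

def trend_slope(x, y):
    """Least-squares slope of log(OR) on exposure: a positive slope is a
    crude indicator of a dose-response relationship."""
    ly = [math.log(v) for v in y]
    xbar = sum(x) / len(x)
    ybar = sum(ly) / len(ly)
    num = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, ly))
    den = sum((xi - xbar) ** 2 for xi in x)
    return num / den

slope = trend_slope(pack_years, odds_ratios)
print(f"log(OR) slope per pack-year: {slope:.4f}")
```

A formal dose-response meta-analysis would weight each category by its variance and account for the shared reference group, but even this crude fit shows the monotonic increase the abstract reports.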
Risk Management in Biologics Technology Transfer.
Toso, Robert; Tsang, Jonathan; Xie, Jasmina; Hohwald, Stephen; Bain, David; Willison-Parry, Derek
Technology transfer of biological products is a complex process that is important for product commercialization. To achieve a successful technology transfer, the risks that arise from changes throughout the project must be managed. Iterative risk analysis and mitigation tools can be used to both evaluate and reduce risk. The technology transfer stage gate model is used as an example tool to help manage risks derived from both designed process change and unplanned changes that arise due to unforeseen circumstances. The strategy of risk assessment for a change can be tailored to the type of change. In addition, a cross-functional team and centralized documentation helps maximize risk management efficiency to achieve a successful technology transfer. © PDA, Inc. 2016.
Association of polypharmacy with fall risk among geriatric outpatients.
Kojima, Taro; Akishita, Masahiro; Nakamura, Tetsuro; Nomura, Kazushi; Ogawa, Sumito; Iijima, Katsuya; Eto, Masato; Ouchi, Yasuyoshi
2011-10-01
To investigate the association of fall risk with comorbidities and medications in geriatric outpatients in a cross-sectional design. A total of 262 outpatients (84 men and 178 women, mean age 76.2±6.8 years) were evaluated. Physical examination, clinical histories and medication profiles were obtained from each patient. History of falls in the past year, a 22-item fall risk index, a 13-point simple screening test for falls, and the time interval of a one-leg standing test were examined as markers of fall risk. On univariate analysis, older age, female sex, hypertension, osteoporosis, history of stroke, number of comorbidities, use of antihypertensives, aspirin, bisphosphonates, hypnotics and number of prescribed drugs were significantly associated with at least one of the four indices. On multiple regression analysis, the number of drugs was associated with all four indices, independent of the other factors associated in the univariate analysis. The association of number of drugs with fall risk indices was stepwise. In geriatric outpatients, polypharmacy rather than number of comorbidities was associated with fall risk. Prospective and intervention studies are needed to clarify the causal relationship between polypharmacy, comorbidities and fall risk. © 2011 Japan Geriatrics Society.
NASA Astrophysics Data System (ADS)
Strauss, B.; Dodson, D.; Kulp, S. A.; Rizza, D. H.
2016-12-01
Surging Seas Risk Finder (riskfinder.org) is an online tool for accessing extensive local projections and analysis of sea level rise; coastal floods; and land, populations, contamination sources, infrastructure and other assets that may be exposed to inundation. Risk Finder was first published in 2013 for Florida, New York and New Jersey, expanding to all states in the contiguous U.S. by 2016, when a major new version of the tool was released with a completely new interface. The revised tool was informed by hundreds of survey responses from, and conversations with, planners, local officials and other coastal stakeholders, plus consideration of modern best practices for responsive web design and user interfaces, and social science-based principles for science communication. Overarching design principles include simplicity and ease of navigation, leading to a landing page with Google-like sparsity and focus on search, and to an architecture based on search, so that each coastal zip code, city, county, state or other place type has its own webpage gathering all relevant analysis in modular, scrollable units. Millions of users have visited the Surging Seas suite of tools to date and downloaded thousands of files, for stated purposes ranging from planning to business, education and personal decisions, and from institutions ranging from local and federal government agencies to businesses, NGOs and academia.
ERIC Educational Resources Information Center
Brody, Gene H.; Yu, Tianyi; Chen, Yi-Fu; Kogan, Steven M.; Evans, Gary W.; Beach, Steven R. H.; Windle, Michael; Simons, Ronald L.; Gerrard, Meg; Gibbons, Frederick X.; Philibert, Robert A.
2013-01-01
The health disparities literature has identified a common pattern among middle-aged African Americans that includes high rates of chronic disease along with low rates of psychiatric disorders despite exposure to high levels of cumulative socioeconomic status (SES) risk. The current study was designed to test hypotheses about the developmental…
Network Type and Mortality Risk in Later Life
ERIC Educational Resources Information Center
Litwin, Howard; Shiovitz-Ezra, Sharon
2006-01-01
Purpose: The purpose of this study was to examine the association of baseline network type and 7-year mortality risk in later life. Design and Methods: We executed secondary analysis of all-cause mortality in Israel using data from a 1997 national survey of adults aged 60 and older (N = 5,055) that was linked to records from the National Death…
SYN-OP-SYS™: A Computerized Management Information System for Quality Assurance and Risk Management
Thomas, David J.; Weiner, Jayne; Lippincott, Ronald C.
1985-01-01
SYN·OP·SYS™ is a computerized management information system for quality assurance and risk management. Computer software for the efficient collection and analysis of “occurrences” and the clinical data associated with these kinds of patient events is described. The system is evaluated according to certain computer design criteria, and the system's implementation is assessed.
USDA-ARS?s Scientific Manuscript database
We investigated the association between vitamin D status, assessed by plasma 25-hydroxyvitamin D, and risk of incident diabetes. The research design and methods were a prospective observational study with a mean follow-up of 2.7 years in the Diabetes Prevention Program (DPP), a multi-center trial co...
Risk-based zoning for urbanizing floodplains.
Porse, Erik
2014-01-01
Urban floodplain development brings economic benefits and enhanced flood risks. Rapidly growing cities must often balance the economic benefits and increased risks of floodplain settlement. Planning can provide multiple flood mitigation and environmental benefits by combining traditional structural measures such as levees, increasingly popular landscape and design features (green infrastructure), and non-structural measures such as zoning. Flexibility in both structural and non-structural options, including zoning procedures, can reduce flood risks. This paper presents a linear programming formulation to assess cost-effective urban floodplain development decisions that consider benefits and costs of development along with expected flood damages. It uses a probabilistic approach to identify combinations of land-use allocations (residential and commercial development, flood channels, distributed runoff management) and zoning regulations (development zones in channel) to maximize benefits. The model is applied to a floodplain planning analysis for an urbanizing region in the Baja Sur peninsula of Mexico. The analysis demonstrates how (1) economic benefits drive floodplain development, (2) flexible zoning can improve economic returns, and (3) cities can use landscapes, enhanced by technology and design, to manage floods. The framework can incorporate additional green infrastructure benefits, and bridges typical disciplinary gaps for planning and engineering.
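The trade-off the abstract describes can be illustrated with a brute-force sketch. This is not the paper's linear program; the benefit, damage, and flood-probability numbers below are invented purely to show how expected flood damage can make it worthwhile to reserve land for flood channels:

```python
from itertools import product

def best_allocation(total_area=100.0, step=10.0):
    """Brute-force sketch of a floodplain planning trade-off (illustrative
    numbers only): split land between development and flood channels,
    maximizing net benefit minus expected annual flood damage."""
    benefit_per_unit = {"residential": 5.0, "commercial": 8.0}
    damage_per_unit = {"residential": 20.0, "commercial": 12.0}
    best = None
    steps = int(total_area / step) + 1
    for r, c in product(range(steps), repeat=2):
        res, com = r * step, c * step
        channel = total_area - res - com
        if channel < 0:
            continue
        # More channel capacity -> lower annual flood probability (assumed).
        p_flood = max(0.01, 0.25 - 0.01 * channel)
        expected_damage = p_flood * (damage_per_unit["residential"] * res
                                     + damage_per_unit["commercial"] * com)
        net = (benefit_per_unit["residential"] * res
               + benefit_per_unit["commercial"] * com - expected_damage)
        if best is None or net > best[0]:
            best = (net, res, com, channel)
    return best

net, res, com, channel = best_allocation()
# With these made-up numbers the optimum reserves some land as flood
# channel: the avoided expected damages exceed the forgone development.
print(net, res, com, channel)
```

A real model of this kind would be solved as a linear program with continuous allocations and probabilistic flood stages, but the structure of the objective (development benefit minus probability-weighted damage) is the same.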
Quesada, Jose Antonio; Melchor, Inmaculada; Nolasco, Andreu
2017-05-26
The analysis of spatio-temporal patterns of disease or death in urban areas has mainly been developed through the ecological studies approach. These designs can have limitations such as the ecological fallacy and instability when cases are few. The objective of this study was to apply the point process methodology, as a complement to aggregated-data approaches, to study HIV/AIDS mortality in men in the city of Alicante (Spain). A case-control study of residents in the city during the period 2004-2011 was designed. Cases were men who died from HIV/AIDS, and controls represented the general population, matched by age to cases. The risk surfaces of death over the city were estimated using the log-risk function of intensities, and their temporal variation between the two periods was contrasted. Significant high-risk areas of death from HIV/AIDS, which coincide with the most deprived areas of the city, were detected. No significant spatial change in the at-risk areas between the periods studied was detected. The point process methodology is a useful tool for analysing the patterns of death from HIV/AIDS in urban areas.
2014-01-01
Background Physical activity has been inversely associated with risk of several cancers. We performed a systematic review and meta-analysis to evaluate the association between physical activity and risk of esophageal cancer (esophageal adenocarcinoma [EAC] and/or esophageal squamous cell carcinoma [ESCC]). Methods We conducted a comprehensive search of bibliographic databases and conference proceedings from inception through February 2013 for observational studies that examined associations between recreational and/or occupational physical activity and esophageal cancer risk. Summary adjusted odds ratio (OR) estimates with 95% confidence intervals (CI) were estimated using the random-effects model. Results The analysis included 9 studies (4 cohort, 5 case–control) reporting 1,871 cases of esophageal cancer among 1,381,844 patients. Meta-analysis demonstrated that the risk of esophageal cancer was 29% lower among the most physically active compared to the least physically active subjects (OR, 0.71; 95% CI, 0.57-0.89), with moderate heterogeneity (I2 = 47%). On histology-specific analysis, physical activity was associated with a 32% decreased risk of EAC (4 studies, 503 cases of EAC; OR, 0.68; 95% CI, 0.55-0.85) with minimal heterogeneity (I2 = 0%). There were only 3 studies reporting the association between physical activity and risk of ESCC with conflicting results, and the meta-analysis demonstrated a null association (OR, 1.10; 95% CI, 0.21-5.64). The results were consistent across study design, geographic location and study quality, with a non-significant trend towards a dose–response relationship. Conclusions Meta-analysis of published observational studies indicates that physical activity may be associated with reduced risk of esophageal adenocarcinoma. Lifestyle interventions focusing on increasing physical activity may decrease the global burden of EAC. PMID:24886123
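The random-effects pooling reported in this abstract can be sketched with the DerSimonian-Laird estimator, which recovers standard errors from the 95% CIs, estimates the between-study variance τ², and reports I². The three studies below are hypothetical, not the nine included in this meta-analysis:

```python
import math

def dersimonian_laird(or_list, ci_list):
    """Pool odds ratios under a random-effects model (DerSimonian-Laird).
    or_list: point estimates; ci_list: (lower, upper) 95% CIs."""
    y = [math.log(o) for o in or_list]
    # SE recovered from the 95% CI width on the log scale
    se = [(math.log(u) - math.log(l)) / (2 * 1.96) for l, u in ci_list]
    w = [1 / s ** 2 for s in se]                    # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                   # between-study variance
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    wr = [1 / (s ** 2 + tau2) for s in se]          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    se_p = math.sqrt(1 / sum(wr))
    lo, hi = pooled - 1.96 * se_p, pooled + 1.96 * se_p
    return math.exp(pooled), (math.exp(lo), math.exp(hi)), i2

# Hypothetical studies of activity vs. cancer risk (not the paper's data)
pooled_or, (lo, hi), i2 = dersimonian_laird(
    [0.55, 0.80, 0.75], [(0.35, 0.86), (0.60, 1.07), (0.52, 1.08)])
```

When the studies are homogeneous (Q ≤ df), τ² is truncated at zero and the estimator reduces to the fixed-effect inverse-variance pool.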
Process-based Cost Estimation for Ramjet/Scramjet Engines
NASA Technical Reports Server (NTRS)
Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John
2003-01-01
Process-based cost estimation plays a key role in effecting the cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a process-based cost estimation methodology bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As the product design progresses and matures, the lower-level, more detailed cost drivers can be revisited and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
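The analytic hierarchy process mentioned in the abstract reduces pairwise comparison judgments among criteria to priority weights; a common approximation uses the row geometric mean of the comparison matrix. The matrix below is hypothetical and does not come from the NASA GRC/Boeing model:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights by the row geometric mean of a
    reciprocal pairwise-comparison matrix."""
    gm = [math.prod(row) ** (1 / len(row)) for row in pairwise]
    s = sum(gm)
    return [g / s for g in gm]

# Hypothetical judgments: cost-risk vs. performance vs. schedule,
# where entry [i][j] says how much criterion i outweighs criterion j.
w = ahp_weights([[1,   3,   5],
                 [1/3, 1,   2],
                 [1/5, 1/2, 1]])
print(w)  # weights sum to 1; cost-risk dominates under these judgments
```

The exact AHP solution is the principal eigenvector of the matrix; the geometric-mean approximation coincides with it for consistent matrices and is close for mildly inconsistent ones.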
DOE Office of Scientific and Technical Information (OSTI.GOV)
Almajali, Anas; Rice, Eric; Viswanathan, Arun
This paper presents a systems analysis approach to characterizing the risk of a Smart Grid to a load-drop attack. A characterization of the risk is necessary for the design of detection and remediation strategies to address the consequences of such attacks. Using concepts from systems health management and system engineering, this work (a) first identifies metrics that can be used to generate constraints for security features, and (b) lays out an end-to-end integrated methodology using separate network and power simulations to assess system risk. We demonstrate our approach by performing a systems-style analysis of a load-drop attack implemented over the AMI subsystem and targeted at destabilizing the underlying power grid.
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system-integration risks, such as those attributable to manufacturing and assembly, which often dominate component-level risk. While the consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to inform the determination of whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood in the absence of system-level test data or operational data. This paper establishes a method and approach for identifying the pitfalls and precautions of accepting risk based solely upon predicted failure data, and provides a set of guidelines that may be useful in arriving at a more realistic quantification of risk prior to acceptance by a program.
Improved FTA methodology and application to subsea pipeline reliability design.
Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan
2014-01-01
An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.
Automotive Stirling Engine Mod 1 Design Review, Volume 1
NASA Technical Reports Server (NTRS)
1982-01-01
Risk assessment and safety analysis of the Automotive Stirling Engine (ASE) Mod I, design criteria and materials properties for the ASE Mod I and reference engines, combustion air blower development, and the Mod I engine starter motor are discussed. The Stirling engine system, external heat system, hot engine system, cold engine system, and engine drive system are also discussed.
A Field Programmable Gate Array Based Software Defined Radio Design for the Space Environment
2009-12-01
…and risk involved with launching a new satellite [2]. An FPGA design with potential for space applications was presented in [3]. This initial SDR…
A Strategic Approach to Medical Care for Exploration Missions
NASA Technical Reports Server (NTRS)
Antonsen, E.; Canga, M.
2016-01-01
Exploration missions will present significant new challenges to crew health, including effects of variable gravity environments, limited communication with Earth-based personnel for diagnosis and consultation for medical events, limited resupply, and limited ability for crew return. Providing health care capabilities for exploration class missions will require system trades be performed to identify a minimum set of requirements and crosscutting capabilities which can be used in design of exploration medical systems. Current and future medical data, information, and knowledge must be cataloged and put in formats that facilitate querying and analysis. These data may then be used to inform the medical research and development program through analysis of risk trade studies between medical care capabilities and system constraints such as mass, power, volume, and training. These studies will be used to define a Medical Concept of Operations to facilitate stakeholder discussions on expected medical capability for exploration missions. Medical Capability as a quantifiable variable is proposed as a surrogate risk metric and explored for trade space analysis that can improve communication between the medical and engineering approaches to mission design. The resulting medical system approach selected will inform NASA mission architecture, vehicle, and subsystem design for the next generation of spacecraft.
Product Quality Improvement Using FMEA for Electric Parking Brake (EPB)
NASA Astrophysics Data System (ADS)
Dumitrescu, C. D.; Gruber, G. C.; Tişcă, I. A.
2016-08-01
One of the most frequently used methods for improving product quality is FMEA (Failure Modes and Effects Analysis). Several variants are known in the literature, depending on the application and the targets; among them are Process FMEA and Failure Mode, Effects and Criticality Analysis (FMECA). Whichever variant the work team adopts, the goal of the method is the same: to optimize product design activities in research, design and manufacturing processes, and to optimize the product's use by its beneficiaries. According to a market survey of parts suppliers to vehicle manufacturers, the FMEA method is used in 75% of cases. One purpose of applying the method is to detect any errors that remain after research and product development are considered complete; another is to initiate appropriate measures to avoid mistakes. Achieving these two goals means errors are avoided already in the design phase of the product, thereby preventing additional costs in later stages of product manufacturing. The FMEA method is applied using standardized forms, which establish the initial assemblies of the product structure, in which all components are initially regarded as error-free. This work applies the FMEA method to optimize the quality of the components of the Electric Parking Brake (EPB). The EPB is a component attached to the wheel brake system that replaces the conventional mechanical parking brake while ensuring comfort, functionality and durability and saving space in the passenger compartment.

The paper describes the levels addressed in applying FMEA, the working arrangements at the four distinct levels of analysis, and how the Risk Priority Number (RPN) is determined; it also presents the analysis of risk factors and the measures the authors established to reduce or completely eliminate risk in this complex product.
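The Risk Priority Number referenced above is conventionally the product of severity, occurrence, and detection ratings, each scored 1-10. A minimal sketch; the ratings and the action threshold below are illustrative, not the authors':

```python
def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number: product of three 1-10 FMEA ratings."""
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("ratings must be between 1 and 10")
    return severity * occurrence * detection

# Hypothetical failure mode: EPB actuator jam
print(rpn(8, 4, 3))  # 96 -> compared against an action threshold, e.g. 100
```

Failure modes whose RPN exceeds the agreed threshold get corrective measures, after which the ratings are re-scored and the RPN recomputed.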
Vascular Disease, ESRD, and Death: Interpreting Competing Risk Analyses
Coresh, Josef; Segev, Dorry L.; Kucirka, Lauren M.; Tighiouart, Hocine; Sarnak, Mark J.
2012-01-01
Summary Background and objectives Vascular disease, a common condition in CKD, is a risk factor for mortality and ESRD. Optimal patient care requires accurate estimation and ordering of these competing risks. Design, setting, participants, & measurements This is a prospective cohort study of screened (n=885) and randomized participants (n=837) in the Modification of Diet in Renal Disease study (original study enrollment, 1989–1992), evaluating the association of vascular disease with ESRD and pre-ESRD mortality using standard survival analysis and competing risk regression. Results The method of analysis resulted in markedly different estimates. Cumulative incidence by standard analysis (censoring at the competing event) implied that, with vascular disease, the 15-year incidence was 66% and 51% for ESRD and pre-ESRD death, respectively. A more accurate representation of absolute risk was estimated with competing risk regression: 15-year incidence was 54% and 29% for ESRD and pre-ESRD death, respectively. For the association of vascular disease with pre-ESRD death, estimates of relative risk by the two methods were similar (standard survival analysis adjusted hazard ratio, 1.63; 95% confidence interval, 1.20–2.20; competing risk regression adjusted subhazard ratio, 1.57; 95% confidence interval, 1.15–2.14). In contrast, the hazard and subhazard ratios differed substantially for other associations, such as GFR and pre-ESRD mortality. Conclusions When competing events exist, absolute risk is better estimated using competing risk regression, but etiologic associations by this method must be carefully interpreted. The presence of vascular disease in CKD decreases the likelihood of survival to ESRD, independent of age and other risk factors. PMID:22859747
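The gap between the two sets of estimates above (e.g., 66% vs. 54% for ESRD) arises mechanically: censoring at the competing event inflates the apparent cumulative incidence. A minimal sketch with made-up follow-up data (event code 1 = event of interest, 2 = competing event, 0 = censored):

```python
def cif_vs_naive_km(times, events):
    """Compare the Aalen-Johansen cumulative incidence of event 1 with the
    naive 1 - Kaplan-Meier estimate that censors competing events."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0   # all-cause survival S(t-)
    cif = 0.0    # cumulative incidence of event 1
    km = 1.0     # KM "survival" treating event 2 as censoring
    i = 0
    while i < len(data):
        t = data[i][0]
        d1 = d2 = 0
        while i < len(data) and data[i][0] == t:
            d1 += data[i][1] == 1
            d2 += data[i][1] == 2
            i += 1
        cif += surv * d1 / at_risk      # events discounted by prior survival
        surv *= 1 - (d1 + d2) / at_risk
        km *= 1 - d1 / at_risk          # competing events merely leave risk set
        at_risk = len(data) - i
    return cif, 1 - km

cif, naive = cif_vs_naive_km([1, 2, 3, 4], [1, 2, 1, 0])
print(cif, naive)  # (0.5, 0.625): the naive estimate overstates absolute risk
```

With no competing events the two estimators coincide; the divergence grows with the competing-event rate, which is why the abstract's standard-analysis incidences sum to more than 100%.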
Herrick, Cynthia J.; Yount, Byron W.; Eyler, Amy A.
2016-01-01
Objective Diabetes is a growing public health problem, and the environment in which people live and work may affect diabetes risk. The goal of this study was to examine the association between multiple aspects of environment and diabetes risk in an employee population. Design This was a retrospective cross-sectional analysis. Home environment variables were derived using employee zip code. Descriptive statistics were run on all individual and zip code level variables, stratified by diabetes risk and worksite. A multivariable logistic regression analysis was then conducted to determine the strongest associations with diabetes risk. Setting Data was collected from employee health fairs in a Midwestern health system 2009–2012. Subjects The dataset contains 25,227 unique individuals across four years of data. From this group, using an individual’s first entry into the database, 15,522 individuals had complete data for analysis. Results The prevalence of high diabetes risk in this population was 2.3%. There was significant variability in individual and zip code level variables across worksites. From the multivariable analysis, living in a zip code with higher percent poverty and higher walk score was positively associated with high diabetes risk, while living in a zip code with higher supermarket density was associated with a reduction in high diabetes risk. Conclusions Our study underscores the important relationship between poverty, home neighborhood environment, and diabetes risk, even in a relatively healthy employed population, and suggests a role for the employer in promoting health. PMID:26638995
NASA Technical Reports Server (NTRS)
McLeod, Ken; Stoltzfus, Joel
2006-01-01
Oxygen relief systems present a serious fire hazard with often severe consequences. This presentation offers a risk management strategy that encourages minimizing ignition hazards, using the most compatible materials, and following good practices. Additionally, the relief system should be designed for cleanability and ballistic flow. The use of the right metals, softgoods, and lubricants, along with the best assembly techniques, is stressed. Materials should also be tested if data are not available, and a full hazard analysis should be conducted to minimize risk and harm.
Adaptive approaches to biosecurity governance.
Cook, David C; Liu, Shuang; Murphy, Brendan; Lonsdale, W Mark
2010-09-01
This article discusses institutional changes that may facilitate an adaptive approach to biosecurity risk management where governance is viewed as a multidisciplinary, interactive experiment acknowledging uncertainty. Using the principles of adaptive governance, evolved from institutional theory, we explore how the concepts of lateral information flows, incentive alignment, and policy experimentation might shape Australia's invasive species defense mechanisms. We suggest design principles for biosecurity policies emphasizing overlapping complementary response capabilities and the sharing of invasive species risks via a polycentric system of governance. © 2010 Society for Risk Analysis
Analysis of recreational closed-circuit rebreather deaths 1998-2010.
Fock, Andrew W
2013-06-01
Since the introduction of recreational closed-circuit rebreathers (CCRs) in 1998, there have been many recorded deaths. Rebreather deaths have been quoted to be as high as 1 in 100 users. Rebreather fatalities between 1998 and 2010 were extracted from the Deeplife rebreather mortality database, and inaccuracies were corrected where known. Rebreather absolute numbers were derived from industry discussions and training agency statistics. Relative numbers and brands were extracted from the Rebreather World website database and a Dutch rebreather survey. Mortality was compared with data from other databases. A fault-tree analysis of rebreathers was compared to that of open-circuit scuba of various configurations. Finally, a risk analysis was applied to the mortality database. The 181 recorded recreational rebreather deaths occurred at about 10 times the rate of deaths amongst open-circuit recreational scuba divers. No particular brand or type of rebreather was over-represented. Closed-circuit rebreathers have a 25-fold increased risk of component failure compared to a manifolded twin-cylinder open-circuit system. This risk can be offset by carrying a redundant 'bailout' system. Two-thirds of fatal dives were associated with a high-risk dive or high-risk behaviour. There are multiple points in the human-machine interface (HMI) during the use of rebreathers that can result in errors that may lead to a fatality. While rebreathers have an intrinsically higher risk of mechanical failure as a result of their complexity, this can be offset by good design incorporating redundancy and by carrying adequate 'bailout' or alternative gas sources for decompression in the event of a failure. Designs that minimize the chances of HMI errors and training that highlights this area may help to minimize fatalities.
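The 25-fold component-failure figure and the bailout offset follow from elementary reliability arithmetic. The per-component failure probability below is invented purely to show the shape of the argument:

```python
def p_system_fail(p_components):
    """Series system: fails if any one component fails (independence assumed)."""
    prod = 1.0
    for p in p_components:
        prod *= 1 - p
    return 1 - prod

def p_with_bailout(p_primary, p_bailout):
    """Fatal outcome only if the primary AND an independent bailout both fail."""
    return p_primary * p_bailout

# Hypothetical per-dive component failure probability (illustrative only)
ccr = p_system_fail([0.001] * 25)   # rebreather: many critical components
oc = p_system_fail([0.001])         # simple open-circuit gas path
print(round(ccr / oc, 1))           # close to the quoted 25-fold figure
```

The multiplication in `p_with_bailout` is the arithmetic behind the abstract's conclusion: carrying a redundant bailout turns a moderately likely mechanical failure into the product of two small probabilities.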
NASA Astrophysics Data System (ADS)
Dhakal, N.; Jain, S.
2013-12-01
Rare and unusually large events (such as hurricanes and floods) can create unusual and interesting trends in statistics. The Generalized Extreme Value (GEV) distribution is usually used to describe extreme rainfall events statistically. A number of recent studies have shown that the frequency of extreme rainfall events has increased over the last century and, as a result, the parameters of the GEV distribution have changed with time (non-stationarity). But what impact does a single unusually large rainfall event (e.g., Hurricane Irene) have on the GEV parameters, and consequently on the level of risk or the return periods used in designing civil infrastructure? In other words, if such a large event occurs today, how will it influence the level of risk (estimated from past rainfall records) for civil infrastructure? To answer these questions, we performed a sensitivity analysis of the GEV distribution parameters, as well as the return periods, to unusually large outlier events. Long-term precipitation records over the period 1981-2010 from 12 USHCN stations across the state of Maine were used for the analysis. For most of the stations, the addition of each outlier event caused an increase in the shape parameter with a large decrease in the corresponding return period. This is a key consideration for time-varying engineering design. Such isolated extreme weather events should be considered alongside traditional statistical methodology for extreme events when designing civil infrastructure (such as dams, bridges, and culverts). This analysis is also useful for understanding the statistical uncertainty of projecting extreme events into the future.
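The sensitivity described here, a small increase in the GEV shape parameter collapsing a design return period, can be sketched directly from the GEV quantile function. The location, scale, and shape values below are illustrative, not the fitted USHCN parameters:

```python
import math

def return_level(mu, sigma, xi, T):
    """GEV quantile: depth with return period T years (xi != 0)."""
    return mu + sigma / xi * ((-math.log(1 - 1 / T)) ** (-xi) - 1)

def return_period(mu, sigma, xi, z):
    """Return period of depth z under GEV(mu, sigma, xi)."""
    t = (1 + xi * (z - mu) / sigma) ** (-1 / xi)
    return 1 / (1 - math.exp(-t))

# Depth of the nominal 100-year event under shape xi = 0.10 (units: mm, say)
z100 = return_level(60.0, 15.0, 0.10, 100.0)
# If an outlier shifts the fitted shape to 0.15, that same depth recurs
# far more often than once a century.
print(return_period(60.0, 15.0, 0.15, z100))  # well under 100 years
```

The two functions are exact inverses for matching parameters, so the change in return period isolates the effect of the shape parameter alone.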
Risk of malignancy in ankylosing spondylitis: a systematic review and meta-analysis.
Deng, Chuiwen; Li, Wenli; Fei, Yunyun; Li, Yongzhe; Zhang, Fengchun
2016-08-18
Current knowledge about the overall and site-specific risk of malignancy associated with ankylosing spondylitis (AS) is inconsistent. We conducted a systematic review and meta-analysis to address this knowledge gap. Five databases (PubMed, EMBASE, Web of Science, the Cochrane Library and the Virtual Health Library) were systematically searched. A manual search of publications within the last 2 years in key journals in the field (Annals of the Rheumatic Diseases, Rheumatology and Arthritis & Rheumatology) was also performed. STATA 11.2 software was used to conduct the meta-analysis. After screening, twenty-three studies, of different designs, were eligible for meta-analysis. AS is associated with a 14% (pooled RR 1.14; 95% CI 1.03-1.25) increase in the overall risk of malignancy. Compared to controls, patients with AS are at a specifically increased risk of malignancy of the digestive system (pooled RR 1.20; 95% CI 1.01 to 1.42), multiple myeloma (pooled RR 1.92; 95% CI 1.37 to 3.69) and lymphoma (pooled RR 1.32; 95% CI 1.11 to 1.57). On subgroup analysis, evidence from high-quality cohort studies indicated that AS patients from Asia are at the highest overall risk of malignancy. Confirmation of these findings in large-scale longitudinal studies is needed to identify specific risk factors and to evaluate treatment effects.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
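The core PFA move, propagating parameter uncertainty through an engineering failure model to obtain a failure probability, can be sketched with a toy stress-strength model. The distributions below are invented for illustration and are not from the report:

```python
import random

def failure_probability(n=200_000, seed=42):
    """Monte Carlo sketch of a probabilistic failure assessment: a failure
    occurs when the applied stress exceeds the component strength.
    Both quantities are uncertain (illustrative normal distributions)."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        stress = rng.gauss(400.0, 40.0)     # applied load, MPa (assumed)
        strength = rng.gauss(550.0, 50.0)   # material strength, MPa (assumed)
        if stress > strength:
            fails += 1
    return fails / n

print(failure_probability())  # near the closed-form value of about 1%
```

For this normal-normal case the answer is available in closed form (the probability that a N(150, √4100) margin is negative), so the simulation mainly illustrates the structure: in PFA the "stress > strength" test would be replaced by a full engineering failure model, and the resulting distribution would then be updated with test and flight experience.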
NASA Astrophysics Data System (ADS)
Dentoni, Marta; Deidda, Roberto; Paniconi, Claudio; Marrocu, Marino; Lecca, Giuditta
2014-05-01
Seawater intrusion (SWI) has become a major threat to coastal freshwater resources, particularly in the Mediterranean basin, where the problem is exacerbated by the lack of appropriate groundwater resources management and carries serious potential impacts under projected climate change. A proper analysis and risk assessment that includes climate scenarios is essential for the design of water management measures to mitigate the environmental and socio-economic impacts of SWI. In this study a methodology for SWI risk analysis in coastal aquifers is developed and applied to the Gaza Strip coastal aquifer in Palestine. The method is based on the origin-pathway-target model, evaluating the final value of SWI risk by applying the overlay principle to the hazard map (representing the origin of SWI), the vulnerability map (representing the pathway of groundwater flow) and the elements map (representing the target of SWI). Results indicate the important role of groundwater simulation in SWI risk assessment and illustrate how mitigation measures can be developed according to predefined criteria to arrive at quantifiable expected benefits. Keywords: climate change, coastal aquifer, seawater intrusion, risk analysis, simulation/optimization model. Acknowledgements: the study is partially funded by the project "Climate Induced Changes on the Hydrology of Mediterranean Basins (CLIMB)", FP7-ENV-2009-1, GA 244151.
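The origin-pathway-target overlay amounts to a cell-by-cell combination of the three maps. A minimal sketch with 0-1 scores; the study's actual classification scheme and scoring are not reproduced here:

```python
def risk_overlay(hazard, vulnerability, elements):
    """Combine hazard (origin), vulnerability (pathway) and elements (target)
    maps cell by cell; each input is a grid of scores in [0, 1]."""
    return [[h * v * e for h, v, e in zip(hr, vr, er)]
            for hr, vr, er in zip(hazard, vulnerability, elements)]

haz = [[0.9, 0.2], [0.5, 0.1]]   # e.g. simulated seawater-intrusion hazard
vul = [[0.8, 0.8], [0.3, 0.3]]   # aquifer pathway vulnerability
ele = [[1.0, 0.5], [1.0, 0.5]]   # exposed wells / land use at the target
print(risk_overlay(haz, vul, ele)[0][0])  # 0.72: the highest-risk cell
```

Practical implementations typically overlay classed scores in a GIS rather than multiplying raw values, but the structure is the same: risk is high only where a hazardous origin, a vulnerable pathway, and exposed targets coincide.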
Merlo, Juan; Ohlsson, Henrik; Chaix, Basile; Lichtenstein, Paul; Kawachi, Ichiro; Subramanian, S V
2013-01-01
Neighborhood socioeconomic disadvantage is associated with increased individual risk of ischemic heart disease (IHD). However, the value of this association for causal inference is uncertain. Moreover, neighborhoods are often defined by available administrative boundaries without evaluating to what degree these boundaries embrace a relevant socio-geographical context that conditions individual differences in IHD risk. Therefore, we performed an analysis of variance and compared the associations obtained by conventional multilevel analyses with those from a quasi-experimental family-based design that provides stronger evidence for causal inference. Linking the Swedish Multi-Generation Register to several other national registers, we analyzed 184,931 families embracing 415,540 full brothers 45-64 years old in 2004, and residing in 8408 small-area market statistics (SAMS) considered as "neighborhoods" in our study. We investigated the association between low neighborhood income (categorized in groups by deciles) and IHD risk in the next four years. We distinguished between family mean and intrafamilial-centered low neighborhood income, which allowed us to investigate both unrelated individuals from different families and full brothers within families. We applied multilevel logistic regression techniques to obtain odds ratios (OR), variance partition coefficients (VPC) and 95% credible intervals (CI). In unrelated individuals, a decile unit increase in low neighborhood income increased individual IHD risk (OR = 1.04, 95% CI: 1.03-1.07). In the intrafamilial analysis this association was reduced (OR = 1.02, 95% CI: 1.02-1.04). Low neighborhood income thus seems associated with IHD risk in middle-aged men. However, despite the family-based design, we cannot exclude residual confounding by genetic and non-shared environmental factors.
Moreover, the low neighborhood-level VPC = 1.5% suggests that the SAMS are a rather inappropriate construct of the socio-geographic context that conditions individual variance in IHD risk. In contrast, the high family-level VPC = 20.1% confirms the relevance of the family context for understanding IHD risk.
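The reported VPCs follow from the standard latent-response formulation for multilevel logistic models, in which the individual-level residual variance is fixed at π²/3. A minimal sketch, using hypothetical random-effect variances chosen only to illustrate the calculation (not the paper's estimates):

```python
import math

def vpc_logistic(*level_variances):
    """Variance partition coefficients for a multilevel logistic model,
    using the latent-response formulation: the individual-level residual
    variance is fixed at pi^2 / 3."""
    total = sum(level_variances) + math.pi ** 2 / 3
    return [v / total for v in level_variances]

# Hypothetical variances for the neighborhood and family random effects.
vpc_neigh, vpc_family = vpc_logistic(0.065, 0.87)
print(f"neighborhood VPC = {vpc_neigh:.1%}, family VPC = {vpc_family:.1%}")
```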
Malinowski, M L; Beling, P A; Haimes, Y Y; LaViers, A; Marvel, J A; Weiss, B A
2015-01-01
The fields of risk analysis and prognostics and health management (PHM) have developed in a largely independent fashion. However, both fields share a common core goal. They aspire to manage future adverse consequences associated with prospective dysfunctions of the systems under consideration due to internal or external forces. This paper describes how two prominent risk analysis theories and methodologies - Hierarchical Holographic Modeling (HHM) and Risk Filtering, Ranking, and Management (RFRM) - can be adapted to support the design of PHM systems in the context of smart manufacturing processes. Specifically, the proposed methodologies will be used to identify targets - components, subsystems, or systems - that would most benefit from a PHM system with regard to achieving the following objectives: minimizing cost, minimizing production/maintenance time, maximizing system remaining usable life (RUL), maximizing product quality, and maximizing product output. HHM is a comprehensive modeling theory and methodology that is grounded on the premise that no system can be modeled effectively from a single perspective. It can also be used as an inductive method for scenario structuring to identify emergent forced changes (EFCs) in a system. EFCs connote trends in external or internal sources of risk to a system that may adversely affect specific states of the system. An important aspect of proactive risk management includes bolstering the resilience of the system for specific EFCs by appropriately controlling the states. Risk scenarios for specific EFCs can be the basis for the design of prognostic and diagnostic systems that provide real-time predictions and recognition of scenario changes. The HHM methodology includes visual modeling techniques that can enhance stakeholders' understanding of shared states, resources, objectives and constraints among the interdependent and interconnected subsystems of smart manufacturing systems.
In risk analysis, HHM is often paired with Risk Filtering, Ranking, and Management (RFRM). The RFRM process provides users (e.g., technology developers, original equipment manufacturers (OEMs), technology integrators, manufacturers) with the most critical risks to the objectives, which can be used to identify the most critical components and subsystems that would most benefit from a PHM system. A case study is presented in which HHM and RFRM are adapted for PHM in the context of an active manufacturing facility located in the United States. The methodologies help to identify the critical risks to the manufacturing process, and the major components and subsystems that would most benefit from a developed PHM system.
Isahak, Anizan; Siwar, Chamhuri; Ismail, Shaharuddin M.; Hanafi, Zulkifli; Zainuddin, Mohd S.
2018-01-01
Shelter centres are important locations to safeguard people from helpless situations and are an integral part of disaster risk reduction (DRR), particularly for flood DRR. The establishment of shelter centres, and their design based on scientific assessment, is crucial; yet both are closely tied to the geographic location, socio-economic conditions and livelihoods of the affected communities. Many parts of the developing world still lag behind in ensuring such scientific design. Considering the flood disaster in 2014 that affected residents living along the Pahang River Basin, in this study we delineate the communities at risk and evaluate the existing shelter centres to determine how they reduce people's vulnerability to the risks associated with rural and urban landscapes. We used spatial analysis tools to delineate risk zones and to evaluate existing evacuation systems. A flood disaster risk map was produced to determine which communities are living with risk. Subsequently, the distribution of shelter centres was examined to determine whether they can support people living in the flood risk zones. These centres were also evaluated against a set of international guidelines for effective disaster shelters. The evaluation reveals that the number of shelter centres is inadequate and that their designation and design are not being carried out scientifically. The maps produced here have considerable potential to support disaster management decisions, in particular site selection and the prioritisation of centres. The study concludes with a set of guidelines and recommendations for structural and non-structural measures, such as alternative livelihoods and the potential of ecotourism, which may improve resilience among flood-affected communities and the decision-making process for the overall flood DRR initiatives.
Mediation Analysis of an Adolescent HIV/STI/Pregnancy Prevention Intervention
ERIC Educational Resources Information Center
Glassman, Jill R.; Franks, Heather M.; Baumler, Elizabeth R.; Coyle, Karin K.
2014-01-01
Most interventions designed to prevent HIV/STI/pregnancy risk behaviours in young people have multiple components based on psychosocial theories (e.g. social cognitive theory) dictating sets of mediating variables to influence to achieve desired changes in behaviours. Mediation analysis is a method for investigating the extent to which a variable…
NASA Astrophysics Data System (ADS)
Gómez, Wilmar
2017-04-01
By analyzing the spatial and temporal variability of extreme precipitation events we can prevent or reduce threats and risks. Many water resources projects require joint probability distributions of random variables such as precipitation intensity and duration, which may not be independent of each other. The problem of defining a probability model for observations of several dependent variables is greatly simplified by expressing the joint distribution in terms of the marginals using copulas. This paper presents a general framework for bivariate and multivariate frequency analysis of extreme hydroclimatological events, such as severe storms, using Archimedean copulas. The analysis was conducted for precipitation events in the lower Tunjuelo River basin in Colombia. The results show that, for a joint study of intensity-duration-frequency, IDF curves can be obtained through copulas, yielding more accurate and reliable information for design storms and the associated risks. The use of copulas greatly simplifies the study of multivariate distributions and introduces the concept of joint return period, which properly represents the requirements of hydrological design in frequency analysis.
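A joint return period of the kind described can be sketched with the Gumbel copula, a common Archimedean family. The marginal quantiles and the dependence parameter θ below are illustrative, not values fitted to the Tunjuelo data:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel (Archimedean) copula C(u, v); theta >= 1, theta = 1 is independence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_period_and(u, v, theta, mu=1.0):
    """'AND' joint return period: mean number of inter-arrival periods (mu)
    until intensity AND duration both exceed their u-, v-quantiles."""
    return mu / (1.0 - u - v + gumbel_copula(u, v, theta))

# Marginal non-exceedance probabilities of a design storm's intensity
# and duration, with moderate positive dependence (illustrative values).
u, v, theta = 0.99, 0.99, 2.5
print(round(joint_return_period_and(u, v, theta), 1))
```

Positive dependence (θ > 1) makes joint exceedance more likely than under independence, so the "AND" return period shortens — the kind of effect that matters when sizing hydraulic works from IDF curves.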
Turbine Design and Analysis for the J-2X Engine Turbopumps
NASA Technical Reports Server (NTRS)
Marcu, Bogdan; Tran, Ken; Dorney, Daniel J.; Schmauch, Preston
2008-01-01
Pratt & Whitney Rocketdyne and NASA Marshall Space Flight Center are developing the advanced upper stage J-2X engine based on the legacy design of the J-2/J-2S family of engines which powered the Apollo missions. The cryogenic propellant turbopumps have been denoted as Mark72-F and Mark72-O for the fuel and oxidizer side, respectively. Special attention is focused on preserving the essential flight-proven design features while adapting the design to the new turbopump configuration. Advanced 3-D CFD analysis has been employed to verify turbine aero performance at current flow regime boundary conditions and to mitigate risks associated with stresses. A limited amount of redesign and overall configuration modifications allow for a robust design with performance level matching or exceeding requirements.
Risk-Significant Adverse Condition Awareness Strengthens Assurance of Fault Management Systems
NASA Technical Reports Server (NTRS)
Fitz, Rhonda
2017-01-01
As spaceflight systems increase in complexity, Fault Management (FM) systems are ranked high in risk-based assessment of software criticality, emphasizing the importance of establishing highly competent domain expertise to provide assurance. Adverse conditions (ACs) and specific vulnerabilities encountered by safety- and mission-critical software systems have been identified through efforts to reduce the risk posture of software-intensive NASA missions. Acknowledgement of potential off-nominal conditions and analysis to determine software system resiliency are important aspects of hazard analysis and FM. A key component of assuring FM is an assessment of how well software addresses susceptibility to failure through consideration of ACs. Focus on significant risk predicted through experienced analysis conducted at the NASA Independent Verification & Validation (IV&V) Program enables the scoping of effective assurance strategies with regard to overall asset protection of complex spaceflight as well as ground systems. Research efforts sponsored by NASA's Office of Safety and Mission Assurance (OSMA) defined terminology, categorized data fields, and designed a baseline repository that centralizes and compiles a comprehensive listing of ACs and correlated data relevant across many NASA missions. This prototype tool helps projects improve analysis by tracking ACs and allowing queries based on project, mission type, domain/component, causal fault, and other key characteristics. Vulnerability in off-nominal situations, architectural design weaknesses, and unexpected or undesirable system behaviors in reaction to faults are curtailed with the awareness of ACs and risk-significant scenarios modeled for analysts through this database. Integration within the Enterprise Architecture at NASA IV&V enables interfacing with other tools and datasets, technical support, and accessibility across the Agency.
This paper discusses the development of an improved workflow process utilizing this database for adaptive, risk-informed FM assurance that critical software systems will safely and securely protect against faults and respond to ACs in order to achieve successful missions.
Yao, Baodong; Yan, Yujie; Ye, Xianwu; Fang, Hong; Xu, Huilin; Liu, Yinan; Li, Sheran; Zhao, Yanping
2014-12-01
Observational studies suggest an association between fruit and vegetables intake and risk of bladder cancer, but the results are controversial. We therefore summarized the evidence from observational studies in categorical, linear, and nonlinear, dose-response meta-analysis. Pertinent studies were identified by searching EMBASE and PubMed from their inception to August 2013. Thirty-one observational studies involving 12,610 cases and 1,121,649 participants were included. The combined rate ratio (RR, 95 % CI) of bladder cancer for the highest versus lowest intake was 0.83 (0.69-0.99) for total fruit and vegetables, 0.81 (0.70-0.93) for total vegetables, 0.77 (0.69-0.87) for total fruit, 0.84 (0.77-0.91) for cruciferous vegetables, 0.79 (0.68-0.91) for citrus fruits, and 0.74 (0.66-0.84) for yellow-orange vegetables. Subgroup analysis showed study design and gender as possible sources of heterogeneity. A nonlinear relationship was found of citrus fruits intake with risk of bladder cancer (P for nonlinearity = 0.018), and the RRs (95 % CI) of bladder cancer were 0.87 (0.78-0.96), 0.80 (0.67-0.94), 0.79 (0.66-0.94), 0.79 (0.65-0.96), and 0.79 (0.64-0.99) for 30, 60, 90, 120, and 150 g/day. A nonlinear relationship was also found of yellow-orange vegetable intake with risk of bladder cancer risk (P for nonlinearity = 0.033). Some evidence of publication bias was observed for fruit, citrus fruits, and yellow-orange vegetables. This meta-analysis supports the hypothesis that intakes of fruit and vegetables may reduce the risk of bladder cancer. Future well-designed studies are required to confirm this finding.
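Random-effects pooling of study-level relative risks, as used in this meta-analysis, is commonly done with the DerSimonian-Laird estimator. A minimal sketch on hypothetical study estimates (log RRs and standard errors, not the paper's data):

```python
import math

def dersimonian_laird(log_rr, se):
    """Pool study-level log relative risks with DerSimonian-Laird
    random-effects weights; returns pooled RR and its 95% CI."""
    w = [1.0 / s ** 2 for s in se]                        # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                         # between-study variance
    w_re = [1.0 / (s ** 2 + tau2) for s in se]            # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Hypothetical study estimates, for illustration only.
rr, lo, hi = dersimonian_laird([-0.22, -0.11, -0.35, 0.05],
                               [0.10, 0.08, 0.15, 0.12])
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```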
Sanvido, Olivier; Widmer, Franco; Winzeler, Michael; Bigler, Franz
2005-01-01
Genetically modified plants (GMPs) may soon be cultivated commercially in several member countries of the European Union (EU). According to EU Directive 2001/18/EC, post-market monitoring (PMM) for commercial GMP cultivation must be implemented, in order to detect and prevent adverse effects on human health and the environment. However, no general PMM strategies for GMP cultivation have been established so far. We present a conceptual framework for the design of environmental PMM for GMP cultivation based on current EU legislation and common risk analysis procedures. We have established a comprehensive structure of the GMP approval process, consisting of pre-market risk assessment (PMRA) as well as PMM. Both programs can be distinguished conceptually due to principles inherent to risk analysis procedures. The design of PMM programs should take into account the knowledge gained during approval for commercialization of a specific GMP and the decisions made in the environmental risk assessments (ERAs). PMM is composed of case-specific monitoring (CSM) and general surveillance. CSM focuses on anticipated effects of a specific GMP. Selection of case-specific indicators for detection of ecological exposure and effects, as well as definition of effect sizes, are important for CSM. General surveillance is designed to detect unanticipated effects on general safeguard subjects, such as natural resources, which must not be adversely affected by human activities like GMP cultivation. We have identified clear conceptual differences between CSM and general surveillance, and propose to adopt separate frameworks when developing either of the two programs. Common to both programs is the need to put a value on possible ecological effects of GMP cultivation. The structure of PMM presented here will be of assistance to industry, researchers, and regulators, when assessing GMPs during commercialization.
Griffiths, A; Cox, T; Karanika, M; Khan, S; Tomás, J M
2006-10-01
To examine the factor structure, reliability, and validity of a new context-specific questionnaire for the assessment of work and organisational factors. The Work Organisation Assessment Questionnaire (WOAQ) was developed as part of a risk assessment and risk reduction methodology for hazards inherent in the design and management of work in the manufacturing sector. Two studies were conducted. Data were collected from 524 white- and blue-collar employees from a range of manufacturing companies. Exploratory factor analysis was carried out on 28 items that described the most commonly reported failures of work design and management in companies in the manufacturing sector. Concurrent validity data were also collected. A reliability study was conducted with a further 156 employees. Principal component analysis, with varimax rotation, revealed a strong 28-item, five-factor structure. The factors were named: quality of relationships with management, reward and recognition, workload, quality of relationships with colleagues, and quality of physical environment. Analyses also revealed a more general summative factor. Results indicated that the questionnaire has good internal consistency, test-retest reliability, and validity. Being associated with poor employee health and changes in health-related behaviour, the WOAQ factors are possible hazards. It is argued that the strength of those associations offers some estimation of risk. Feedback from the organisations involved indicated that the WOAQ was easy to use and meaningful for them as part of their risk assessment procedures. The studies reported here describe a model of the hazards to employee health and health-related behaviour inherent in the design and management of work in the manufacturing sector. It offers an instrument for their assessment. The scales derived, which form the WOAQ, were shown to be reliable, valid, and meaningful to the user population.
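The varimax rotation step used in the PCA above can be sketched with Kaiser's standard SVD-based algorithm. This is a generic textbook implementation, not the authors' code, and the toy loading matrix is illustrative rather than WOAQ data:

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-8):
    """Varimax rotation of a p x k loading matrix (Kaiser's criterion),
    iterated via SVD; returns the rotated loadings."""
    p, k = loadings.shape
    rotation = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        b = loadings.T @ (rotated ** 3
                          - (gamma / p) * rotated @ np.diag(np.sum(rotated ** 2, axis=0)))
        u, s, vt = np.linalg.svd(b)
        rotation = u @ vt                 # orthogonal rotation matrix
        var_new = np.sum(s)
        if var_new < var_old * (1 + tol):
            break
        var_old = var_new
    return loadings @ rotation

# Toy 6-item, 2-factor loading matrix (illustrative only).
rng = np.random.default_rng(0)
raw = rng.normal(size=(6, 2))
rotated = varimax(raw)
print(np.round(rotated, 2))
```

Because the rotation is orthogonal, each item's communality (row sum of squared loadings) is unchanged; only the loading pattern is simplified.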
Malaria Disease Mapping in Malaysia based on Besag-York-Mollie (BYM) Model
NASA Astrophysics Data System (ADS)
Azah Samat, Nor; Mey, Liew Wan
2017-09-01
Disease mapping is the visual representation of the geographical distribution of disease, giving an overview of its incidence within a population through spatial epidemiological data. The resulting maps help in monitoring and planning resource needs at all levels of health care and in designing appropriate interventions, tailored towards areas that deserve closer scrutiny or communities that warrant further investigation to identify important risk factors. The choice of statistical model used for relative risk estimation is therefore important, because the disease risk map depends on the model used. This paper proposes the Besag-York-Mollie (BYM) model to estimate the relative risk of Malaria in Malaysia. The analysis uses Malaria case counts obtained from the Ministry of Health Malaysia. The outcomes are displayed through graphs and maps, including a Malaria disease risk map constructed from the estimated relative risks. The distribution of high- and low-risk areas of Malaria occurrence for all states in Malaysia can be identified from the risk map.
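Before any BYM smoothing, the raw area-level relative risk is typically the standardized morbidity ratio (SMR = observed / expected cases), which the BYM model then refines with spatially structured and unstructured random effects. A minimal SMR sketch with hypothetical counts (not Ministry of Health data):

```python
# Hypothetical malaria cases and populations for three areas.
observed = {"A": 25, "B": 4, "C": 60}
population = {"A": 10000, "B": 8000, "C": 15000}

# Expected cases under the overall rate; SMR > 1 flags elevated risk.
overall_rate = sum(observed.values()) / sum(population.values())
expected = {k: overall_rate * population[k] for k in population}
smr = {k: observed[k] / expected[k] for k in observed}

for area, rr in sorted(smr.items()):
    label = "high risk" if rr > 1 else "low risk"
    print(f"area {area}: SMR = {rr:.2f} ({label})")
```

SMRs for small populations are noisy, which is precisely why hierarchical models such as BYM are preferred for the final risk map.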
Deng, Xinyang; Jiang, Wen
2017-09-12
Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and an existing method to show the effectiveness of the proposed model.
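The traditional RPN scoring that the fuzzy approach aims to improve can be sketched as follows: RPN = severity × occurrence × detection, each rated on a 1-10 scale. The failure modes and ratings below are hypothetical.

```python
# Hypothetical failure modes with (severity, occurrence, detection) ratings.
failure_modes = {
    "seal leakage": (8, 4, 6),
    "sensor drift": (5, 6, 3),
    "weld crack":   (9, 2, 7),
}

# Classical risk priority number: higher RPN means higher priority.
rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
for name in sorted(rpn, key=rpn.get, reverse=True):
    print(f"{name}: RPN = {rpn[name]}")
```

One well-known shortcoming this illustrates: different (S, O, D) triples can yield identical RPNs despite very different risk profiles, which motivates fuzzy alternatives such as the D-numbers model above.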
Madsen, Ida E H; Hannerz, Harald; Nyberg, Solja T; Magnusson Hanson, Linda L; Ahola, Kirsi; Alfredsson, Lars; Batty, G David; Bjorner, Jakob B; Borritz, Marianne; Burr, Hermann; Dragano, Nico; Ferrie, Jane E; Hamer, Mark; Jokela, Markus; Knutsson, Anders; Koskenvuo, Markku; Koskinen, Aki; Leineweber, Constanze; Nielsen, Martin L; Nordin, Maria; Oksanen, Tuula; Pejtersen, Jan H; Pentti, Jaana; Salo, Paula; Singh-Manoux, Archana; Suominen, Sakari; Theorell, Töres; Toppinen-Tanner, Salla; Vahtera, Jussi; Väänänen, Ari; Westerholm, Peter J M; Westerlund, Hugo; Fransson, Eleonor; Heikkilä, Katriina; Virtanen, Marianna; Rugulies, Reiner; Kivimäki, Mika
2013-01-01
Previous studies have shown that gainfully employed individuals with high work demands and low control at work (denoted "job strain") are at increased risk of common mental disorders, including depression. Most existing studies have, however, measured depression using self-rated symptom scales that do not necessarily correspond to clinically diagnosed depression. In addition, a meta-analysis from 2008 indicated publication bias in the field. This study protocol describes the planned design and analyses of an individual participant data meta-analysis, to examine whether job strain is associated with an increased risk of clinically diagnosed unipolar depression based on hospital treatment registers. The study will be based on data from approximately 120,000 individuals who participated in 14 studies on work environment and health in 4 European countries. The self-reported working conditions data will be merged with national registers on psychiatric hospital treatment, primarily hospital admissions. Study-specific risk estimates for the association between job strain and depression will be calculated using Cox regressions. The study-specific risk estimates will be pooled using random effects meta-analysis. The planned analyses will help clarify whether job strain is associated with an increased risk of clinically diagnosed unipolar depression. As the analysis is based on pre-planned study protocols and an individual participant data meta-analysis, the pooled risk estimates will not be influenced by selective reporting and publication bias. However, the results of the planned study may only pertain to severe cases of unipolar depression, because of the outcome measure applied.
Exploring association between statin use and breast cancer risk: an updated meta-analysis.
Islam, Md Mohaimenul; Yang, Hsuan-Chia; Nguyen, Phung-Anh; Poly, Tahmina Nasrin; Huang, Chih-Wei; Kekade, Shwetambara; Khalfan, Abdulwahed Mohammed; Debnath, Tonmoy; Li, Yu-Chuan Jack; Abdul, Shabbir Syed
2017-12-01
The benefits of statin treatment for preventing cardiac disease are well established. However, preclinical studies suggested that statins may influence mammary cancer growth, but the clinical evidence is still inconsistent. We, therefore, performed an updated meta-analysis to provide a precise estimate of the risk of breast cancer in individuals undergoing statin therapy. For this meta-analysis, we searched PubMed, the Cochrane Library, Web of Science, Embase, and CINAHL for published studies up to January 31, 2017. Articles were included if they (1) were published in English; (2) had an observational study design with individual-level exposure and outcome data, examined the effect of statin therapy, and reported the incidence of breast cancer; and (3) reported estimates of either the relative risk, odds ratios, or hazard ratios with 95% confidence intervals (CIs). We used random-effect models to pool the estimates. Of 2754 unique abstracts, 39 were selected for full-text review, and 36 studies reporting on 121,399 patients met all inclusion criteria. The overall pooled risk of breast cancer in patients using statins was 0.94 (95% CI 0.86-1.03) in random-effect models, with significant heterogeneity between estimates (I² = 83.79%, p = 0.0001). We also stratified by region, the duration of statin therapy, methodological design, statin properties, and individual statin use. Our results suggest that there is no association between statin use and breast cancer risk. However, observational studies cannot clarify whether the observed epidemiologic association is a causal effect or the result of some unmeasured confounding variable. Therefore, more research is needed.
NASA Astrophysics Data System (ADS)
Oza, Amit R.
The focus of this study is to improve R&D effectiveness towards aerospace and defense planning in the early stages of the product development lifecycle. Emphasis is on: correct formulation of a decision problem, with special attention to account for data relationships between the individual design problem and the system capability required to size the aircraft; understanding of the meaning of the acquisition strategy objective and subjective data requirements that are required to arrive at a balanced analysis and/or "correct" mix of technology projects; understanding the meaning of the outputs that can be created from the technology analysis; and methods the researcher can use to effectively support decisions at the acquisition and conceptual design levels through utilization of a research and development portfolio strategy. The primary objectives of this study are to: (1) determine what strategy should be used to initialize conceptual design parametric sizing processes during requirements analysis for the materiel solution analysis stage of the product development lifecycle when utilizing data already constructed in the latter phase when working with a generic database management system synthesis tool integration architecture for aircraft design, and (2) assess how these new data relationships can contribute to innovative decision-making when solving acquisition hardware/technology portfolio problems. As such, an automated composable problem formulation system is developed to consider data interactions for the system architecture that manages acquisition pre-design concept refinement portfolio management and conceptual design parametric sizing requirements. The research includes a way to: • Formalize the data storage and implement the data relationship structure with a system architecture automated through a database management system.
• Allow for composable modeling, in terms of level of hardware abstraction, for the product model, mission model, and operational constraint model data blocks in the pre-design stages. • Allow the product model, mission model, and operational constraint model to be cross-referenced with a generic aircraft synthesis capability to identify disciplinary analysis methods and processes. • Allow for matching, comparison, and balancing of the aircraft hardware portfolio against the associated developmental and technology risk metrics. • Allow for visualization of the technology portfolio decision space. The problem formulation architecture is finally implemented and verified for a generic hypersonic vehicle research demonstrator, where a portfolio of technology hardware is measured for developmental and technology risks, prioritized by the researcher's risk constraints, and the data generated are delivered to a novel aircraft synthesis tool to confirm vehicle feasibility.
Peterson, A Townsend; Moses, Lina M; Bausch, Daniel G
2014-01-01
Lassa fever is a disease that has been reported from sites across West Africa; it is caused by an arenavirus that is hosted by the rodent M. natalensis. Although it is confined to West Africa, and has been documented in detail in some well-studied areas, the details of the distribution of risk of Lassa virus infection remain poorly known at the level of the broader region. In this paper, we explored the effects of certainty of diagnosis, oversampling in well-studied regions, and error balance on the results of mapping exercises. Each of the three factors assessed in this study had clear and consistent influences on model results, overestimating risk in southern, humid zones of West Africa and underestimating risk in drier, more northern areas. The final, adjusted risk map indicates broad risk areas across much of West Africa. Although risk maps are increasingly easy to develop from disease occurrence data and raster data sets summarizing aspects of environments and landscapes, this process is highly sensitive to issues of data quality, sampling design, and design of analysis, each of which has macrogeographic implications and the potential to misrepresent real patterns of risk.
Tao, Da; Zhang, Rui; Qu, Xingda
2017-02-01
The purpose of this study was to explore the role of personality traits and driving experience in the prediction of risky driving behaviors and accident risk in a Chinese population. A convenience sample of drivers (n=511; mean (SD) age=34.2 (8.8) years) completed a self-report questionnaire that was designed based on validated scales for measuring personality traits, risky driving behaviors, and self-reported accident risk. Results from a structural equation modeling analysis demonstrated that the data fit well with our theoretical model. While showing no direct effects on accident risk, personality traits had direct effects on risky driving behaviors and yielded indirect effects on accident risk mediated by risky driving behaviors. Both driving experience and risky driving behaviors directly predicted accident risk and accounted for 15% of its variance. There was little gender difference in personality traits, risky driving behaviors, and accident risk. The findings emphasize the importance of personality traits and driving experience in the understanding of risky driving behaviors and accident risk among Chinese drivers and provide new insight into the design of evidence-based driving education and accident prevention interventions. Copyright © 2016 Elsevier Ltd. All rights reserved.
Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen
2015-11-10
An innovative combination of green chemistry and the quality by design (QbD) approach is presented through the development of a UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated into the field of green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impact and analyst exposure. This analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were further evaluated by means of a screening design. A response surface methodology was then carried out to model the CQAs as functions of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots made it possible to establish the design space (DS), the method operable design region where all CQAs fulfilled the requirements. An experimental validation of the DS proved that quality within the DS was guaranteed; therefore, no further robustness study was required before validation. Finally, this UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Gigante-Barrera, Ángel; Dindar, Serdar; Kaewunruen, Sakdirat; Ruikar, Darshan
2017-10-01
Railway turnouts are complex systems designed using complex geometries and grades, which makes them difficult to manage in terms of risk prevention. This feature poses a substantial peril to rail users, as it is considered a cause of derailment. In addition, derailment leads to financial losses due to operational downtimes and monetary compensation in cases of death or injury. These are fundamental drivers for mitigating risks arising from poor risk management during design. Prevention through design (PtD) is a process that introduces tacit knowledge from industry professionals during the design process. There is evidence that Building Information Modelling (BIM) can help to mitigate risk from the inception of the project. BIM is considered an Information System (IS) in which tacit knowledge can be stored in, and retrieved from, a digital database, making it easy to take prompt decisions because the information is ready to be analysed. BIM at the model element level entails working with 3D elements and embedded data, therefore adding a layer of complexity to the management of information along the different stages of the project and across different disciplines. In order to overcome this problem, the industry has created a framework for model progression specification named Level of Development (LOD). The paper presents an IDM-based framework for design risk mitigation through code validation using the LOD. This effort resulted in risk datasets that describe, graphically and non-graphically, a rail turnout as the model progresses, thus permitting their inclusion within risk information systems. The assignment of an LOD construct to a set of data requires specialised management and process-related expertise. Furthermore, the selection of a set of LOD constructs requires a purpose-based analysis. Therefore, a framework for LOD construct implementation within the IDM for code checking is required for the industry to progress in this particular field.
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
NASA Astrophysics Data System (ADS)
Liu, Hu-Chen; Liu, Long; Li, Ping
2014-10-01
Failure mode and effects analysis (FMEA) has shown its effectiveness in examining potential failures in products, processes, designs, and services and has been extensively used for safety and reliability analysis in a wide range of industries. However, its approach of prioritising failure modes through a crisp risk priority number (RPN) has been criticised as having several shortcomings. The aim of this paper is to develop an efficient and comprehensive risk assessment methodology using an intuitionistic fuzzy hybrid weighted Euclidean distance (IFHWED) operator to overcome these limitations and improve the effectiveness of traditional FMEA. The diversified and uncertain assessments given by FMEA team members are treated as linguistic terms expressed in intuitionistic fuzzy numbers (IFNs). An intuitionistic fuzzy weighted averaging (IFWA) operator is used to aggregate the FMEA team members' individual assessments into a group assessment, and the IFHWED operator is applied thereafter to the prioritisation and selection of failure modes. In particular, both subjective and objective weights of risk factors are considered during the risk evaluation process. Finally, a numerical example of risk assessment is given to illustrate the proposed method.
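The crisp RPN this abstract criticises is simply the product of three ordinal ratings, so failure modes with very different severity/occurrence/detection profiles can tie or rank counterintuitively. A minimal sketch of the traditional scheme (all failure-mode names and ratings below are invented for illustration; the paper's IFHWED operator replaces this crisp product with an intuitionistic-fuzzy weighted distance):

```python
def crisp_rpn(severity, occurrence, detection):
    """Traditional FMEA risk priority number: S x O x D, each rated 1-10."""
    return severity * occurrence * detection

# hypothetical failure modes: (name, severity, occurrence, detection)
modes = [("seal leak", 8, 4, 3), ("sensor drift", 5, 6, 6), ("weld crack", 9, 2, 7)]

# rank by descending RPN -- note a low-severity mode can outrank a
# high-severity one, which is one of the criticised shortcomings
ranked = sorted(modes, key=lambda m: crisp_rpn(*m[1:]), reverse=True)
```

Here "sensor drift" (RPN 180) outranks "weld crack" (RPN 126) despite the latter's higher severity, illustrating why weighted, fuzzy aggregation operators are proposed.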
An overview of safety assessment, regulation, and control of hazardous material use at NREL
NASA Astrophysics Data System (ADS)
Nelson, B. P.; Crandall, R. S.; Moskowitz, P. D.; Fthenakis, V. M.
1992-12-01
This paper summarizes the methodology we use to ensure the safe use of hazardous materials at the National Renewable Energy Laboratory (NREL). First, we analyze the processes and the materials used in those processes to identify the hazards presented. Then we study federal, state, and local regulations and apply the relevant requirements to our operations. When necessary, we generate internal safety documents to consolidate this information. We design research operations and support systems to conform to these requirements. Before we construct the systems, we perform a semiquantitative risk analysis on likely accident scenarios. All scenarios presenting an unacceptable risk require system or procedural modifications to reduce the risk. Following these modifications, we repeat the risk analysis to ensure that the respective accident scenarios present an acceptable risk. Once all risks are acceptable, we conduct an operational readiness review (ORR). A management-appointed panel performs the ORR ensuring compliance with all relevant requirements. After successful completion of the ORR, operations can begin.
A qualitative descriptive study of risk reduction for coronary disease among the Hong Kong Chinese.
Chan, Choi Wan; Lopez, Violeta
2014-01-01
Achieving optimal control and reduction of coronary heart disease (CHD) risks in Hong Kong (HK) remains a significant challenge and requires exploration. This article addresses the ability to reduce CHD risks among the HK Chinese. Through secondary analysis, a qualitative descriptive design using focus group interviews and content analysis was adopted. Older and younger adults were invited for the study. An interview schedule was used to guide discussions during the focus group interviews. Four categories emerged from the data: planning of health actions, control of risk-reducing behavior, perceived opportunities for understanding CHD, and chest pain appraisal. Local culture and population needs play a central role in disease perception and prevention. The findings are essential for targeting strategies that initiate health actions for younger adults and for establishing public education resources that underscore understanding of CHD risk, symptom recognition, and disease management, particularly among middle-aged and older people at high risk and among the diseased populations. © 2013 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Ilo, Cajetan I.; Onwunaka, Chinagorom; Nwimo, Ignatius O.
2015-01-01
This descriptive survey was carried out in order to determine the personal health risks behaviour profile among university students in the south east of Nigeria. A random sample of 900 students completed the questionnaire designed for the study. Out of this number 821, representing about 91.2% return rate, were used for data analysis. Means and…
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks among all three components. Strategic analysis supports strategic decision-making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula
NASA Astrophysics Data System (ADS)
Sarhadi, Ali; Burn, Donald H.; Concepción Ausín, María.; Wiper, Michael P.
2016-03-01
A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive Gibbs Markov Chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters, and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. The results demonstrate that the nature and the risk of extreme-climate multidimensional processes change over time under the impact of climate change, and accordingly long-term decision-making strategies should be updated based on the anomalies of the nonstationary environment.
Risk and Vulnerability Analysis of Satellites Due to MM/SD with PIRAT
NASA Astrophysics Data System (ADS)
Kempf, Scott; Schäfer, Frank; Rudolph, Martin; Welty, Nathan; Donath, Therese; Destefanis, Roberto; Grassi, Lilith; Janovsky, Rolf; Evans, Leanne; Winterboer, Arne
2013-08-01
Until recently, the state-of-the-art assessment of the threat posed to spacecraft by micrometeoroids and space debris was limited to the application of ballistic limit equations to the outer hull of a spacecraft. The probability of no penetration (PNP) is acceptable for assessing the risk and vulnerability of manned space missions; however, for unmanned missions, where penetrations of the spacecraft exterior do not necessarily constitute satellite or mission failure, these values are overly conservative. The software tool PIRAT (Particle Impact Risk and Vulnerability Analysis Tool) has been newly developed based on the Schäfer-Ryan-Lambert (SRL) triple-wall ballistic limit equation (BLE), which is applicable to various satellite components. As a result, it has become possible to assess the individual failure rates of satellite components. This paper demonstrates the modeling of an example satellite, the performance of a PIRAT analysis, and the potential for subsequent design optimization with respect to micrometeoroid and space debris (MM/SD) impact risk.
Cozzi, Gabriele; Musi, Gennaro; Bianchi, Roberto; Bottero, Danilo; Brescia, Antonio; Cioffi, Antonio; Cordima, Giovanni; Delor, Maurizio; Di Trapani, Ettore; Ferro, Matteo; Matei, Deliu Victor; Russo, Andrea; Mistretta, Francesco Alessandro; De Cobelli, Ottavio
2017-01-01
Background: The aim of this study was to compare oncologic outcomes of radical prostatectomy (RP) with brachytherapy (BT). Methods: A literature review was conducted according to the 'Preferred reporting items for systematic reviews and meta-analyses' (PRISMA) statement. We included studies reporting comparative oncologic outcomes of RP versus BT for localized prostate cancer (PCa). From each comparative study, we extracted the study design, the number and features of the included patients, and the oncologic outcomes expressed as all-cause mortality (ACM), PCa-specific mortality (PCSM) or, when the former were unavailable, as biochemical recurrence (BCR). All of the data retrieved from the selected studies were recorded in an electronic database. Cumulative analysis was conducted using the Review Manager version 5.3 software, designed for composing Cochrane Reviews (Cochrane Collaboration, Oxford, UK). Statistical heterogeneity was tested using the Chi-square test. Results: Our cumulative analysis did not show any significant difference in terms of BCR, ACM or PCSM rates between the RP and BT cohorts. Only three studies reported risk-stratified outcomes of intermediate- and high-risk patients, which are the most prone to treatment failure. Conclusions: Our analysis suggested that RP and BT may have similar oncologic outcomes. However, the analysis included a limited number of studies, and most of them were retrospective, making it impossible to derive any definitive conclusion, especially for intermediate- and high-risk patients. In this scenario, appropriate urologic counseling remains of utmost importance. PMID:29662542
Lin, Zi-Jing; Li, Lin; Cazzell, Mary; Liu, Hanli
2014-08-01
Diffuse optical tomography (DOT) is a variant of functional near infrared spectroscopy (fNIRS) and has the capability of mapping or reconstructing three-dimensional (3D) hemodynamic changes due to brain activity. Common methods used in DOT image analysis to define brain activation have limitations because the selection of the activation period is relatively subjective. General linear model (GLM)-based analysis can overcome this limitation. In this study, we combine atlas-guided 3D DOT image reconstruction with GLM-based analysis (i.e., voxel-wise GLM analysis) to investigate the brain activity that is associated with risk decision-making processes. Risk decision-making is an important cognitive process and thus an essential topic in the field of neuroscience. The Balloon Analog Risk Task (BART) is a valid experimental model and has been commonly used to assess human risk-taking actions and tendencies while facing risks. We used the BART paradigm with a blocked design to investigate brain activations in the prefrontal and frontal cortical areas during decision-making in 37 human participants (22 males and 15 females). Voxel-wise GLM analysis was performed after a human brain atlas template and a depth compensation algorithm were combined to form atlas-guided DOT images. In this work, we demonstrate the utility of voxel-wise GLM analysis with DOT for imaging and studying cognitive functions in response to risk decision-making. Results show significant hemodynamic changes in the dorsolateral prefrontal cortex (DLPFC) during the active-choice mode and a different activation pattern between genders; these findings correlate well with published functional magnetic resonance imaging (fMRI) and fNIRS studies. Copyright © 2014 The Authors. Human Brain Mapping Published by Wiley Periodicals, Inc.
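A voxel-wise GLM of the kind described reduces to fitting the same design matrix to every voxel's time series and testing one regressor of interest. The following is a generic least-squares sketch, not the authors' pipeline; the function name, array shapes, and contrast convention are assumptions.

```python
import numpy as np

def voxelwise_glm_tstats(data, design, col=0):
    """Voxel-wise GLM: fit data = design @ beta + noise independently for
    every voxel and return the t-statistic of one regressor per voxel.

    data:   (n_timepoints, n_voxels) hemodynamic time series
    design: (n_timepoints, n_regressors) design matrix (task + confounds)
    col:    index of the regressor of interest
    """
    beta, _, _, _ = np.linalg.lstsq(design, data, rcond=None)
    resid = data - design @ beta
    dof = design.shape[0] - np.linalg.matrix_rank(design)
    sigma2 = (resid**2).sum(axis=0) / dof  # per-voxel noise variance
    # variance factor of the contrast c'beta, with c selecting one column
    c = np.zeros(design.shape[1])
    c[col] = 1.0
    var_c = c @ np.linalg.pinv(design.T @ design) @ c
    return (c @ beta) / np.sqrt(sigma2 * var_c)
```

In a blocked design, the task regressor would be a boxcar (optionally convolved with a hemodynamic response function); voxels whose time series follow the boxcar yield large t-statistics.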
Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F
2010-01-01
The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. Integrated Quality Risk Management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis and Critical Control Point technique, recognized as the most appropriate for identifying risks associated with physical, chemical, and biological hazards in cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process performed at our blood center. The data analysis showed that the hazards with the highest RPN values and the greatest impact on the process were loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while the other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to comply with the standards in force and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
Serotonin reuptake inhibitors in pregnancy and the risk of major malformations: a systematic review.
Bellantuono, Cesario; Migliarese, Giovanni; Gentile, Salvatore
2007-04-01
To review studies conducted to establish the risk of major congenital malformations in women exposed to serotonin reuptake inhibitors (SRIs) during the first trimester of pregnancy. A literature search was conducted within the PsycINFO, EMBASE, MEDLINE, and Cochrane databases from 1966 to October 2006 to identify studies assessing the risk of major malformations in infants whose mothers were taking SRIs (SSRIs and SNRIs) during the first trimester of pregnancy. Fifteen studies were selected for the analysis: seven adopted a prospective cohort design and seven a retrospective design, of which one was a case-control study. The reviewed studies suggest that exposure to fluoxetine, sertraline, citalopram and venlafaxine in early pregnancy is not associated with an increased risk of major congenital malformations. For paroxetine, recent data call for caution in prescribing such a drug in early pregnancy. For the other SRIs, the risk remains substantially undetermined, as data are so far scanty. Given this background, large prospective cohort studies are urgently needed to better assess the risk/benefit ratio of SRI treatment during pregnancy. Copyright 2007 John Wiley & Sons, Ltd.
Identifying Items to Assess Methodological Quality in Physical Therapy Trials: A Factor Analysis
Cummings, Greta G.; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd
2014-01-01
Background: Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. Objective: The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). Design: A methodological research design was used, and an EFA was performed. Methods: Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Results: Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Limitations: Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. Conclusions: To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials.
Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items. PMID:24786942
Risk analysis for autonomous underwater vehicle operations in extreme environments.
Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter
2010-12-01
Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009. © 2010 Society for Risk Analysis.
Population heterogeneity in the salience of multiple risk factors for adolescent delinquency.
Lanza, Stephanie T; Cooper, Brittany R; Bray, Bethany C
2014-03-01
To present mixture regression analysis as an alternative to more standard regression analysis for predicting adolescent delinquency. We demonstrate how mixture regression analysis allows for the identification of population subgroups defined by the salience of multiple risk factors. We identified population subgroups (i.e., latent classes) of individuals based on their coefficients in a regression model predicting adolescent delinquency from eight previously established risk indices drawn from the community, school, family, peer, and individual levels. The study included N = 37,763 10th-grade adolescents who participated in the Communities That Care Youth Survey. Standard, zero-inflated, and mixture Poisson and negative binomial regression models were considered. Standard and mixture negative binomial regression models were selected as optimal. The five-class regression model was interpreted based on the class-specific regression coefficients, indicating that risk factors had varying salience across classes of adolescents. Standard regression showed that all risk factors were significantly associated with delinquency. Mixture regression provided more nuanced information, suggesting a unique set of risk factors that were salient for different subgroups of adolescents. Implications for the design of subgroup-specific interventions are discussed. Copyright © 2014 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
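Mixture regression assigns each observation to a latent class with its own coefficients, estimated jointly via EM. Stripped of covariates, the underlying idea reduces to fitting a finite Poisson mixture; the sketch below is that simplified, covariate-free version (deterministic initialisation, invented count data), not the negative binomial mixture regression models fitted in the study.

```python
import math

def poisson_mixture_em(counts, iters=100):
    """EM for a two-component Poisson mixture -- a covariate-free, simplified
    sketch of the latent-class idea behind mixture Poisson regression."""
    # deterministic initialisation: one component below the mean, one at the max
    lam = [0.5 * sum(counts) / len(counts), float(max(counts))]
    pi = [0.5, 0.5]

    def logpmf(y, l):
        return -l + y * math.log(l) - math.lgamma(y + 1)

    for _ in range(iters):
        # E-step: responsibility of each component for each count
        resp = []
        for y in counts:
            w = [p * math.exp(logpmf(y, l)) for p, l in zip(pi, lam)]
            tot = sum(w)
            resp.append([wi / tot for wi in w])
        # M-step: update mixing weights and component rates
        for j in range(2):
            nj = sum(r[j] for r in resp)
            pi[j] = nj / len(counts)
            lam[j] = max(sum(r[j] * y for r, y in zip(resp, counts)) / nj, 1e-9)
    return pi, lam
```

In full mixture regression, each component's constant rate λ_j is replaced by a class-specific regression exp(x'β_j), so the "salience" of each risk factor can differ across latent classes.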
NASA Technical Reports Server (NTRS)
1973-01-01
Results of the design and manufacturing reviews on the maturity of the Skylab modules are presented along with results of investigations on the scope of the cluster risk assessment efforts. The technical management system and its capability to assess and resolve problems are studied.
Operations Analysis of the 2nd Generation Reusable Launch Vehicle
NASA Technical Reports Server (NTRS)
Noneman, Steven R.; Smith, C. A. (Technical Monitor)
2002-01-01
The Space Launch Initiative (SLI) program is developing a second-generation reusable launch vehicle. The program goals include lowering the risk of loss of crew to 1 in 10,000 and reducing annual operations cost to one third of the cost of the Space Shuttle. The SLI missions include NASA, military, and commercial satellite launches as well as crew and cargo launches to the space station. The SLI operations analyses provide an assessment of the operational support and infrastructure needed to operate candidate system architectures. Measures of operability (e.g., system dependability, responsiveness, and efficiency) are estimated. Operations analysis is used to determine the impact of specific technologies on operations. A conceptual path to reducing annual operations costs by two thirds is based on key design characteristics, such as reusability, and on improved processes that lower labor costs. New operations risks can be expected to emerge; they can be mitigated through effective risk management, with careful identification, assignment, tracking, and closure. SLI design characteristics such as nearly full reusability, high reliability, advanced automation, and lowered maintenance and servicing, coupled with improved processes, are contributors to operability and large operating cost reductions.
Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program
NASA Technical Reports Server (NTRS)
Ryan, Shannon
2013-01-01
This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of no penetration (PNP) of the spacecraft pressure hull. Calculating the risk requires spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for the installed shielding configurations. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, such tools are unsuitable for use in shield design and preliminary analysis studies. This software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of its validity range and guidelines for its application. Recommendations are based on preliminary reviews of fundamental assumptions and on accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that collects user inputs and provides solutions directly in Microsoft Excel workbooks.
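The PNP formulation referenced above is commonly expressed with a Poisson impact model: if N is the expected number of penetrating impacts over a mission, then PNP = exp(-N). The sketch below illustrates only that standard relationship; the flux, area, and duration values are hypothetical and this is not BUMPERII's computation or this software's ballistic limit equations.

```python
import math

def probability_of_no_penetration(flux_per_m2_year, area_m2, years):
    """Poisson model: PNP = exp(-N), where N is the expected number of
    penetrating impacts = (penetrating flux) x (exposed area) x (time)."""
    n_expected = flux_per_m2_year * area_m2 * years
    return math.exp(-n_expected)

# hypothetical inputs: penetrating flux, exposed hull area, mission duration
pnp = probability_of_no_penetration(flux_per_m2_year=1e-5, area_m2=50.0, years=10.0)
print(f"PNP = {pnp:.4f}, penetration risk = {1 - pnp:.4f}")
```

The ballistic limit equations the software implements enter this picture by determining which part of the debris flux counts as "penetrating" for a given shield configuration.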
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls, yet these sources often dominate component-level reliability and the probability of failure. While the consequences of failure are often understood when assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Given the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision-making purposes. This paper establishes a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data, and provides a set of guidelines that may be useful in arriving at a more realistic quantification of risk prior to acceptance by a program.
Dietary Cholesterol Intake and Risk of Lung Cancer: A Meta-Analysis.
Lin, Xiaojing; Liu, Lingli; Fu, Youyun; Gao, Jing; He, Yunyun; Wu, Yang; Lian, Xuemei
2018-02-08
Multiple epidemiologic studies have evaluated the relationship between dietary cholesterol and lung cancer risk, but the association is controversial and inconclusive. A meta-analysis of case-control studies and cohort studies was conducted to evaluate the relationship between dietary cholesterol intake and lung cancer risk in this study. A relevant literature search up to October 2017 was performed in Web of Science, PubMed, China National Knowledge Infrastructure, Sinomed, and VIP Journal Integration Platform. Ten case-control studies and six cohort studies were included in the meta-analysis, and the risk estimates were pooled using either fixed or random effects models. The case-control studies with a total of 6894 lung cancer cases and 29,736 controls showed that dietary cholesterol intake was positively associated with lung cancer risk (Odds Ratio = 1.70, 95% Confidence Interval: 1.43-2.03). However, there was no evidence of an association between dietary cholesterol intake and risk of lung cancer among the 241,920 participants and 1769 lung cancer cases in the cohort studies (Relative Risk = 1.08, 95% Confidence Interval: 0.94-1.25). Due to inconsistent results from case-control and cohort studies, it is difficult to draw any conclusion regarding the effects of dietary cholesterol intake on lung cancer risk. Carefully designed and well-conducted cohort studies are needed to identify the association between dietary cholesterol and lung cancer risk.
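The fixed-effects pooling named above is conventionally inverse-variance weighting of study-level log odds ratios. Below is a minimal sketch of that calculation; the three studies are invented for illustration and are not the studies included in this meta-analysis, and the random-effects variant (which adds a between-study variance term) is omitted.

```python
import math

def pool_fixed_effect(odds_ratios_with_ci):
    """Inverse-variance fixed-effect pooling.
    Input: list of (OR, lower95, upper95). Returns (pooled OR, (lo, hi))."""
    weights, weighted_logs = [], []
    for or_, lo, hi in odds_ratios_with_ci:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # back out SE from the 95% CI
        w = 1.0 / se ** 2                                # inverse-variance weight
        weights.append(w)
        weighted_logs.append(w * log_or)
    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (math.exp(pooled_log - 1.96 * pooled_se),
          math.exp(pooled_log + 1.96 * pooled_se))
    return math.exp(pooled_log), ci

# three invented studies: (odds ratio, 95% CI lower, 95% CI upper)
studies = [(1.8, 1.2, 2.7), (1.5, 1.0, 2.3), (1.9, 1.3, 2.8)]
pooled, (lo, hi) = pool_fixed_effect(studies)
print(f"Pooled OR = {pooled:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Because pooling happens on the log scale, larger studies (narrower CIs, hence smaller SEs) pull the summary estimate toward themselves, which is the behavior the heterogeneity diagnostics in a full meta-analysis are meant to check.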
Changes in Concurrent Risk of Warm and Dry Years under Impact of Climate Change
NASA Astrophysics Data System (ADS)
Sarhadi, A.; Wiper, M.; Touma, D. E.; Ausín, M. C.; Diffenbaugh, N. S.
2017-12-01
Anthropogenic global warming has changed the nature and the risk of extreme climate phenomena. The changing concurrence of multiple climatic extremes (warm and dry years) may intensify undesirable consequences for water resources, human and ecosystem health, and environmental equity. The present study assesses how global warming influences the probability that warm and dry years co-occur on a global scale. In the first step of the study, a designed multivariate Mann-Kendall trend analysis is used to detect the areas in which the concurrence of warm and dry years has increased, both in historical climate records and in climate model output. The next step investigates the concurrent risk of the extremes under dynamic nonstationary conditions. A fully generalized multivariate risk framework is designed to evolve through time under these conditions. In this methodology, Bayesian dynamic copulas are developed to model the time-varying dependence structure between the two climate extremes (warm and dry years). The results reveal an increasing trend in the concurrence risk of warm and dry years, in agreement with the multivariate trend analysis of historical records and climate models. In addition to providing a novel quantification of the changing probability of compound extreme events, the results of this study can help decision makers develop short- and long-term strategies to prepare for climate stresses now and in the future.
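The quantity at the heart of the study, the probability that a warm year and a dry year co-occur, can be illustrated with a simple empirical count of joint threshold exceedances. Everything below (the synthetic series, thresholds, and induced correlation) is invented for illustration; the paper itself uses Bayesian dynamic copulas so that this dependence can vary through time.

```python
import random

random.seed(7)
years = 100
# synthetic annual mean temperatures (deg C)
temperature = [random.gauss(15.0, 1.0) for _ in range(years)]
# synthetic annual precipitation (mm), mildly anti-correlated with temperature
precip = [800.0 - 30.0 * (t - 15.0) + random.gauss(0, 80.0) for t in temperature]

warm_threshold = sorted(temperature)[int(0.8 * years)]  # top ~20% warmest
dry_threshold = sorted(precip)[int(0.2 * years)]        # bottom ~20% driest

# count years that are simultaneously warm AND dry
concurrent = sum(1 for t, p in zip(temperature, precip)
                 if t > warm_threshold and p < dry_threshold)
print(f"Empirical concurrence probability: {concurrent / years:.2f}")
# Under independence the expected value would be roughly 0.2 * 0.2 = 0.04;
# negative temperature-precipitation dependence pushes it higher.
```

A copula-based analysis generalizes this counting exercise by modeling the full joint distribution of the two margins, rather than a single pair of thresholds.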
Tool for Rapid Analysis of Monte Carlo Simulations
NASA Technical Reports Server (NTRS)
Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.
2013-01-01
Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code; the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphics processing unit (GPU) and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
Rare-Variant Association Analysis: Study Designs and Statistical Tests
Lee, Seunggeun; Abecasis, Gonçalo R.; Boehnke, Michael; Lin, Xihong
2014-01-01
Despite the extensive discovery of trait- and disease-associated common variants, much of the genetic contribution to complex traits remains unexplained. Rare variants can explain additional disease risk or trait variability. An increasing number of studies are underway to identify trait- and disease-associated rare variants. In this review, we provide an overview of statistical issues in rare-variant association studies with a focus on study designs and statistical tests. We present the design and analysis pipeline of rare-variant studies and review cost-effective sequencing designs and genotyping platforms. We compare various gene- or region-based association tests, including burden tests, variance-component tests, and combined omnibus tests, in terms of their assumptions and performance. Also discussed are the related topics of meta-analysis, population-stratification adjustment, genotype imputation, follow-up studies, and heritability due to rare variants. We provide guidelines for analysis and discuss some of the challenges inherent in these studies and future research directions. PMID:24995866
Verification and Validation in a Rapid Software Development Process
NASA Technical Reports Server (NTRS)
Callahan, John R.; Easterbrook, Steve M.
1997-01-01
The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in their use in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing of the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.
NASA Astrophysics Data System (ADS)
Bartolini, S.; Becerril, L.; Martí, J.
2014-11-01
One of the most important issues in modern volcanology is the assessment of volcanic risk, which depends, among other factors, on both the quantity and quality of the available data and on an optimum storage mechanism. This requires the design of purpose-built databases that take into account data format and availability, afford easy data storage and sharing, and provide for a more complete risk assessment that combines different analyses while avoiding any duplication of information. Data contained in such a database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. A central design goal is to ensure that VERDI serves as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure and the possibilities it offers for data organization are shown here through its application to El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool that will assist in conducting volcanic risk assessment and management.
Estimating urban flood risk - uncertainty in design criteria
NASA Astrophysics Data System (ADS)
Newby, M.; Franks, S. W.; White, C. J.
2015-06-01
The design of urban stormwater infrastructure is generally performed assuming that climate is static. For engineering practitioners, stormwater infrastructure is designed using a peak flow method, such as the Rational Method outlined in the Australian Rainfall and Runoff (AR&R) guidelines, together with estimates of design rainfall intensities. Changes to Australian rainfall intensity design criteria have been made through updated releases of the AR&R77 and AR&R87 guidelines and the recent 2013 AR&R Intensity Frequency Distributions (IFDs). The primary focus of this study is to compare the three IFD sets at 51 locations Australia-wide. Since the release of the AR&R77 IFDs, the duration and number of locations of rainfall data have increased and techniques for data analysis have changed. Updated terminology coinciding with the 2013 IFD release has also resulted in a practical change to the design rainfall. For example, infrastructure designed for a 1:5-year ARI corresponds to an 18.13% AEP; for practical purposes, however, hydraulic guidelines have been updated to use the more intuitive 20% AEP. The evaluation of design rainfall variation across Australia indicates that the changes depend upon location, recurrence interval, and rainfall duration. The changes to design rainfall IFDs are due to the application of differing data analysis techniques, the length and number of data sets, and the change in terminology from ARI to AEP. Such changes mean that developed infrastructure has been designed to a range of different design criteria, indicating the likely inadequacy of earlier developments relative to current estimates of flood risk. In many cases, the under-design of infrastructure exceeds the expected impact of increased rainfall intensity under climate change scenarios.
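The 18.13% figure quoted above follows from the standard conversion between average recurrence interval (ARI) and annual exceedance probability (AEP) under a Poisson arrival assumption, AEP = 1 - exp(-1/ARI). A short check:

```python
import math

def ari_to_aep(ari_years):
    """Convert average recurrence interval (years) to annual exceedance
    probability: AEP = 1 - exp(-1 / ARI)."""
    return 1.0 - math.exp(-1.0 / ari_years)

print(f"1:5 year ARI   -> {ari_to_aep(5):.2%} AEP")    # ~18.13%
print(f"1:100 year ARI -> {ari_to_aep(100):.2%} AEP")
```

For rare events the two measures nearly coincide (a 1:100-year ARI is about a 1.00% AEP), but for frequent events the gap is large enough to matter for design, which is why the terminology change is a practical one.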
Ergonomics and simulation-based approach in improving facility layout
NASA Astrophysics Data System (ADS)
Abad, Jocelyn D.
2018-02-01
The use of simulation-based techniques in facility layout has been a common choice in industry due to their convenience and efficient generation of results. Nevertheless, the solutions generated are not capable of addressing delays related to workers' health and safety, which significantly impact overall operational efficiency. It is, therefore, critical to incorporate ergonomics in facility design. In this study, workstation analysis was incorporated into ProModel simulation to improve the facility layout of a garment manufacturing facility. To test the effectiveness of the method, the existing and improved facility designs were compared using comprehensive risk level, efficiency, and productivity. Results indicated that the improved facility layout produced a decrease in comprehensive risk level and rapid upper limb assessment score, a 78% increase in efficiency, and a 194% increase in productivity relative to the existing design, demonstrating that the approach is effective in attaining overall facility design improvement.
NASA Astrophysics Data System (ADS)
Brown, Casey; Carriquiry, Miguel
2007-11-01
This paper explores the performance of a system of economic instruments designed to reduce the impacts of hydroclimatological variability on stakeholders sharing a water supply. The system is composed of bulk water option contracts between urban water suppliers and agricultural users, and insurance indexed on reservoir inflows. The insurance is designed to cover the financial needs of the water supplier in situations where the option is likely to be exercised, providing the irregularly needed funds for exercising the water options. The combined option-contract and reservoir-index-insurance system creates risk sharing between sectors that is currently lacking in many shared water situations. Contracts are designed for a shared agricultural-urban water system in Metro Manila, Philippines, using optimization and Monte Carlo analysis. Observed reservoir inflows are used to simulate contract performance. Results indicate the option-insurance design effectively smooths the water supply costs of hydrologic variability for both agricultural and urban water users.
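The cost-smoothing mechanism described above can be sketched with a small Monte Carlo simulation. Every number below (inflow distribution, strike, payment sizes, premium) is hypothetical and invented for illustration, not the Metro Manila contract design from the paper.

```python
import random

random.seed(42)
N = 10_000
strike_inflow = 60.0      # option exercised when seasonal inflow falls below this
exercise_cost = 100.0     # payment to agricultural users per exercise
trigger_inflow = 60.0     # insurance payout trigger, matched to the exercise rule
insurance_payout = 100.0  # indexed payout, sized to cover the exercise cost
premium = 19.0            # flat annual premium (hypothetical, ~expected payout + loading)

costs_uninsured, costs_insured = [], []
for _ in range(N):
    inflow = random.gauss(80.0, 20.0)  # synthetic seasonal reservoir inflow
    cost = exercise_cost if inflow < strike_inflow else 0.0
    payout = insurance_payout if inflow < trigger_inflow else 0.0
    costs_uninsured.append(cost)
    costs_insured.append(cost - payout + premium)

def mean(xs):
    return sum(xs) / len(xs)

def stdev(xs):
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

print(f"uninsured cost: mean {mean(costs_uninsured):.1f}, std {stdev(costs_uninsured):.1f}")
print(f"insured cost:   mean {mean(costs_insured):.1f}, std {stdev(costs_insured):.1f}")
```

Because the index trigger here exactly matches the exercise rule, each insurance payout offsets the exercise payment and the supplier's annual cost collapses to the certain premium; in practice basis risk between the index and the actual exercise decision keeps some variability, which is part of what the paper's contract optimization addresses.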
Gong, Jian; Yang, Jianxin; Tang, Wenwu
2015-11-09
Land use and land cover change is driven by multiple influential factors from environmental and social dimensions in a land system. Land use practices of human decision-makers modify the landscape of the land system, possibly leading to landscape fragmentation, biodiversity loss, or environmental pollution-severe environmental or ecological impacts. While landscape-level ecological risk assessment supports the evaluation of these impacts, investigations on how these ecological risks induced by land use practices change over space and time in response to alternative policy intervention remain inadequate. In this article, we conducted spatially explicit landscape ecological risk analysis in Ezhou City, China. Our study area is a national ecologically representative region experiencing drastic land use and land cover change, and is regulated by multiple policies represented by farmland protection, ecological conservation, and urban development. We employed landscape metrics to consider the influence of potential landscape-level disturbance for the evaluation of landscape ecological risks. Using spatiotemporal simulation, we designed scenarios to examine spatiotemporal patterns in landscape ecological risks in response to policy intervention. Our study demonstrated that spatially explicit landscape ecological risk analysis combined with simulation-driven scenario analysis is of particular importance for guiding the sustainable development of ecologically vulnerable land systems.
Poultry and fish intake and risk of esophageal cancer: A meta-analysis of observational studies.
Jiang, Gengxi; Li, Bailing; Liao, Xiaohong; Zhong, Chongjun
2016-03-01
Mixed results regarding the association between white meat (including poultry and fish) intake and the risk of esophageal cancer (EC) have been reported. We performed a meta-analysis to provide a quantitative assessment of this association. Relevant studies were identified in MEDLINE until December 31, 2012. Summary relative risks (SRRs) with 95% confidence intervals (CIs) were pooled with a random-effects model. A total of 20 articles, including 3990 cases with EC, were included in this meta-analysis. Compared to individuals with the lowest level of fish intake, individuals with the highest fish intake were found to have reduced risk of EC (SRRs = 0.69; 95% CIs: 0.57-0.85), while poultry intake was not associated with EC (SRRs = 0.83; 95% CIs: 0.62-1.12). Total fish consumption is associated with reduced esophageal squamous cell carcinoma (ESCC) risk, while poultry consumption was not associated with ESCC risk. Additionally, neither poultry nor fish consumption was associated with esophageal adenocarcinoma risk. Our results suggest that fish consumption may have a potential role in EC prevention, while poultry intake has no effect. However, because the majority of data was from case-control studies, further well-designed prospective studies are warranted. © 2013 Wiley Publishing Asia Pty Ltd.
48 CFR 339.7002 - Notice of intended award.
Code of Federal Regulations, 2013 CFR
2013-10-01
... approval to make an award to other than a GSA BPA holder for independent risk analysis services and either..., the CAO, or designee, shall send a notice of intended award to the designated GSA BPA Contracting...
48 CFR 339.7002 - Notice of intended award.
Code of Federal Regulations, 2014 CFR
2014-10-01
... approval to make an award to other than a GSA BPA holder for independent risk analysis services and either..., the CAO, or designee, shall send a notice of intended award to the designated GSA BPA Contracting...
48 CFR 339.7002 - Notice of intended award.
Code of Federal Regulations, 2011 CFR
2011-10-01
... approval to make an award to other than a GSA BPA holder for independent risk analysis services and either..., the CAO, or designee, shall send a notice of intended award to the designated GSA BPA Contracting...
48 CFR 339.7002 - Notice of intended award.
Code of Federal Regulations, 2010 CFR
2010-10-01
... approval to make an award to other than a GSA BPA holder for independent risk analysis services and either..., the CAO, or designee, shall send a notice of intended award to the designated GSA BPA Contracting...
48 CFR 339.7002 - Notice of intended award.
Code of Federal Regulations, 2012 CFR
2012-10-01
... approval to make an award to other than a GSA BPA holder for independent risk analysis services and either..., the CAO, or designee, shall send a notice of intended award to the designated GSA BPA Contracting...
Analysis of INDOT current hydraulic policies.
DOT National Transportation Integrated Search
2011-01-01
Hydraulic design often tends to be on a conservative side for safety reasons. Hydraulic structures are : typically oversized with the goal being reduced future maintenance costs, and to reduce the risk of : property owner complaints. This approach le...
Analysis of INDOT current hydraulic policies : [spreadsheet].
DOT National Transportation Integrated Search
2011-01-01
Hydraulic design often tends to be on a conservative side for safety reasons. Hydraulic structures are typically oversized with the goal being reduced future maintenance costs, and to reduce the risk of property owner complaints. This approach leads ...
A Meta-Analysis of Interventions to Reduce Loneliness
Masi, Christopher M.; Chen, Hsi-Yuan; Hawkley, Louise C.; Cacioppo, John T.
2013-01-01
Social and demographic trends are placing an increasing number of adults at risk for loneliness, an established risk factor for physical and mental illness. The growing costs of loneliness have led to a number of loneliness reduction interventions. Qualitative reviews have identified four primary intervention strategies: 1) improving social skills, 2) enhancing social support, 3) increasing opportunities for social contact, and 4) addressing maladaptive social cognition. An integrative meta-analysis of loneliness reduction interventions was conducted to quantify the effects of each strategy and to examine the potential role of moderator variables. Results revealed that single group pre-post and non-randomized comparison studies yielded larger mean effect sizes relative to randomized comparison studies. Among studies that used the latter design, the most successful interventions addressed maladaptive social cognition. This is consistent with current theories regarding loneliness and its etiology. Theoretical and methodological issues associated with designing new loneliness reduction interventions are discussed. PMID:20716644
DOE Office of Scientific and Technical Information (OSTI.GOV)
Visscher, W.A.
A retrospective cohort study was done which was designed to assess the effects of medical x-ray exposure on cancer incidence among scoliosis patients. Although the primary purpose of the study was to assess cancer incidence, a secondary goal was to investigate whether diagnostic x-ray exposure is related to adverse reproductive events in the female subjects. A series of case-control analyses designed to assess these effects was performed. Radiation exposure was measured both by total films received and by an estimate of the number of films that involved ovarian irradiation. Radiation appeared to increase a woman's risk of any adverse event in the overall analysis and her risk of a premature or low birth weight infant in the separate analyses. Radiation did not appear to be related to spontaneous abortion, complications of pregnancy or delivery, or birth defects, although the results of the pregnancy complications analysis were suggestive.
NASA Technical Reports Server (NTRS)
2002-01-01
Under a Phase II SBIR contract, Kennedy and Lumina Decision Systems, Inc., jointly developed the Schedule and Cost Risk Analysis Modeling (SCRAM) system, based on a version of Lumina's flagship software product, Analytica(R). Acclaimed as "the best single decision-analysis program yet produced" by MacWorld magazine, Analytica is a "visual" tool used in decision-making environments worldwide to build, revise, and present business models, minus the time-consuming difficulty commonly associated with spreadsheets. With Analytica as their platform, Kennedy and Lumina created the SCRAM system in response to NASA's need to identify the importance of major delays in Shuttle ground processing, a critical function in project management and process improvement. As part of the SCRAM development project, Lumina designed a version of Analytica called the Analytica Design Engine (ADE) that can be easily incorporated into larger software systems. ADE was commercialized and utilized in many other developments, including web-based decision support.
NASA Astrophysics Data System (ADS)
Mardi Safitri, Dian; Arfi Nabila, Zahra; Azmi, Nora
2018-03-01
Musculoskeletal disorders (MSDs) are an ergonomic risk arising from manual activity, non-neutral posture, and repetitive motion. The purpose of this study is to measure risk and implement ergonomic interventions to reduce the risk of MSDs at the paper pallet assembly workstation. Work posture was measured with the Ovako Working Posture Analysis System (OWAS) and the Rapid Entire Body Assessment (REBA) method, while work repetitiveness was measured with the Strain Index (SI). Operators in the assembly process were identified as having the highest risk level, with OWAS, Strain Index, and REBA scores of 4, 20.25, and 11, respectively. Ergonomic improvements are needed to reduce that level of risk. Proposed improvements were developed using the Quality Function Deployment (QFD) method applied with the Axiomatic House of Quality (AHOQ) and a morphological chart. As a result, the OWAS and REBA scores decreased from 4 and 11 to 1 and 2, respectively. Biomechanical analysis of the operator also showed decreased values for the L4-L5 moment, compression, joint shear, and joint moment strength.
Ares-I-X Stability and Control Flight Test: Analysis and Plans
NASA Technical Reports Server (NTRS)
Brandon, Jay M.; Derry, Stephen D.; Heim, Eugene H.; Hueschen, Richard M.; Bacon, Barton J.
2008-01-01
The flight test of the Ares I-X vehicle provides a unique opportunity to reduce risk of the design of the Ares I vehicle and test out design, math modeling, and analysis methods. One of the key features of the Ares I design is the significant static aerodynamic instability coupled with the relatively flexible vehicle - potentially resulting in a challenging controls problem to provide adequate flight path performance while also providing adequate structural mode damping and preventing adverse control coupling to the flexible structural modes. Another challenge is to obtain enough data from the single flight to be able to conduct analysis showing the effectiveness of the controls solutions and have data to inform design decisions for Ares I. This paper will outline the modeling approaches and control system design to conduct this flight test, and also the system identification techniques developed to extract key information such as control system performance (gain/phase margins, for example), structural dynamics responses, and aerodynamic model estimations.
The role of Indigenous knowledge in environmental health risk management in Yukon, Canada
Friendship, Katelyn A.; Furgal, Chris M.
2012-01-01
Objectives This project aimed to gain better understandings of northern Indigenous risk perception related to food safety and to identify the role that Indigenous knowledge (IK) plays in risk management processes to support more effective and culturally relevant benefit-risk (B-R) management strategies. Study design The project used an exploratory qualitative case study design to investigate the role and place of IK in the management of environmental contaminants exposure via consumption of traditional foods in Yukon First Nations (YFNs). Methods Forty-one semi-directive interviews with Traditional Food Knowledge Holders and Health and Environment Decision-makers were conducted. A review and analysis of organizational documents related to past risk management events for the issue was conducted. Thematic content analysis was used to analyze transcripts and documents for key themes related to the research question. Results There was a recognized need by all participants for better collaboration between scientists and YFN communities. YFNs have been involved in identifying and defining community concerns about past risk issues, setting a local context, and participating in communications strategies. Interviewees stressed the need to commit adequate time for building relationships, physically being in the community, and facilitating open communication. Conducting community-based projects was identified as critical for collaboration and for cooperative learning and management of these issues. Conclusions The perception of “effective” benefit-risk management is significantly influenced by the efforts made to include local communities in the process. A set of common guiding principles within a process that brings together people and knowledge systems may provide a more effective way forward in cross-cultural, multiple knowledge system contexts for complex benefit-risk issues than a prescriptive rigid framework. PMID:22868192
NASA System Safety Handbook. Volume 1; System Safety Framework and Concepts for Implementation
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Smith, Curtis; Stamatelatos, Michael; Youngblood, Robert
2011-01-01
System safety assessment is defined in NPR 8715.3C, NASA General Safety Program Requirements, as a disciplined, systematic approach to the analysis of risks resulting from hazards that can affect humans, the environment, and mission assets. Achievement of the highest practicable degree of system safety is one of NASA's highest priorities. Traditionally, system safety assessment at NASA and elsewhere has focused on the application of a set of safety analysis tools to identify safety risks and formulate effective controls. Familiar tools used for this purpose include various forms of hazard analyses, failure modes and effects analyses, and probabilistic safety assessment (commonly also referred to as probabilistic risk assessment (PRA)). In the past, it has been assumed that to show that a system is safe, it is sufficient to provide assurance that the process for identifying the hazards has been as comprehensive as possible and that each identified hazard has one or more associated controls. The NASA Aerospace Safety Advisory Panel (ASAP) has made several statements in its annual reports supporting a more holistic approach. In 2006, it recommended that "... a comprehensive risk assessment, communication and acceptance process be implemented to ensure that overall launch risk is considered in an integrated and consistent manner." In 2009, it advocated for "... a process for using a risk-informed design approach to produce a design that is optimally and sufficiently safe." As a rationale for the latter advocacy, it stated that "... the ASAP applauds switching to a performance-based approach because it emphasizes early risk identification to guide designs, thus enabling creative design approaches that might be more efficient, safer, or both." For purposes of this preface, it is worth mentioning three areas where the handbook emphasizes a more holistic type of thinking.
First, the handbook takes the position that it is important to not just focus on risk on an individual basis but to consider measures of aggregate safety risk and to ensure wherever possible that there be quantitative measures for evaluating how effective the controls are in reducing these aggregate risks. The term aggregate risk, when used in this handbook, refers to the accumulation of risks from individual scenarios that lead to a shortfall in safety performance at a high level: e.g., an excessively high probability of loss of crew, loss of mission, planetary contamination, etc. Without aggregated quantitative measures such as these, it is not reasonable to expect that safety has been optimized with respect to other technical and programmatic objectives. At the same time, it is fully recognized that not all sources of risk are amenable to precise quantitative analysis and that the use of qualitative approaches and bounding estimates may be appropriate for those risk sources. Second, the handbook stresses the necessity of developing confidence that the controls derived for the purpose of achieving system safety not only handle risks that have been identified and properly characterized but also provide a general, more holistic means for protecting against unidentified or uncharacterized risks. For example, while it is not possible to be assured that all credible causes of risk have been identified, there are defenses that can provide protection against broad categories of risks and thereby increase the chances that individual causes are contained. Third, the handbook strives at all times to treat uncertainties as an integral aspect of risk and as a part of making decisions. The term "uncertainty" here does not refer to an actuarial type of data analysis, but rather to a characterization of our state of knowledge regarding results from logical and physical models that approximate reality. 
Uncertainty analysis determines how the output parameters of the models are related to plausible variations in the input parameters and in the modeling assumptions. The evaluation of uncertainties represents a method of probabilistic thinking wherein the analyst and decision makers recognize possible outcomes other than the outcome perceived to be "most likely." Without this type of analysis, it is not possible to determine the worth of an analysis product as a basis for making decisions related to safety and mission success. In line with these considerations, the handbook does not take a hazard-analysis-centric approach to system safety. Hazard analysis remains a useful tool to facilitate brainstorming but does not substitute for a more holistic approach geared to a comprehensive identification and understanding of individual risk issues and their contributions to aggregate safety risks. The handbook strives to emphasize the importance of identifying the most critical scenarios that contribute to the risk of not meeting the agreed-upon safety objectives and requirements using all appropriate tools (including but not limited to hazard analysis). Thereafter, emphasis shifts to identifying the risk drivers that cause these scenarios to be critical and ensuring that there are controls directed toward preventing or mitigating the risk drivers. To address these and other areas, the handbook advocates a proactive, analytic-deliberative, risk-informed approach to system safety, enabling the integration of system safety activities with systems engineering and risk management processes. It emphasizes how one can systematically provide the necessary evidence to substantiate the claim that a system is safe to within an acceptable risk tolerance, and that safety has been achieved in a cost-effective manner.
The methodology discussed in this handbook is part of a systems engineering process and is intended to be integral to the system safety practices being conducted by the NASA safety and mission assurance and systems engineering organizations. The handbook posits that to conclude that a system is adequately safe, it is necessary to consider a set of safety claims that derive from the safety objectives of the organization. The safety claims are developed from a hierarchy of safety objectives and are therefore hierarchical themselves. Assurance that all the claims are true within acceptable risk tolerance limits implies that all of the safety objectives have been satisfied, and therefore that the system is safe. The acceptable risk tolerance limits are provided by the authority who must make the decision whether or not to proceed to the next step in the life cycle. These tolerances are therefore referred to as the decision maker's risk tolerances. In general, the safety claims address two fundamental facets of safety: 1) whether required safety thresholds or goals have been achieved, and 2) whether the safety risk is as low as possible within reasonable impacts on cost, schedule, and performance. The latter facet includes consideration of controls that are collective in nature (i.e., apply generically to broad categories of risks) and thereby provide protection against unidentified or uncharacterized risks.
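The aggregate safety risk the handbook describes accumulates the risks of individual scenarios into a top-level measure such as the probability of loss of crew. A minimal numerical sketch of that idea, assuming independent scenarios with hypothetical probabilities (not values from the handbook):

```python
def aggregate_risk(scenario_probs):
    """Aggregate probability of loss from individual scenario probabilities.

    Assumes the loss scenarios are independent, so the probability of
    avoiding every scenario is the product of per-scenario survivals:
    P(loss) = 1 - prod(1 - p_i).
    """
    p_survive = 1.0
    for p in scenario_probs:
        p_survive *= (1.0 - p)
    return 1.0 - p_survive

# Hypothetical per-scenario probabilities (e.g., contributors to loss of crew)
probs = [0.001, 0.0005, 0.002]
print(f"aggregate risk: {aggregate_risk(probs):.6f}")
```

For rare scenarios the aggregate (here about 0.0035) is close to, but slightly below, the naive sum of the individual probabilities; the gap grows as individual risks become larger.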
The tumor necrosis factor-α-238 polymorphism and digestive system cancer risk: a meta-analysis.
Hui, Ming; Yan, Xiaojuan; Jiang, Ying
2016-08-01
Many studies have reported the association between tumor necrosis factor-α (TNF-α)-238 polymorphism and digestive system cancer susceptibility, but the results were inconclusive. We performed a meta-analysis to derive a more precise estimation of the relationship between TNF-α-238 G/A polymorphism and digestive system cancer risk. Pooled analysis for the TNF-α-238 G/A polymorphism contained 26 studies with a total of 4849 cases and 8567 controls. The meta-analysis observed a significant association between TNF-α-238 G/A polymorphism and digestive system cancer risk in the overall population (GA vs GG: OR 1.19, 95 % CI 1.00-1.40, P heterogeneity = 0.016; A vs G: OR 1.19, 95 % CI 1.03-1.39, P heterogeneity = 0.015; dominant model: OR 1.20, 95 % CI 1.02-1.41, P heterogeneity = 0.012). In the analysis of the ethnic subgroups, however, similar results were observed only in the Asian population, but not in the Caucasian population. Therefore, this meta-analysis suggests that TNF-α-238 G/A polymorphism is associated with a significantly increased risk of digestive system cancer. Further large and well-designed studies are needed to confirm these findings.
Lin, Chia-Ying; Hsiao, Chun-Ching; Chen, Po-Quan; Hollister, Scott J
2004-08-15
An approach combining global layout and local microstructure topology optimization was used to create a new interbody fusion cage design that concurrently enhanced stability, biofactor delivery, and mechanical tissue stimulation for improved arthrodesis. To develop a new interbody fusion cage design by topology optimization with porous internal architecture. To compare the performance of this new design to conventional threaded cage designs regarding early stability and long-term stress shielding effects on ingrown bone. Conventional interbody cage designs mainly fall into categories of cylindrical or rectangular shell shapes. The designs contribute to rigid stability and maintain disc height for successful arthrodesis but may also suffer mechanically mediated failures of dislocation or subsidence, as well as the possibility of bone resorption. The new optimization approach created a cage having designed microstructure that achieved desired mechanical performance while providing interconnected channels for biofactor delivery. The topology optimization algorithm determines the material layout under desirable volume fraction (50%) and displacement constraints favorable to bone formation. A local microstructural topology optimization method was used to generate periodic microstructures for porous isotropic materials. Final topology was generated by the integration of the two-scaled structures according to segmented regions and the corresponding material density. Image-based finite element analysis was used to compare the mechanical performance of the topology-optimized cage and conventional threaded cage. The final design can be fabricated by a variety of Solid Free-Form systems directly from the image output. The new design exhibited a narrower, more uniform displacement range than the threaded cage design and lower stress at the cage-vertebra interface, suggesting a reduced risk of subsidence.
Strain energy density analysis also indicated that a higher portion of total strain energy density was transferred into the new bone region inside the new designed cage, indicating a reduced risk of stress shielding. The new design approach using integrated topology optimization demonstrated comparable or better stability by limited displacement and reduced localized deformation related to the risk of subsidence. Less shielding of newly formed bone was predicted inside the new designed cage. Using the present approach, it is also possible to tailor cage design for specific materials, either titanium or polymer, that can attain the desired balance between stability, reduced stress shielding, and porosity for biofactor delivery.
ERIC Educational Resources Information Center
Gage, Nicholas A.; Lewis, Timothy J.; Stichter, Janine P.
2012-01-01
Of the myriad practices currently utilized for students with disabilities, particularly students with or at risk for emotional and/or behavioral disorder (EBD), functional behavior assessment (FBA) is a practice with an emerging solid research base. However, the FBA research base relies on single-subject design (SSD) and synthesis has relied on…
USDA-ARS?s Scientific Manuscript database
OBJECTIVE To study the association of depressive symptoms or antidepressant medicine (ADM) use with subsequent cardiovascular disease (CVD) risk factor status in the Look AHEAD (Action for Health in Diabetes) trial of weight loss in type 2 diabetes. RESEARCH DESIGN AND METHODS Participants (n = 5,1...
A Classification and Analysis of Contracting Literature
1989-12-01
Pricing Model (CAPM). This is a model designed by investment analysts to determine required rates of return given the systematic risk of a company. ... For the amount of risk they take, these profit margins were not excessively high. The author examined profitability in terms of the Capital Asset ... taxonomy was applied was limited, the results were necessarily qualified. However, at the least this application provided areas for further research
Wiedemann, Peter M; Schütz, Holger; Clauberg, Martin
2008-02-01
This study investigated whether the SAR value is a purchase-relevant characteristic of mobile phones for laypersons and what effect the disclosure of a precautionary SAR value has on laypersons' risk perception. The study consisted of two parts: Study part 1 used a conjoint analysis design to explore the relevance of the SAR value and other features of mobile phones for an intended buying decision. Study part 2 used an experimental, repeated measures design to examine the effect of the magnitude of SAR values and the disclosure of a precautionary SAR value on risk perception. In addition, the study included an analysis of prior concerns of the study participants with regard to mobile phone risks. Part 1 indicates that the SAR value has a high relevance for laypersons' purchase intentions. In the experimental purchase setting it ranks even before price and equipment features. The results of study part 2 show that providing information on a precautionary limit value does not influence risk perception. This result suggests that laypersons' underlying subjective "safety model" for mobile phones resembles more a "margin of safety" concept than a threshold concept. The latter observation holds true no matter how concerned the participants are. (c) 2007 Wiley-Liss, Inc.
Reliability considerations for the total strain range version of strainrange partitioning
NASA Technical Reports Server (NTRS)
Wirsching, P. H.; Wu, Y. T.
1984-01-01
A proposed total strainrange version of strainrange partitioning (SRP) to enhance the manner in which SRP is applied to life prediction is considered, with emphasis on how advanced reliability technology can be applied to perform risk analysis and to derive safety check expressions. Uncertainties existing in the design factors associated with life prediction of a component which experiences the combined effects of creep and fatigue can be identified. Examples illustrate how reliability analyses of such a component can be performed when all design factors in the SRP model are random variables reflecting these uncertainties. The Rackwitz-Fiessler and Wu algorithms are used, and estimates of the safety index and the probability of failure are demonstrated for a SRP problem. Methods of analysis of creep-fatigue data, with emphasis on procedures for producing synoptic statistics, are presented. The importance of the contribution of the uncertainties associated with small sample sizes (fatigue data) to risk estimates is discussed. The procedure for deriving a safety check expression for possible use in a design criteria document is presented.
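The safety index and probability of failure mentioned in the abstract can be illustrated, in the simplest case, with a linear limit state g = R − S and independent, normally distributed resistance R and load S; the Rackwitz-Fiessler algorithm cited in the paper generalizes this to non-normal variables. The numerical values below are hypothetical, chosen only to show the computation:

```python
import math

def safety_index(mu_r, sigma_r, mu_s, sigma_s):
    """Safety (reliability) index beta for the linear limit state g = R - S,
    with independent, normally distributed resistance R and load effect S."""
    return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

def failure_probability(beta):
    """P(g < 0) = Phi(-beta), evaluated via the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2))

# Hypothetical margins: mean resistance 10, mean load 6, unit standard deviations
beta = safety_index(10.0, 1.0, 6.0, 1.0)
pf = failure_probability(beta)
print(f"beta = {beta:.3f}, Pf = {pf:.5f}")
```

A larger safety index corresponds to a smaller failure probability; here beta of about 2.83 maps to a failure probability near 0.0023.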
The impact of moderate wine consumption on the risk of developing prostate cancer
Ferro, Matteo; Foerster, Beat; Abufaraj, Mohammad; Briganti, Alberto; Karakiewicz, Pierre I; Shariat, Shahrokh F
2018-01-01
Objective To investigate the impact of moderate wine consumption on the risk of prostate cancer (PCa). We focused on the differential effect of moderate consumption of red versus white wine. Design This study was a meta-analysis that includes data from case–control and cohort studies. Materials and methods A systematic search of Web of Science, Medline/PubMed, and Cochrane Library was performed on December 1, 2017. Studies were deemed eligible if they assessed the risk of PCa due to red, white, or any wine using multivariable logistic regression analysis. We performed a formal meta-analysis for the risk of PCa according to moderate wine and wine type consumption (white or red). Heterogeneity between studies was assessed using Cochran's Q test and I2 statistics. Publication bias was assessed using Egger's regression test. Results A total of 930 abstracts and titles were initially identified. After removal of duplicates, reviews, and conference abstracts, 83 full-text original articles were screened. Seventeen studies (611,169 subjects) were included for final evaluation and fulfilled the inclusion criteria. In the case of moderate wine consumption, the pooled risk ratio (RR) for the risk of PCa was 0.98 (95% CI 0.92–1.05, p=0.57) in the multivariable analysis. Moderate white wine consumption increased the risk of PCa with a pooled RR of 1.26 (95% CI 1.10–1.43, p=0.001) in the multivariable analysis. Meanwhile, moderate red wine consumption had a protective role, reducing the risk by 12% (RR 0.88, 95% CI 0.78–0.999, p=0.047) in the multivariable analysis that comprised 222,447 subjects. Conclusions In this meta-analysis, moderate wine consumption did not impact the risk of PCa. Interestingly, regarding the type of wine, moderate consumption of white wine increased the risk of PCa, whereas moderate consumption of red wine had a protective effect.
Further analyses are needed to assess the differential molecular effect of white and red wine conferring their impact on PCa risk. PMID:29713200
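The pooled risk ratios, Cochran's Q, and I² heterogeneity statistics used in meta-analyses like this one are commonly computed with a DerSimonian-Laird random-effects model. The sketch below implements that standard estimator on hypothetical per-study log risk ratios and standard errors, not the data from the paper:

```python
import math

def dersimonian_laird(log_rr, se):
    """DerSimonian-Laird random-effects pooling of per-study log risk ratios.

    Returns the pooled risk ratio, its 95% CI, and the I^2 heterogeneity (%).
    """
    w = [1.0 / s ** 2 for s in se]                  # inverse-variance weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))  # Cochran's Q
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                   # between-study variance
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0
    w_re = [1.0 / (s ** 2 + tau2) for s in se]      # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    rr = math.exp(pooled)
    ci = (math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled))
    return rr, ci, i2

# Hypothetical studies: log risk ratios with their standard errors
rr, ci, i2 = dersimonian_laird([0.10, -0.05, 0.20], [0.08, 0.10, 0.12])
print(f"RR = {rr:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f}), I2 = {i2:.0f}%")
```

When Q exceeds its degrees of freedom, the between-study variance tau² widens the pooled confidence interval, which is why highly heterogeneous pooled estimates (large I²) warrant cautious interpretation.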
PRA (Probabilistic Risk Assessments) Participation versus Validation
NASA Technical Reports Server (NTRS)
DeMott, Diana; Banke, Richard
2013-01-01
Probabilistic Risk Assessments (PRAs) are performed for projects or programs where the consequences of failure are highly undesirable. PRAs primarily address the level of risk those projects or programs pose during operations, and they are often developed after the design has been completed. Design and operational details used to develop models include approved and accepted design information regarding equipment, components, systems, and failure data. This methodology basically validates the risk parameters of the project or system design. For high-risk or high-dollar projects, using PRA methodologies during the design process provides new opportunities to influence the design early in the project life cycle to identify, eliminate, or mitigate potential risks. Identifying risk drivers before the design has been set allows the design engineers to understand the inherent risk of their current design and consider potential risk mitigation changes. This can become an iterative process in which the PRA model is used to determine whether a mitigation technique is effective in reducing risk, resulting in more efficient and cost-effective design changes. PRA methodology can be used to assess the risk of design alternatives and can demonstrate how major design changes or program modifications impact the overall program or project risk. PRA has been used for the last two decades to validate risk predictions and acceptability. By providing risk information that can positively influence final system and equipment design, the PRA tool can also participate in design development, contributing to a safe and cost-effective product.
Design and analysis of group-randomized trials in cancer: A review of current practices.
Murray, David M; Pals, Sherri L; George, Stephanie M; Kuzmichev, Andrey; Lai, Gabriel Y; Lee, Jocelyn A; Myles, Ranell L; Nelson, Shakira M
2018-06-01
The purpose of this paper is to summarize current practices for the design and analysis of group-randomized trials involving cancer-related risk factors or outcomes and to offer recommendations to improve future trials. We searched for group-randomized trials involving cancer-related risk factors or outcomes that were published in print or online in peer-reviewed journals in 2011-15. During 2016-17, in Bethesda MD, we reviewed 123 articles from 76 journals to characterize their design and their methods for sample size estimation and data analysis. Only 66 (53.7%) of the articles reported appropriate methods for sample size estimation. Only 63 (51.2%) reported exclusively appropriate methods for analysis. These findings suggest that many investigators do not adequately attend to the methodological challenges inherent in group-randomized trials. These practices can lead to underpowered studies, to an inflated type 1 error rate, and to inferences that mislead readers. Investigators should work with biostatisticians or other methodologists familiar with these issues. Funders and editors should ensure careful methodological review of applications and manuscripts. Reviewers should ensure that studies are properly planned and analyzed. These steps are needed to improve the rigor and reproducibility of group-randomized trials. The Office of Disease Prevention (ODP) at the National Institutes of Health (NIH) has taken several steps to address these issues. ODP offers an online course on the design and analysis of group-randomized trials. ODP is working to increase the number of methodologists who serve on grant review panels. ODP has developed standard language for the Application Guide and the Review Criteria to draw investigators' attention to these issues. Finally, ODP has created a new Research Methods Resources website to help investigators, reviewers, and NIH staff better understand these issues. Published by Elsevier Inc.
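The sample size challenge the review highlights comes down to the design effect: randomizing intact groups inflates the variance of the treatment effect by a factor driven by the intraclass correlation (ICC), so an individually randomized sample size must be scaled up. A minimal sketch, with hypothetical cluster size and ICC:

```python
import math

def design_effect(cluster_size, icc):
    """Variance inflation from randomizing groups: 1 + (m - 1) * ICC,
    where m is the number of members per cluster."""
    return 1.0 + (cluster_size - 1) * icc

def grt_sample_size(n_individual, cluster_size, icc):
    """Per-arm sample size for a group-randomized trial, inflating the
    sample size an individually randomized design would need."""
    return math.ceil(n_individual * design_effect(cluster_size, icc))

# Hypothetical: 400 subjects per arm if individually randomized,
# clusters of 50 members, intraclass correlation 0.02
print(design_effect(50, 0.02))        # ~1.98: variance nearly doubles
print(grt_sample_size(400, 50, 0.02))
```

Even a small ICC of 0.02 nearly doubles the required sample size at this cluster size, which is why analyses that ignore clustering tend to be underpowered and to inflate the type 1 error rate.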
Past, present, and future design of urban drainage systems with focus on Danish experiences.
Arnbjerg-Nielsen, K
2011-01-01
Climate change will influence the water cycle substantially, and extreme precipitation will become more frequent in many regions in the years to come. How should this fact be incorporated into design of urban drainage systems, if at all? And how important is climate change compared to other changes over time? Based on an analysis of the underlying key drivers of changes that are expected to affect urban drainage systems the current problems and their predicted development over time are presented. One key issue is management of risk and uncertainties and therefore a framework for design and analysis of urban structures in light of present and future uncertainties is presented.
Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs
NASA Technical Reports Server (NTRS)
Min, James B.
2005-01-01
The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.
Using the CABLES model to assess and minimize risk in research: control group hazards.
Koocher, G P
2002-01-01
CABLES is both an acronym and metaphor for conceptualizing research participation risk by considering 6 distinct domains in which risks of harm to research participants may exist: cognitive, affective, biological, legal, economic, and social/cultural. These domains are described and illustrated, along with suggestions for minimizing or eliminating the potential hazards to human participants in biomedical and behavioral science research. Adoption of a thoughtful ethical analysis addressing all 6 CABLES strands in designing research provides a strong protective step toward safeguarding and promoting the well-being of study participants.
Operationalization Of The Professional Risks Assessment Activity
NASA Astrophysics Data System (ADS)
Ivascu, Victoria Larisa; Cirjaliu, Bianca; Draghici, Anca
2015-07-01
Professional risk assessment (the integration of analysis and evaluation processes) is linked with the general concern of today's companies for their employees' health and safety, in the context of organizations' sustainable development. The paper presents an approach for operationalizing the professional risk assessment activity in companies through the implementation and use of the OnRisk platform, which has been tested in several industrial companies. A short presentation of the relevant technical reports and statistics on OSH management at the European Union level underlines the need for the development of a professional risk assessment. Finally, the OnRisk platform, designed and developed as a web platform, is described together with case studies that validate the created tool.
Yang, Yi; George, Kaisha C; Shang, Wei-Feng; Zeng, Rui; Ge, Shu-Wang; Xu, Gang
2017-01-01
Recent studies have suggested a potential increased risk of acute kidney injury (AKI) among proton-pump inhibitor (PPI) users. However, the present results are conflicting. Thus, we performed a meta-analysis to investigate the association between PPI therapy and the risk of AKI. EMBASE, PubMed, Web of Science, and Cochrane Library databases (up to September 23, 2016) were systematically searched for any studies assessing the relationship between PPI use and risk of AKI. Studies that reported relevant risk ratios (RRs), odds ratios, or hazard ratios were included. We calculated the pooled RRs with 95% confidence intervals (CI) using a random-effects model of the meta-analysis. Subgroup analysis was conducted to explore the source of heterogeneity. Seven observational studies (five cohort studies and two case-control studies) were identified and included, and a total of 513,696 cases of PPI use among 2,404,236 participants were included in the meta-analysis. The pooled adjusted RR of AKI in patients with PPI use was 1.61 (95% CI: 1.16-2.22; I2 = 98.1%). Furthermore, higher risks of AKI were found in the subgroups of cohort studies, participants' average age <60 years, participants with and without baseline PPI excluded, sample size <300,000, and number of adjustments ≥11. Subgroup analyses revealed that participants with or without baseline PPI excluded might be a source of heterogeneity. PPI use could be a risk factor for AKI and should be administered carefully. Nevertheless, some confounding factors might impact the outcomes. More well-designed prospective studies are needed to clarify the association.
Moayeri, Ardeshir; Mohamadpour, Mahmoud; Mousavi, Seyedeh Fatemeh; Shirzadpour, Ehsan; Mohamadpour, Safoura; Amraei, Mansour
2017-01-01
Aim Patients with type 2 diabetes mellitus (T2DM) have an increased risk of bone fractures. A variable increase in fracture risk has been reported depending on skeletal site, diabetes duration, study design, insulin use, and so on. The present meta-analysis aimed to investigate the association between T2DM with fracture risk and possible risk factors. Methods Different databases including PubMed, Institute for Scientific Information, and Scopus were searched up to May 2016. All epidemiologic studies on the association between T2DM and fracture risk were included. The relevant data obtained from these papers were analyzed by a random effects model and publication bias was assessed by funnel plot. All analyses were done by R software (version 3.2.1) and STATA (version 11.1). Results Thirty eligible studies were selected for the meta-analysis. We found a statistically significant positive association between T2DM and hip, vertebral, or foot fractures and no association between T2DM and wrist, proximal humerus, or ankle fractures. Overall, T2DM was associated with an increased risk of any fracture (summary relative risk =1.05, 95% confidence interval: 1.04, 1.06) and increased with age, duration of diabetes, and insulin therapy. Conclusion Our findings strongly support an association between T2DM and increased risk of overall fracture. These findings emphasize the need for fracture prevention strategies in patients with diabetes. PMID:28442913
Naing, Cho; Aung, Kyan; Lai, Pei Kuan; Mak, Joon Wah
2017-01-05
Human chromosomes are capped and stabilized by telomeres. Telomere length regulates a 'cellular mitotic clock' that defines the number of cell divisions and hence, cellular life span. This study aimed to synthesize the evidence on the association between peripheral blood leucocytes (PBL) telomere length and the risk of colorectal cancer (CRC). We searched relevant studies in electronic databases. When two or more observational studies reported the same outcome measures, we performed pooled analysis. All the analyses were performed on PBL using PCR. The odds ratio (OR) and its 95% confidence interval (CI) were used to assess the strength of association. Seven studies (with 8 datasets) were included in this meta-analysis: 3 prospective studies, 3 retrospective studies, and 1 study with separate prospective and retrospective designs. The pooled analysis of 4 prospective studies (summary OR 1.01, 95% CI: 0.77-1.34, I2: 30%) and 4 retrospective studies (summary OR 1.65, 95% CI: 0.96-2.83, I2: 96%) showed no relationship between PBL telomere length and the CRC risk. A subgroup analysis of 2 prospective studies exclusively on females also showed no association between PBL telomere length and the CRC risk (summary OR 1.17, 95% CI: 0.72-1.91, I2: 57%). The current analysis is insufficient to provide evidence on the relationship between PBL telomere length and the risk of CRC. Findings suggest that there may be a complex relationship between PBL telomere length and the CRC risk, or a discrepancy between genetics, age of patients, and clinical studies. Future well-powered, large prospective studies on the relationship between telomere length and the risk of CRC, and investigations of the biologic mechanisms, are recommended.
Preliminary candidate advanced avionics system for general aviation
NASA Technical Reports Server (NTRS)
Mccalla, T. M.; Grismore, F. L.; Greatline, S. E.; Birkhead, L. M.
1977-01-01
An integrated avionics system design was carried out to the level that indicates subsystem function and the methods of overall system integration. Sufficient detail was included to allow identification of possible system component technologies and to perform reliability, modularity, maintainability, cost, and risk analysis on the system design. Retrofit to older aircraft and the availability of this system for single-engine, two-place aircraft were considered.
Takeuchi, Yoshinori; Shinozaki, Tomohiro; Matsuyama, Yutaka
2018-01-08
Despite the frequent use of self-controlled methods in pharmacoepidemiological studies, the factors that may bias the estimates from these methods have not been adequately compared in real-world settings. Here, we comparatively examined the impact of a time-varying confounder and its interactions with time-invariant confounders, time trends in exposures and events, restrictions, and misspecification of risk period durations on the estimators from three self-controlled methods. This study analyzed self-controlled case series (SCCS), case-crossover (CCO) design, and sequence symmetry analysis (SSA) using simulated and actual electronic medical records datasets. We evaluated the performance of the three self-controlled methods in simulated cohorts for the following scenarios: 1) time-invariant confounding with interactions between the confounders, 2) time-invariant and time-varying confounding without interactions, 3) time-invariant and time-varying confounding with interactions among the confounders, 4) time trends in exposures and events, 5) restricted follow-up time based on event occurrence, and 6) patient restriction based on event history. The sensitivity of the estimators to misspecified risk period durations was also evaluated. As a case study, we applied these methods to evaluate the risk of macrolides on liver injury using electronic medical records. In the simulation analysis, time-varying confounding produced bias in the SCCS and CCO design estimates, which aggravated in the presence of interactions between the time-invariant and time-varying confounders. The SCCS estimates were biased by time trends in both exposures and events. Erroneously short risk periods introduced bias to the CCO design estimate, whereas erroneously long risk periods introduced bias to the estimates of all three methods. Restricting the follow-up time led to severe bias in the SSA estimates. The SCCS estimates were sensitive to patient restriction. 
The case study showed that although macrolide use was significantly associated with increased liver injury occurrence in all methods, the values of the estimates varied. The estimates from the three self-controlled methods depend on various underlying assumptions, and the violation of these assumptions may cause non-negligible bias in the resulting estimates. Pharmacoepidemiologists should select the appropriate self-controlled method based on how well the relevant key assumptions are satisfied with respect to the available data.
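The within-person comparison that all three self-controlled designs share can be illustrated with a small simulation in the spirit of the study's scenario analysis. This is a hedged sketch, not the authors' code: the crude pooled estimator below ignores age effects and time trends, and every rate, window length, and sample size is a made-up value.

```python
import random

def simulate_person(total_days=365, risk_start=100, risk_len=30,
                    base_rate=0.001, rate_ratio=3.0, rng=random):
    """One person's follow-up: the daily event probability is multiplied
    by rate_ratio inside a fixed post-exposure risk period."""
    events_risk = events_base = 0
    for day in range(total_days):
        in_risk = risk_start <= day < risk_start + risk_len
        p = base_rate * (rate_ratio if in_risk else 1.0)
        if rng.random() < p:
            if in_risk:
                events_risk += 1
            else:
                events_base += 1
    return events_risk, events_base

def sccs_rate_ratio(n_persons=5000, risk_len=30, total_days=365, **kw):
    """Crude pooled SCCS-style estimator: within-person event rate in
    risk time divided by event rate in baseline time."""
    er = eb = 0
    for _ in range(n_persons):
        a, b = simulate_person(total_days=total_days, risk_len=risk_len, **kw)
        er += a
        eb += b
    t_risk = n_persons * risk_len
    t_base = n_persons * (total_days - risk_len)
    return (er / t_risk) / (eb / t_base)
```

With the defaults above, the estimator recovers a rate ratio near the simulated value of 3; adding a secular time trend in the event rate to `simulate_person` is one way to reproduce the bias pattern the abstract describes.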
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenzel, Tom P.
In its 2012 report NHTSA simulated the effect that four fleetwide mass reduction scenarios would have on the change in annual fatalities. NHTSA estimated that the most aggressive of these scenarios (reducing mass 5.2% in heavier light trucks and 2.6% in all other vehicle types except lighter cars) would result in a small reduction in societal fatalities. LBNL replicated the methodology NHTSA used to simulate six mass reduction scenarios, including the mass reductions recommended in the 2015 NRC committee report and those estimated for 2021 and 2025 by EPA in the TAR, using updated data through 2012. The analysis indicates that the estimated change in fatalities under each scenario based on the updated analysis is comparable to that in the 2012 analysis, but less beneficial or more detrimental than that in the 2016 analysis. For example, an across-the-board 100-lb reduction in mass would result in an estimated 157 additional annual fatalities based on the 2012 analysis, but would result in only an estimated 91 additional annual fatalities based on the 2016 analysis, and an additional 87 fatalities based on the current analysis. The mass reductions recommended by the 2015 NRC committee report would result in an increase of 224 annual fatalities in the 2012 analysis, a decrease of 344 in the 2016 analysis, and an increase of 141 in the current analysis. The mass reductions EPA estimated for 2025 in the TAR would result in a decrease of 203 fatalities based on the 2016 analysis, but an increase of 39 fatalities based on the current analysis.
These results support NHTSA’s conclusion from its 2012 study that, when footprint is held fixed, “no judicious combination of mass reductions in the various classes of vehicles results in a statistically significant fatality increase and many potential combinations are safety-neutral as point estimates.” Like the previous NHTSA studies, this updated report concludes that the estimated effect of mass reduction while maintaining footprint on societal U.S. fatality risk is small, and not statistically significant at the 95% or 90% confidence level for all vehicle types based on the jack-knife method NHTSA used. This report also finds that the estimated effects of other control variables, such as vehicle type, specific safety technologies, and crash conditions such as whether the crash occurred at night, in a rural county, or on a high-speed road, on risk are much larger, in some cases two orders of magnitude larger, than the estimated effect of mass or footprint reduction on risk. Finally, this report shows that after accounting for the many vehicle, driver, and crash variables NHTSA used in its regression analyses, there remains a wide variation in risk by vehicle make and model, and this variation is unrelated to vehicle mass. Although the purpose of the NHTSA and LBNL reports is to estimate the effect of vehicle mass reduction on societal risk, this is not exactly what the regression models are estimating. Rather, they are estimating the recent historical relationship between mass and risk, after accounting for most measurable differences between vehicles, drivers, and crash times and locations. In essence, the regression models are comparing the risk of a 2600-lb Dodge Neon with that of a 2500-lb Honda Civic, after attempting to account for all other differences between the two vehicles. The models are not estimating the effect of literally removing 100 pounds from the Neon, leaving everything else unchanged.
In addition, the analyses are based on the relationship between vehicle mass, footprint, and risk for recent vehicle designs (model year 2004 to 2011). These relationships may or may not continue into the future as manufacturers utilize new vehicle designs and incorporate new technologies, such as more extensive use of strong lightweight materials and specific safety technologies. Therefore, throughout this report we use the phrase “the estimated effect of mass (or footprint) reduction on risk” as shorthand for “the estimated change in risk as a function of its relationship to mass (or footprint) for vehicle models of recent design.”
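The jack-knife that NHTSA used for its confidence statements can be sketched generically. This is a minimal illustration of the delete-one resampling idea applied to an arbitrary statistic, not NHTSA's actual procedure, which operates on its logistic regression analyses rather than a simple list of observations.

```python
import math

def jackknife_se(estimator, data):
    """Delete-one jackknife: re-estimate the statistic with each
    observation left out, then combine the leave-one-out spread into
    a standard error for the full-sample estimate."""
    n = len(data)
    theta = estimator(data)
    loo = [estimator(data[:i] + data[i + 1:]) for i in range(n)]
    loo_mean = sum(loo) / n
    var = (n - 1) / n * sum((t - loo_mean) ** 2 for t in loo)
    return theta, math.sqrt(var)
```

For the sample mean, the jackknife standard error reduces exactly to the usual s/sqrt(n), which makes a convenient sanity check.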
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
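The two-level structure described above, uncertain analysis parameters wrapped around a conditional failure model, can be sketched for a generic stress-strength failure mode. All distributions and numbers below are hypothetical, and real PFA applications use engineering failure models (e.g., fatigue analyses) rather than this toy interference calculation; the point is only that the output is a distribution over failure probability, not a single number.

```python
import random
import statistics

def pfa_failure_distribution(n_outer=100, n_inner=1000, rng=random):
    """Sketch of a PFA-style calculation for one failure mode.
    Outer loop: sample uncertain model parameters (epistemic
    uncertainty, hypothetical values). Inner loop: conditional Monte
    Carlo estimate of P(stress > strength) for that parameter draw."""
    probs = []
    for _ in range(n_outer):
        mean_strength = rng.gauss(100.0, 5.0)   # uncertain material capability
        failures = 0
        for _ in range(n_inner):
            strength = rng.gauss(mean_strength, 8.0)  # unit-to-unit scatter
            stress = rng.gauss(70.0, 10.0)            # load uncertainty
            if stress > strength:
                failures += 1
        probs.append(failures / n_inner)
    return probs
```

Percentiles of the returned list then summarize both the estimated failure probability and how much the epistemic uncertainty spreads it; conditioning on test or flight experience would update the outer-loop parameter distribution.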
Trades Between Opposition and Conjunction Class Trajectories for Early Human Missions to Mars
NASA Technical Reports Server (NTRS)
Mattfeld, Bryan; Stromgren, Chel; Shyface, Hilary; Komar, David R.; Cirillo, William; Goodliff, Kandyce
2014-01-01
Candidate human missions to Mars, including NASA's Design Reference Architecture 5.0, have focused on conjunction-class missions with long crewed durations and minimum energy trajectories to reduce total propellant requirements and total launch mass. However, in order to progressively reduce risk and gain experience in interplanetary mission operations, it may be desirable that initial human missions to Mars, whether to the surface or to Mars orbit, have shorter total crewed durations and minimal stay times at the destination. Opposition-class missions have larger total energy requirements relative to conjunction-class missions but offer the potential for much shorter mission durations, potentially reducing risk and overall systems performance requirements. This paper will present a detailed comparison of conjunction-class and opposition-class human missions to the Mars vicinity with a focus on how such missions could be integrated into the initial phases of a Mars exploration campaign. The paper will present the results of a trade study that integrates trajectory/propellant analysis, element design, logistics and sparing analysis, and risk assessment to produce a comprehensive comparison of opposition and conjunction exploration mission constructs. Included in the trade study is an assessment of the risk to the crew and the trade-offs between the mission duration and element, logistics, and spares mass. The analysis of the mission trade space was conducted using four simulation and analysis tools developed by NASA. Trajectory analyses for Mars destination missions were conducted using VISITOR (Versatile ImpulSive Interplanetary Trajectory OptimizeR), an in-house tool developed by NASA Langley Research Center. Architecture elements were evaluated using the EXploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), a parametric modeling tool that generates exploration architectures through an integrated systems model.
Logistics analysis was conducted using NASA's Human Exploration Logistics Model (HELM), and sparing allocation predictions were generated via the Exploration Maintainability Analysis Tool (EMAT), which is a probabilistic simulation engine that evaluates trades in spacecraft reliability and sparing requirements based on spacecraft system maintainability and reparability.
Coal gasification systems engineering and analysis, volume 2
NASA Technical Reports Server (NTRS)
1980-01-01
The major design-related features of each generic plant system were characterized in a catalog. Based on the catalog and requirements data, approximately 17 designs and cost estimates were developed for MBG and alternate products. A series of generic trade studies was conducted to support all of the design studies. A set of cost and programmatic analyses was conducted to supplement the designs. The cost methodology employed for the design and sensitivity studies was documented and implemented in a computer program. Plant design and construction schedules were developed for the K-T, Texaco, and B&W MBG plant designs. A generic work breakdown structure was prepared, based on the K-T design, to coincide with TVA's planned management approach. An extensive set of cost sensitivity analyses was completed for the K-T, Texaco, and B&W designs. Product price competitiveness was evaluated for MBG and the alternate products. A draft management policy and procedures manual was evaluated. A supporting technology development plan was developed to address high-risk technology issues. The issues were identified and ranked in terms of importance and tractability, and a plan was developed for obtaining data or developing technology required to mitigate the risk.
Using incident response trees as a tool for risk management of online financial services.
Gorton, Dan
2014-09-01
The article introduces the use of probabilistic risk assessment for modeling the incident response process of online financial services. The main contribution is the creation of incident response trees, using event tree analysis, which provide a visual tool and a systematic way to estimate the probability of a successful incident response process against the currently known risk landscape, making it possible to measure the balance between front-end and back-end security measures. The model is presented using an illustrative example and is then applied to the incident response process of a Swedish bank. Access to relevant data is verified, and the applicability and usability of the proposed model are demonstrated using one year of historical data. Potential advantages and possible shortcomings are discussed, referring to both the design phase and the operational phase, and future work is presented. © 2014 Society for Risk Analysis.
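The event-tree arithmetic behind an incident response tree can be sketched as follows. Barrier names and probabilities here are hypothetical, not the bank's data: each security barrier either stops the attempt (a terminal outcome with its own path probability) or passes it on to the next barrier, and a loss occurs only when every barrier fails in sequence.

```python
def incident_response_tree(p_attempt, barriers):
    """Event-tree sketch for an incident response process. Each barrier
    is a (name, p_barrier_fails) pair, evaluated in order. Returns the
    probability of each terminal outcome; the probabilities sum to
    p_attempt by construction."""
    outcomes = {}
    p_reach = p_attempt
    for name, p_fail in barriers:
        outcomes["stopped_by_" + name] = p_reach * (1.0 - p_fail)
        p_reach *= p_fail
    outcomes["loss"] = p_reach
    return outcomes
```

Comparing how much probability mass the front-end barriers absorb versus the back-end ones is exactly the kind of balance measurement the abstract describes.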
Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng
2013-01-01
New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance and the shortage of multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and a bivariate definition of severe dust storms, the joint probability distribution of severe dust storms was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
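The joint-exceedance calculation behind a copula-based return period can be sketched with a Gumbel copula, a common choice for upper-tail-dependent pairs such as maximum wind speed and duration. The abstract does not specify the copula family or parameters used, so everything below is illustrative.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta = 1 gives independence, larger
    theta gives stronger upper-tail dependence."""
    return math.exp(-(((-math.log(u)) ** theta +
                       (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_period(u, v, theta, events_per_year):
    """Return period (years) of the AND event {X > x and Y > y}, where
    u and v are the marginal non-exceedance probabilities of x and y:
    P(X > x, Y > y) = 1 - u - v + C(u, v) by inclusion-exclusion."""
    p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
    return 1.0 / (events_per_year * p_and)
```

With, say, four storms per year and both margins at their 90th percentiles, positive dependence (theta > 1) makes the joint exceedance more likely and the joint return period shorter than the 25 years implied by independence.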
NASA Astrophysics Data System (ADS)
Sun, Wenqing; Tseng, Tzu-Liang B.; Zheng, Bin; Zhang, Jianying; Qian, Wei
2015-03-01
A novel breast cancer risk analysis approach is proposed for enhancing the performance of computerized breast cancer risk analysis using bilateral mammograms. Based on the intensity of the breast area, five different sub-regions were acquired from one mammogram, and bilateral features were extracted from every sub-region. Our dataset includes 180 bilateral mammograms from 180 women who underwent routine screening examinations, all interpreted as negative and not recalled by the radiologists during the original screening procedures. A computerized breast cancer risk analysis scheme using four image processing modules, including sub-region segmentation, bilateral feature extraction, feature selection, and classification, was designed to detect and compute image feature asymmetry between the left and right breasts imaged on the mammograms. The highest computed area under the curve (AUC) is 0.763 ± 0.021 when applying the multiple sub-region features to our testing dataset. The positive predictive value and the negative predictive value were 0.60 and 0.73, respectively. The study demonstrates that (1) features extracted from multiple sub-regions can improve the performance of our scheme compared to using features from the whole breast area only; (2) a classifier using bilateral asymmetry features can effectively predict breast cancer risk; (3) incorporating texture and morphological features with density features can boost the classification accuracy.
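Two core computations in such a scheme, a bilateral asymmetry feature and the AUC of the resulting classifier scores, can be sketched as follows. These are illustrative definitions only: the paper's actual feature set, normalization, and classifier are more elaborate.

```python
def bilateral_asymmetry(left, right):
    """Normalized left-right difference of one image feature
    (illustrative form; the small constant avoids division by zero)."""
    return abs(left - right) / (abs(left) + abs(right) + 1e-12)

def auc_from_scores(pos, neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen positive case scores higher
    than a randomly chosen negative case, with ties counting 0.5."""
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

The rank-statistic form of AUC is convenient for small evaluation sets like the 180-case dataset here, since it needs no explicit ROC-curve construction.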
Risk of development of acute pancreatitis with pre-existing diabetes: a meta-analysis.
Xue, Yuzheng; Sheng, Yingyue; Dai, Hong; Cao, Haiyan; Liu, Zongliang; Li, Zhaoshen
2012-09-01
It is well established that acute pancreatitis (AP) often causes diabetes mellitus. However, whether pre-existing diabetes is associated with the development of AP remains unknown. To clarify the association of pre-existing diabetes and the development of AP, we carried out a meta-analysis of observational studies. A computerized literature search was performed in MEDLINE (from 1 January 1966) and EMBASE (from 1 January 1974), through 31 January 2012. We also searched the reference lists of relevant articles. Summary relative risks with their corresponding 95% confidence intervals (CIs) were calculated using a random-effects model. Between-study heterogeneity was assessed using Cochran's Q statistic and the I² statistic. A total of seven articles (10 523 incident cases of AP) were included in this meta-analysis. Analysis of seven studies indicated that, compared with nondiabetic individuals, diabetic individuals had a 92% increased risk of development of AP (95% CI 1.50-2.47). There was significant evidence of heterogeneity among these studies (P for heterogeneity < 0.001, I² = 93.0%). These increased risks were independent of alcohol use, gallstones, and hyperlipidemia. Although the current evidence supports a positive link between pre-existing diabetes and an increased risk of development of AP, additional studies with more rigorous designs are required before definitive conclusions can be drawn.
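The random-effects pooling, Cochran's Q, and I² reported above can be sketched with the standard DerSimonian-Laird estimator, a common implementation of a random-effects model; the abstract does not name its exact estimator, so this is an assumption.

```python
import math

def dersimonian_laird(log_rrs, ses):
    """Random-effects pooling of study-level log relative risks
    (DerSimonian-Laird). Returns pooled RR with 95% CI, Cochran's Q,
    and the I^2 heterogeneity percentage."""
    k = len(log_rrs)
    w = [1.0 / s ** 2 for s in ses]                       # inverse-variance weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c) if c > 0 else 0.0  # between-study variance
    wr = [1.0 / (s ** 2 + tau2) for s in ses]
    pooled = sum(wi * y for wi, y in zip(wr, log_rrs)) / sum(wr)
    se = math.sqrt(1.0 / sum(wr))
    i2 = max(0.0, 100.0 * (q - (k - 1)) / q) if q > 0 else 0.0
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se),
            q, i2)
```

The I² of 93% quoted in the abstract corresponds to Q far exceeding its degrees of freedom, which is why a random-effects rather than fixed-effect model was appropriate here.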
Analysis of INDOT current hydraulic policies : [technical summary].
DOT National Transportation Integrated Search
2011-01-01
Hydraulic design often tends to be on the conservative side for safety reasons. Hydraulic structures are typically oversized with the goals of reducing future maintenance costs and the risk of property owner complaints. This approach leads ...
Analysis of en route operational errors : probability of resolution and time-on-position.
DOT National Transportation Integrated Search
2012-02-01
The Federal Aviation Administration's Air Traffic Organization Safety Management System (SMS) is designed to prevent the introduction of unacceptable safety risk into the National Airspace System. One of the most important safety metrics used...
NASA Astrophysics Data System (ADS)
Jeuken, Ad; Mendoza, Guillermo; Matthews, John; Ray, Patrick; Haasnoot, Marjolijn; Gilroy, Kristin; Olsen, Rolf; Kucharski, John; Stakhiv, Gene; Cushing, Janet; Brown, Casey
2016-04-01
Engineers and water managers have always incorporated uncertainty in water resources operations, design and planning. In recent years, concern has been growing that many of the fundamental principles to address uncertainty in planning and design are insufficient for coping with unprecedented shifts in climate, especially given the long lifetimes of water investments - spanning decades, even centuries. Can we design and operate new flood risk management, energy, water supply and sanitation, and agricultural projects that are robust to shifts over 20, 50, or more years? Since about 2009, better approaches to planning and designing under climate uncertainty have been gaining ground worldwide. The main challenge is to operationalize these approaches and bring them from science to practice, embed them within the existing decision-making processes of particular institutions, and shift from highly specialized "boutique" applications to methods that result in consistent, replicable outcomes accessible to water managers worldwide. CRIDA (Climate Risk Informed Decision Analysis) takes a serious step toward achieving these goals. CRIDA is built on two innovative but complementary approaches that have developed in isolation across the Atlantic over the past seven years: diagnosing and assessing risk (decision scaling), and developing sequential decision steps to compensate for uncertainty within regulatory / performance standards (adaptation pathways). First, the decision scaling or "bottom up" framework for climate change adaptation was conceptualized during the US/Canada Great Lakes regulation study and has recently been placed in a decision-making context for water-related investments published by the World Bank. Second, the adaptation pathways approach was developed in the Netherlands to cope with the level of climate uncertainty we now face.
Adaptation pathways is a tool for maintaining options and flexibility while meeting operational goals by envisioning how sequences of decisions can be navigated over time. They are part of the Dutch adaptive planning approach, Adaptive Delta Management, executed and developed by the Dutch Delta Program. Both decision scaling and adaptation pathways have been piloted in studies worldwide. The objective of CRIDA is to mainstream effective climate adaptation for professional water managers. The CRIDA publication, due in April 2016, follows the generic water planning and design cycle. At each step, CRIDA provides stepwise guidance for incorporating climate robustness: problem definition, stress test, alternatives formulation and recommendation, evaluation and selection. In the presentation, the origin, goal, steps, and practical tools available at each step of CRIDA will be explained. Two other abstracts ("Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region" by Gilroy et al. and "The Application of Climate Risk Informed Decision Analysis to the Ioland Water Treatment Plant in Lusaka, Zambia" by Kucharski et al.) explain the application of CRIDA to cases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shatkin, J. A.; Ong, Kimberly J.; Beaudrie, Christian
The Society for Risk Analysis (SRA) has a history of bringing thought leadership to topics of emerging risk. In September 2014, the SRA Emerging Nanoscale Materials Specialty Group convened an international workshop to examine the use of alternative testing strategies (ATS) for manufactured nanomaterials (NM) from a risk analysis perspective. Experts in NM environmental health and safety, human health, ecotoxicology, regulatory compliance, risk analysis, and ATS evaluated and discussed the state of the science for in vitro and other alternatives to traditional toxicology testing for NM. Based on this review, experts recommended immediate and near-term actions that would advance ATS use in NM risk assessment. Three focal areas-human health, ecological health, and exposure considerations-shaped deliberations about information needs, priorities, and the next steps required to increase confidence in and use of ATS in NM risk assessment. The deliberations revealed that ATS are now being used for screening, and that, in the near term, ATS could be developed for use in read-across or categorization decision making within certain regulatory frameworks. Participants recognized that leadership is required from within the scientific community to address basic challenges, including standardizing materials, protocols, techniques and reporting, and designing experiments relevant to real-world conditions, as well as coordination and sharing of large-scale collaborations and data. Experts agreed that it will be critical to include experimental parameters that can support the development of adverse outcome pathways. Numerous other insightful ideas for investment in ATS emerged throughout the discussions and are further highlighted in this article.
Shatkin, J A; Ong, Kimberly J; Beaudrie, Christian; Clippinger, Amy J; Hendren, Christine Ogilvie; Haber, Lynne T; Hill, Myriam; Holden, Patricia; Kennedy, Alan J; Kim, Baram; MacDonell, Margaret; Powers, Christina M; Sharma, Monita; Sheremeta, Lorraine; Stone, Vicki; Sultan, Yasir; Turley, Audrey; White, Ronald H
2016-08-01
The Society for Risk Analysis (SRA) has a history of bringing thought leadership to topics of emerging risk. In September 2014, the SRA Emerging Nanoscale Materials Specialty Group convened an international workshop to examine the use of alternative testing strategies (ATS) for manufactured nanomaterials (NM) from a risk analysis perspective. Experts in NM environmental health and safety, human health, ecotoxicology, regulatory compliance, risk analysis, and ATS evaluated and discussed the state of the science for in vitro and other alternatives to traditional toxicology testing for NM. Based on this review, experts recommended immediate and near-term actions that would advance ATS use in NM risk assessment. Three focal areas-human health, ecological health, and exposure considerations-shaped deliberations about information needs, priorities, and the next steps required to increase confidence in and use of ATS in NM risk assessment. The deliberations revealed that ATS are now being used for screening, and that, in the near term, ATS could be developed for use in read-across or categorization decision making within certain regulatory frameworks. Participants recognized that leadership is required from within the scientific community to address basic challenges, including standardizing materials, protocols, techniques and reporting, and designing experiments relevant to real-world conditions, as well as coordination and sharing of large-scale collaborations and data. Experts agreed that it will be critical to include experimental parameters that can support the development of adverse outcome pathways. Numerous other insightful ideas for investment in ATS emerged throughout the discussions and are further highlighted in this article. © 2016 Society for Risk Analysis.
Probability of Loss of Crew Achievability Studies for NASA's Exploration Systems Development
NASA Technical Reports Server (NTRS)
Boyer, Roger L.; Bigler, Mark; Rogers, James H.
2014-01-01
Over the last few years, NASA has been evaluating various vehicle designs for multiple proposed design reference missions (DRM) beyond low Earth orbit in support of its Exploration Systems Development (ESD) programs. This paper addresses several of the proposed missions and the analysis techniques used to assess the key risk metric, probability of loss of crew (LOC). Probability of LOC is a metric used to assess the safety risk as well as a design requirement. These risk assessments typically cover the concept phase of a DRM, i.e., when little more than a general idea of the mission is known, and are used to help establish "best estimates" for proposed program and agency level risk requirements. These assessments or studies were categorized as LOC achievability studies to help inform NASA management as to what "ball park" estimates of probability of LOC could be achieved for each DRM and were eventually used to establish the corresponding LOC requirements. Given that details of the vehicles and missions are not well known at this time, the ground rules, assumptions, and consistency across the programs become the essential basis of the assessments and must be clearly understood by decision makers.
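The headline LOC arithmetic for a serial mission profile can be sketched as follows. The per-phase probabilities below are placeholders for illustration, not ESD program values.

```python
def prob_loss_of_crew(phase_risks):
    """Serial aggregation of per-phase LOC probabilities: the crew is
    lost unless every mission phase succeeds, assuming independent
    phases. Input values are illustrative placeholders."""
    p_survive = 1.0
    for p in phase_risks:
        p_survive *= 1.0 - p
    return 1.0 - p_survive
```

Expressing the result as "1 in N" (N = 1/P(LOC)) gives the ballpark form in which such achievability estimates are typically communicated.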
Singh, Jay P; Doll, Helen; Grann, Martin
2012-01-01
Objective: To investigate the predictive validity of tools commonly used to assess the risk of violence, sexual, and criminal behaviour. Design: Systematic review and tabular meta-analysis of replication studies following PRISMA guidelines. Data sources: PsycINFO, Embase, Medline, and United States Criminal Justice Reference Service Abstracts. Review methods: We included replication studies from 1 January 1995 to 1 January 2011 if they provided contingency data for the offending outcome that the tools were designed to predict. We calculated the diagnostic odds ratio, sensitivity, specificity, area under the curve, positive predictive value, negative predictive value, the number needed to detain to prevent one offence, as well as a novel performance indicator, the number safely discharged. We investigated potential sources of heterogeneity using metaregression and subgroup analyses. Results: Risk assessments were conducted on 73 samples comprising 24 847 participants from 13 countries, of whom 5879 (23.7%) offended over an average of 49.6 months. When used to predict violent offending, risk assessment tools produced low to moderate positive predictive values (median 41%, interquartile range 27-60%) and higher negative predictive values (91%, 81-95%), with a corresponding median number needed to detain of 2 (2-4) and number safely discharged of 10 (4-18). Instruments designed to predict violent offending performed better than those aimed at predicting sexual or general crime. Conclusions: Although risk assessment tools are widely used in clinical and criminal justice settings, their predictive accuracy varies depending on how they are used. They seem to identify low risk individuals with high levels of accuracy, but their use as sole determinants of detention, sentencing, and release is not supported by the current evidence. Further research is needed to examine their contribution to treatment and management. PMID:22833604
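The 2x2-table indicators this review pools can be computed directly from contingency counts. The counts in the usage below are invented for illustration, and the "number safely discharged" formula is one plausible reading of that novel indicator (true negatives per missed offender), not a definition taken from the paper.

```python
def risk_tool_metrics(tp, fp, fn, tn):
    """Performance indicators from a 2x2 contingency table for a risk
    assessment tool: tp/fp = judged high risk and did / did not offend,
    fn/tn = judged low risk and did / did not offend."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "diagnostic_or": (tp * tn) / (fp * fn),
        "number_needed_to_detain": (tp + fp) / tp,   # detentions per prevented offence
        "number_safely_discharged": tn / fn if fn else float("inf"),
    }
```

For example, with tp=40, fp=60, fn=10, tn=90, the PPV of 0.40 and NPV of 0.90 mirror the review's pattern of modest positive but high negative predictive values.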
Decision Support Methods and Tools
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Alexandrov, Natalia M.; Brown, Sherilyn A.; Cerro, Jeffrey A.; Gumbert, Clyde R.; Sorokach, Michael R.; Burg, Cecile M.
2006-01-01
This paper is one of a set of papers, developed simultaneously and presented within a single conference session, that are intended to highlight systems analysis and design capabilities within the Systems Analysis and Concepts Directorate (SACD) of the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). This paper focuses on the specific capabilities of uncertainty/risk analysis, quantification, propagation, decomposition, and management, robust/reliability design methods, and extensions of these capabilities into decision analysis methods within SACD. These disciplines are discussed together herein under the name of Decision Support Methods and Tools. Several examples are discussed which highlight the application of these methods within current or recent aerospace research at NASA LaRC. Where applicable, commercially available or government-developed software tools are also discussed.
Design for Reliability and Safety Approach for the NASA New Launch Vehicle
NASA Technical Reports Server (NTRS)
Safie, Fayssal, M.; Weldon, Danny M.
2007-01-01
The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program intended for sending crew and cargo to the International Space Station (ISS), to the moon, and beyond. This program is called Constellation. As part of the Constellation program, NASA is developing new launch vehicles aimed at significantly increasing safety and reliability, reducing the cost of accessing space, and providing a growth path for manned space exploration. Achieving these goals requires a rigorous process that addresses reliability, safety, and cost upfront and throughout all the phases of the life cycle of the program. This paper discusses the "Design for Reliability and Safety" approach for the new NASA crew launch vehicle called ARES I. The ARES I is being developed by NASA Marshall Space Flight Center (MSFC) in support of the Constellation program. The ARES I consists of three major elements: a solid First Stage (FS), an Upper Stage (US), and a liquid Upper Stage Engine (USE). Stacked on top of the ARES I is the Crew Exploration Vehicle (CEV). The CEV consists of a Launch Abort System (LAS), Crew Module (CM), Service Module (SM), and a Spacecraft Adapter (SA). The CEV development is being led by NASA Johnson Space Center (JSC). Designing for high reliability and safety requires a good integrated working environment and a sound technical design approach. The "Design for Reliability and Safety" approach addressed in this paper discusses both the environment and the technical process put in place to support the ARES I design. To address the integrated working environment, the ARES I project office has established a risk-based design group called the "Operability Design and Analysis" (OD&A) group. This group is an integrated group intended to bring the engineering, design, and safety organizations together to optimize the system design for safety, reliability, and cost.
On the technical side, the ARES I project has, through the OD&A environment, implemented a probabilistic approach to analyze and evaluate design uncertainties and understand their impact on safety, reliability, and cost. This paper focuses on the various probabilistic approaches that have been pursued by the ARES I project. Specifically, the paper discusses an integrated functional probabilistic analysis approach that addresses upfront some key areas to support the ARES I Design Analysis Cycle (DAC) pre-Preliminary Design (PD) phase. This functional approach is a probabilistic, physics-based approach that combines failure probabilities with system dynamics and engineering failure impact models to identify key system risk drivers and potential system design requirements. The paper also discusses other probabilistic risk assessment approaches planned by the ARES I project to support the PD phase and beyond.
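Combining component failure probabilities with system-level impact models lends itself to Monte Carlo evaluation. The sketch below is purely illustrative: the element names and per-mission failure probabilities are hypothetical placeholders, not ARES I data, and a real analysis would couple the draws to dynamics and failure-impact models.

```python
import random

# Hypothetical per-mission failure probabilities for three launch vehicle
# elements (illustrative values only, not project data).
components = {"first_stage": 0.002, "upper_stage": 0.003, "upper_stage_engine": 0.004}

def simulate_mission(rng):
    """Return the set of components that fail in one simulated mission."""
    return {name for name, p in components.items() if rng.random() < p}

def loss_of_mission_probability(n_trials=200_000, seed=1):
    """Estimate P(any element fails) by counting failed simulated missions."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n_trials) if simulate_mission(rng))
    return failures / n_trials

print(round(loss_of_mission_probability(), 4))
```

With independent elements the analytic answer is 1 - (0.998)(0.997)(0.996) ≈ 0.009, so the Monte Carlo estimate should land close to that.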
Association between physical activity and risk of nonalcoholic fatty liver disease: a meta-analysis.
Qiu, Shanhu; Cai, Xue; Sun, Zilin; Li, Ling; Zügel, Martina; Steinacker, Jürgen Michael; Schumann, Uwe
2017-09-01
Increased physical activity (PA) is a key element in the management of patients with nonalcoholic fatty liver disease (NAFLD); however, its association with NAFLD risk has not been systematically assessed. This meta-analysis of observational studies was conducted to quantify this association with a dose-response analysis. Electronic databases were searched to January 2017 for studies of adults reporting the risk of NAFLD in relation to PA with cohort or case-control designs. Studies that reported sex-specific data were included as separate studies. The overall risk estimates were pooled using a random-effects model, and a dose-response analysis was conducted to shape the quantitative relationship. A total of 6 cohort studies from 5 articles with 32,657 incident NAFLD cases among 142,781 participants, and 4 case-control studies from 3 articles with 382 NAFLD cases and 302 controls, were included. Compared with the lowest PA level, the highest PA level was associated with a reduced risk of NAFLD in both cohort [RR (risk ratio) 0.79, 95% CI (confidence interval) 0.71-0.89] and case-control studies [OR (odds ratio) 0.43, 95% CI 0.27-0.68]. For cohort studies, both the highest and moderate PA levels were superior to light PA in lowering NAFLD risk (p for interaction = 0.006 and 0.02, respectively), and there was a log-linear dose-response association (p for nonlinearity = 0.10) between PA and NAFLD risk [RR 0.82 (95% CI 0.73-0.91) for every 500 metabolic equivalent (MET)-minutes/week increment in PA]. Increased PA may lead to a reduced risk of NAFLD in a dose-dependent manner, and the current guideline-recommended minimum PA level, which approximates 500 MET-minutes/week, can moderately reduce NAFLD risk.
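The random-effects pooling used above can be sketched with the DerSimonian-Laird estimator (a common choice; the abstract does not name its estimator): study-level ratios are log-transformed, standard errors are recovered from the 95% CIs, and a between-study variance term widens the weights. The three study-level risk ratios below are hypothetical illustrations, not the studies actually pooled.

```python
import math

# Hypothetical (rr, ci_lower, ci_upper) triples for three studies.
studies = [(0.85, 0.70, 1.03), (0.75, 0.60, 0.94), (0.80, 0.62, 1.03)]

def pool_random_effects(studies):
    """DerSimonian-Laird random-effects pooling on the log scale."""
    # Recover log-ratio and its SE from the 95% CI: SE = (ln hi - ln lo) / (2 * 1.96)
    logs = [(math.log(rr), (math.log(hi) - math.log(lo)) / (2 * 1.96))
            for rr, lo, hi in studies]
    w = [1 / se**2 for _, se in logs]           # fixed-effect weights
    y = [ly for ly, _ in logs]
    fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - fixed)**2 for wi, yi in zip(w, y))  # Cochran's Q
    df = len(studies) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)               # between-study variance
    w_star = [1 / (se**2 + tau2) for _, se in logs]
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    return math.exp(pooled)

print(round(pool_random_effects(studies), 2))
```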
Carotenoids and risk of fracture: a meta-analysis of observational studies
Song, Xiaochao; Zhang, Xi; Li, Xinli
2017-01-01
To quantify the association between dietary and circulating carotenoids and fracture risk, a meta-analysis was conducted by searching the MEDLINE and EMBASE databases for eligible articles published before May 2016. Five prospective and 2 case-control studies with 140,265 participants and 4,324 cases were identified in our meta-analysis. Of these, 5 studies assessed the association between dietary carotenoid levels and hip fracture risk, and 2 studies focused on the association between circulating carotenoid levels and any fracture risk. A random-effects model was employed to summarize the risk estimates and their 95% confidence intervals (CIs). Hip fracture risk among participants with high dietary total carotenoid intake was 28% lower than that among participants with low dietary total carotenoids (OR: 0.72; 95% CI: 0.51, 1.01). A similar hip fracture risk was found for β-carotene based on 5 studies: the summarized OR for high vs. low dietary β-carotene was 0.72 (95% CI: 0.54, 0.95). However, significant between-study heterogeneity was found (total carotene: I2 = 59.4%, P = 0.06; β-carotene: I2 = 74.4%, P = 0.04). Other individual carotenoids did not show significant associations with hip fracture risk. Circulating carotene levels had no significant association with any fracture risk; the pooled OR (95% CI) was 0.83 (0.59, 1.17). Based on the evidence from observational studies, our meta-analysis supports the hypothesis that higher dietary total carotenoid or β-carotene intake may be associated with a lower risk of hip fracture; however, future well-designed prospective cohort studies and randomized controlled trials are warranted to specify the associations between carotenoids and fracture.
NASA Astrophysics Data System (ADS)
Liu, P.
2013-12-01
Quantitative analysis of the risk of reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts depict not only the marginal distributions of the inflows but also their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed by using the forecast horizon point to divide the future time into two stages: the forecast lead-time and the unpredicted time. The risk within the forecast lead-time is computed by counting the number of forecast scenarios that fail, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk as the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where parameter and precipitation uncertainties are propagated to produce ensemble-based hydrologic forecasts. Bayesian inference via Markov chain Monte Carlo is used to account for the parameter uncertainty. Two reservoir operation schemes, the one actually operated and a scenario-optimization scheme, are evaluated in the flood risk and hydropower profit analysis. For the 2010 flood, it is found that improving hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and most of the risk arises within the forecast lead-time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts, while keeping their bias low, for reservoir operational purposes.
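The lead-time risk defined above, the ratio of scenarios exceeding the critical value to the total number of scenarios, reduces to a simple counting computation. The water levels and critical level below are hypothetical illustrations, not TGR values.

```python
# Risk within the forecast lead-time, estimated by counting ensemble members
# whose simulated water-level trajectory exceeds the critical value.

def lead_time_risk(ensemble_levels, critical_level):
    """Fraction of forecast scenarios whose peak level exceeds the critical level."""
    exceed = sum(1 for levels in ensemble_levels if max(levels) > critical_level)
    return exceed / len(ensemble_levels)

# Three hypothetical forecast scenarios of hourly water levels (m).
scenarios = [[170.2, 171.0, 170.6],
             [170.1, 172.4, 171.8],
             [169.9, 170.3, 170.0]]
print(lead_time_risk(scenarios, critical_level=172.0))
```

Only the second scenario peaks above 172.0 m, so the estimated lead-time risk is 1/3.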
NASA Astrophysics Data System (ADS)
DELİCE, Yavuz
2015-04-01
Highways, both within cities and between them, are generally exposed to many kinds of natural-disaster risk. Natural hazards and disasters that may occur from the project design stage through construction and operation, and later during maintenance and repair, have to be taken into consideration, and assessing the risks posed by such adverse situations is very important in terms of project design, construction, operation, and maintenance and repair costs. Hazard and natural-disaster risk analysis depends largely on defining the likelihood of the probable hazards on the highways; in addition, the assets at risk and the impacts of the events must be examined and rated in their own right. These activities support improvements against natural hazards and disasters through the Failure Mode and Effects Analysis (FMEA) method, whose effects will be analyzed in further work. FMEA is a useful method for identifying failure modes and their effects, prioritizing them by failure-rate effects, and finding the most economical and effective solution. Beyond guiding measures for the identified risks, this analysis method can also inform public institutions about the nature of these risks when required, so that the necessary measures can be taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in the risk assessments. The most important of these can be listed as follows: • Natural disasters: 1. meteorologically based (floods, severe storms, tropical storms, winter storms, avalanches, etc.); 2. geologically based (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.). • Human-originated disasters: 1. transport accidents (traffic accidents) and risks originating from road-surface defects (icing, signaling malfunctions), fire, explosion, etc. In this study, a risk analysis of urban and intercity motorways against natural disasters and hazards was performed with the FMEA method, and solutions were proposed for these risks. Keywords: Failure Mode and Effects Analysis (FMEA), Pareto Analysis (PA), Highways, Risk Management.
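FMEA prioritization as described above is conventionally computed as a risk priority number, RPN = severity × occurrence × detection, each rated on a 1-10 scale. The highway failure modes and ratings below are hypothetical illustrations, not values from the study.

```python
# Classic FMEA risk priority number: RPN = severity * occurrence * detection.
# Higher RPN means the failure mode deserves earlier attention.

def rpn(severity, occurrence, detection):
    return severity * occurrence * detection

# Hypothetical highway hazards with (severity, occurrence, detection) ratings.
failure_modes = {
    "flood over roadway": (8, 5, 4),
    "icing on surface": (6, 7, 3),
    "landslide onto lanes": (9, 3, 5),
}

# Rank failure modes from highest to lowest RPN.
ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
for name, scores in ranked:
    print(name, rpn(*scores))
```

A Pareto analysis, as named in the keywords, would then focus mitigation on the few modes at the top of this ranking.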
Managing Space Radiation Risks on Lunar and Mars Missions: Risk Assessment and Mitigation
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; George, K.; Hu, X.; Kim, M. H.; Nikjoo, H.
2006-01-01
Radiation-induced health risks are a primary concern for human exploration outside the Earth's magnetosphere, and require improved approaches to risk estimation and tools for mitigation, including shielding and biological countermeasures. Solar proton events are the major concern for short-term lunar missions (<60 d); for long-term missions (>60 d), such as Mars exploration, exposures to the high energy and charge (HZE) ions that make up the galactic cosmic rays are the major concern. Health risks from radiation exposure include chronic risks such as carcinogenesis, degenerative tissue risks, and central nervous system effects, and acute risks such as radiation sickness or early lethality. The current estimate is that a more than four-fold uncertainty exists in the projection of lifetime mortality risk from cosmic rays, which severely limits analysis of the possible benefits of shielding or biological countermeasure designs. Uncertainties in risk projections are largely due to insufficient knowledge of HZE ion radiobiology, which has led NASA to develop a unique probabilistic approach to radiation protection. We review NASA's approach to radiation risk assessment, including its impact on astronaut dose limits and application of the ALARA (As Low As Reasonably Achievable) principle. The recently opened NASA Space Radiation Laboratory (NSRL) provides the capability to simulate cosmic rays in controlled ground-based experiments with biological and shielding models. We discuss how research at NSRL will lead to reductions in the uncertainties in risk projection models. In developing mission designs, the reduction of health risks and mission constraints, including costs, are competing concerns that need to be addressed through optimization procedures. 
Mitigating the risks from space radiation is a multi-factorial problem involving individual factors (age, gender, genetic makeup, and exposure history), operational factors (planetary destination, mission length, and period in the solar cycle), and shielding characteristics (materials, mass, and topology). We review optimization metrics for radiation protection including scenarios that integrate biophysics models of radiation risks, operational variables, and shielding design tools needed to assess exploration mission designs. We discuss the application of a crosscutting metric, based on probabilistic risk assessment, to lunar and Mars mission trade studies including the assessment of multi-factorial problems and the potential benefits of new radiation health research strategies or mitigation technologies.
Hussen, Sophia A; Gilliard, Danielle; Caldwell, Cleopatra H; Andes, Karen; Chakraborty, Rana; Malebranche, David J
2014-08-01
Young black men who have sex with men (YBMSM) are experiencing high and rising rates of HIV infection, more than any other age-risk group category in the USA. Contributors to HIV risk in this group remain incompletely elucidated. We conducted exploratory qualitative interviews with 20 HIV-positive YBMSM aged 17-24 and found that father-son relationships were perceived to be important sociocontextual influences in participants' lives. Participants discussed the degree of their fathers' involvement in their lives, emotional qualities of the father-son relationship, communication about sex, and masculine socialization. Participants also described pathways linking father-son relationships to HIV risk, which were mediated by psychological and situational risk scenarios. Our thematic analysis suggests that father-son relationships are important to the psychosocial development of YBMSM, with the potential to either exacerbate or attenuate sexual risk for HIV. Interventions designed to strengthen father-son relationships may provide a promising direction for future health promotion efforts in this population.
Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine
2014-03-01
Safety standards development for maintenance facilities of liquid- and compressed-gas-fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that require quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from the HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, J. G.; Morton, R. L.; Castillo, C.
2011-02-01
A multi-level (facility and programmatic) risk assessment was conducted for the facilities in the Nevada National Security Site (NNSS) Readiness in Technical Base and Facilities (RTBF) Program, and the results were included in a new Risk Management Plan (RMP), which was incorporated into the fiscal year (FY) 2010 Integrated Plans. Risks, risk events, probabilities, consequences, and mitigation strategies were identified and captured for most scope areas (i.e., risk categories) during the facilitated risk workshops. Risk mitigations (i.e., efforts in addition to existing controls) were identified during the facilitated risk workshops when the risk event was identified. Risk mitigation strategies fell into two broad categories: threats or opportunities. Improvement projects were identified and linked to the specific risks they mitigate, making the connection of risk reduction through investments for the annual Site Execution Plan. Due to the amount of data collected, analysis to be performed, and reports to be generated, a Risk Assessment/Management Tool (RAMtool) database was developed to analyze the risks in real time, at multiple levels, which reinforced the site-level risk management process and procedures. The RAMtool database was designed to assist in capturing and analyzing the key elements of risk: probability, consequence, and impact. The RAMtool calculates the facility-level and programmatic-level risk factors to enable a side-by-side comparison showing where the facility manager and program manager should focus their risk reduction efforts and funding. This enables them to make solid decisions on priorities and funding to maximize risk reduction. A more active risk management process was developed in which risks and opportunities are actively managed, monitored, and controlled by each facility more aggressively and frequently. Risk owners have the responsibility and accountability to manage their assigned risks in real time, using the RAMtool database.
Subscale and Full-Scale Testing of Buckling-Critical Launch Vehicle Shell Structures
NASA Technical Reports Server (NTRS)
Hilburger, Mark W.; Haynie, Waddy T.; Lovejoy, Andrew E.; Roberts, Michael G.; Norris, Jeffery P.; Waters, W. Allen; Herring, Helen M.
2012-01-01
New analysis-based shell buckling design factors (aka knockdown factors), along with associated design and analysis technologies, are being developed by NASA for the design of launch vehicle structures. Preliminary design studies indicate that implementation of these new knockdown factors can enable significant reductions in mass and mass growth in these vehicles, and can help mitigate some of NASA's launch vehicle development and performance risks by reducing the reliance on testing, providing high-fidelity estimates of structural performance, reliability, and robustness, and enabling increased payload capability. However, in order to validate any new analysis-based design data or methods, a series of carefully designed and executed structural tests are required at both the subscale and full-scale level. This paper describes recent buckling test efforts at NASA on two different orthogrid-stiffened metallic cylindrical shell test articles. One test article was an 8-ft-diameter orthogrid-stiffened cylinder subjected to an axial compression load. The second was a 27.5-ft-diameter Space Shuttle External Tank-derived cylinder subjected to combined internal pressure and axial compression.
Risk analysis for the flood control capacity of dikes under climate change
NASA Astrophysics Data System (ADS)
Wei, Hsiao Ping; Yeh, Keh-Chia; Hsiao, Yi-Hua
2017-04-01
Climate change is a major driver of many extreme disaster events. In recent years, scientists have reported many findings, and most agree that the frequency of extreme weather and its corresponding hydrological impact will increase due to climate change. In this situation, current hydrologic designs based upon historical observations, which may no longer be representative, need to be reviewed under climate change scenarios. For this reason, this study uses the Kao-Ping River Basin as an example, using high-resolution dynamical downscaling data (base period, near future, and end of the century) to simulate changes in the hourly flow rates of typhoon events in each of the three 25-year periods. The results are further compared with the design flow rate announced by the competent water resources authority, as well as with the recorded river water levels of the most severe typhoon event in history, and a risk analysis based on these factors is used to evaluate the risk and impact of river flooding under climate change. From the simulation results, the frequency of exceeding the design discharge in the Kao-Ping river catchment will increase by the end of the century. The water levels at the LI-LIN BRIDGE and SAN-TI-MEN gauges could be strongly influenced by extreme rainfall events, so their flood control capacity should be assessed and improved.
Carroll, Robert; Ramagopalan, Sreeram V.; Cid-Ruzafa, Javier; Lambrelli, Dimitra; McDonald, Laura
2017-01-01
Background: The objective of this study was to investigate the study design characteristics of Post-Authorisation Studies (PAS) requested by the European Medicines Agency which were recorded on the European Union (EU) PAS Register held by the European Network of Centres for Pharmacoepidemiology and Pharmacovigilance (ENCePP). Methods: We undertook a cross-sectional descriptive analysis of all studies registered on the EU PAS Register as of 18 October 2016. Results: We identified a total of 314 studies on the EU PAS Register, including 81 (26%) finalised, 160 (51%) ongoing, and 73 (23%) planned. Of the studies identified, 205 (65%) included risk assessment in their scope, 133 (42%) included drug utilisation, and 94 (30%) included effectiveness evaluation. Just over half of the studies (175; 56%) used primary data capture, 135 (43%) used secondary data, and 4 (1%) used a hybrid design combining both approaches. Risk assessment and effectiveness studies were more likely to use primary data capture (60% and 85%, respectively, compared with 39% and 14%, respectively, for secondary data). The converse was true for drug utilisation studies, where 59% used secondary data vs. 39% primary. For type 2 diabetes mellitus, database studies were more commonly used (80% vs. 3% chart review, 3% hybrid, and 13% primary data capture study designs), whereas for studies in oncology, primary data capture was more likely to be used (85% vs. 4% chart review and 11% database study designs). Conclusions: The results of this analysis show that PAS design varies according to study objectives and therapeutic area.
Iurian, Sonia; Turdean, Luana; Tomuta, Ioan
2017-01-01
This study focuses on the development of a drug product based on a risk assessment-based approach, within the quality by design paradigm. A prolonged release system was proposed for paliperidone (Pal) delivery, containing Kollidon® SR as an insoluble matrix agent and hydroxypropyl cellulose, hydroxypropyl methylcellulose (HPMC), or sodium carboxymethyl cellulose as a hydrophilic polymer. The experimental part was preceded by the identification of potential sources of variability through Ishikawa diagrams, and failure mode and effects analysis was used to deliver the critical process parameters that were further optimized by design of experiments. A D-optimal design was used to investigate the effects of the Kollidon SR ratio (X1), the type of hydrophilic polymer (X2), and the percentage of hydrophilic polymer (X3) on the percentages of dissolved Pal over 24 h (Y1-Y9). Effects expressed as regression coefficients and response surfaces were generated, along with a design space for the preparation of a target formulation in an experimental area with low error risk. The optimal formulation contained 27.62% Kollidon SR and 8.73% HPMC and achieved the prolonged release of Pal, with a low burst effect, at ratios very close to those predicted by the model. Thus, the parameters with the highest impact on final product quality were studied, and safe ranges were established for their variation. Finally, a risk mitigation and control strategy was proposed to assure the quality of the system through constant process monitoring.
NASA Technical Reports Server (NTRS)
1976-01-01
After the 1973 Staten Island disaster, in which 40 people were killed while repairing a liquefied natural gas (LNG) storage tank, the New York Fire Commissioner requested NASA's help in drawing up a comprehensive plan to cover the design, construction, and operation of LNG facilities. Two programs are underway. The first transfers comprehensive risk management techniques and procedures in the form of an instruction document that includes determining LNG risks through engineering analysis and tests, controlling these risks by setting up redundant fail-safe techniques, and establishing criteria calling for decisions that eliminate or accept certain risks. The second program prepares an LNG safety manual (the first of its kind).
NASA Astrophysics Data System (ADS)
Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong
2014-06-01
Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating the reliability of systems. Although the single-failure-mode case can be handled efficiently by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be evaluated effectively. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis; in fact, owing to a lack of information, epistemic uncertainty is common in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority number (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probability of each MCS. Compared to the case where dependency among multiple failure modes is ignored, the Copula modeling approach eliminates the error in the reliability analysis. Furthermore, for quantitative analysis, importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority; this generalizes the definitions of probability weight and FRPN, resulting in a more accurate estimate than that of the traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method.
The results provide important insights into fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.
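The core FWGM step for a single failure mode can be sketched with triangular fuzzy ratings. This is a minimal illustration under stated assumptions, not the paper's implementation: the severity/occurrence/detection ratings and importance weights below are invented, and centroid defuzzification is one common choice among several.

```python
# Minimal sketch of a fuzzy weighted geometric mean (FWGM) RPN for one
# failure mode, assuming triangular fuzzy ratings. The ratings, weights,
# and centroid defuzzification are illustrative choices, not the paper's
# actual data or method details.

def fwgm_rpn(ratings, weights):
    """ratings: (low, mid, high) triangular fuzzy ratings for severity,
    occurrence, and detection; weights: importance weights summing to 1.
    Returns the triangular fuzzy RPN and a crisp centroid value."""
    low = mid = high = 1.0
    for (a, b, c), w in zip(ratings, weights):
        low *= a ** w    # weighted geometric mean, applied component-wise
        mid *= b ** w
        high *= c ** w
    crisp = (low + mid + high) / 3.0   # centroid defuzzification
    return (low, mid, high), crisp

# Hypothetical failure mode: severity (6,7,8), occurrence (3,4,5),
# detection (2,3,4), with importance weights 0.4 / 0.35 / 0.25.
fuzzy, crisp = fwgm_rpn([(6, 7, 8), (3, 4, 5), (2, 3, 4)], [0.4, 0.35, 0.25])
```

Because the geometric mean is monotone in each component, the fuzzy RPN stays a valid triangle (low ≤ mid ≤ high), and the crisp centroid falls inside it.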
NASA Astrophysics Data System (ADS)
Augustine, Kurt E.; Camp, Jon J.; Holmes, David R.; Huddleston, Paul M.; Lu, Lichun; Yaszemski, Michael J.; Robb, Richard A.
2012-03-01
Failure of the spine's structural integrity from metastatic disease can lead to both pain and neurologic deficit. Fractures that require treatment occur in over 30% of bony metastases. Our objective is to use computed tomography (CT) in conjunction with previously developed analytic techniques to predict fracture risk in cancer patients with metastatic disease of the spine. Current clinical practice for cancer patients with spine metastasis often requires an empirical decision regarding spinal reconstructive surgery. Early image-based software systems used for CT analysis are time consuming and poorly suited to clinical application. The Biomedical Image Resource (BIR) at Mayo Clinic, Rochester has developed an image analysis computer program that calculates, from CT scans, the residual load-bearing capacity of a vertebra with metastatic cancer. The Spine Cancer Assessment (SCA) program is built on a platform designed for clinical practice, with a workflow format that allows rapid selection of patient CT exams, followed by guided image analysis tasks, resulting in a fracture risk report. The analysis features allow the surgeon to quickly isolate a single vertebra and obtain an immediate pre-surgical multiple-parallel-section composite beam fracture risk analysis based on algorithms developed at Mayo Clinic. The analysis software is undergoing clinical validation studies. We expect this approach will facilitate patient management and the adoption of reliable guidelines for selecting among various treatment options based on fracture risk.
Suicide Risk in the Hospitalized Elderly in Turkey and Affecting Factors.
Avci, Dilek; Selcuk, Kevser Tari; Dogan, Selma
2017-02-01
This study aimed to investigate the suicide risk among elderly people hospitalized and treated for physical illnesses, and the factors affecting that risk. The study has a cross-sectional design. It was conducted with 459 elderly people hospitalized and treated in a public hospital between May 25, 2015 and December 4, 2015. Data were collected with the Personal Information Form, the Suicide Probability Scale, and the Hospital Anxiety and Depression Scale. For the analysis, descriptive statistics, the chi-square test, Fisher's exact test, and logistic regression analysis were used. In the study, 24.0% of the elderly were at high risk for suicide. Suicide risk was higher among those in the 60-74 age group, living alone, drinking alcohol, perceiving their religious beliefs as weak, being treated for cancer, diagnosed 11 or more years earlier, having a history of admission to a psychiatry clinic, and at risk for anxiety and depression. Overall, approximately one out of every four elderly people was at high risk for suicide. Therefore, older people should be assessed for suicide risk, and programs aimed at preventing suicide among the elderly should be organized. Copyright © 2016 Elsevier Inc. All rights reserved.
Study of a risk-based piping inspection guideline system.
Tien, Shiaw-Wen; Hwang, Wen-Tsung; Tsai, Chih-Hung
2007-02-01
A risk-based inspection system and a piping inspection guideline model were developed in this study. The research procedure consists of two parts: building a risk-based inspection model for piping, and constructing a risk-based piping inspection guideline model. Field visits at the plant were conducted to develop the risk-based inspection and strategic analysis system. A knowledge-based model was built in accordance with international standards and local government regulations, and the Rational Unified Process was applied to reduce discrepancies in the development of the models. The models were designed to analyze damage factors, damage models, and potential damage positions of piping in petrochemical plants. The purpose of this study was to provide inspection-related personnel with optimal planning tools for piping inspections, thereby enabling effective prediction of potential piping risks and enhancing the safety of plant operations in the petrochemical industry. A risk analysis was conducted on the piping system of a petrochemical plant. The outcome indicated that most of the risk resulted from a small number of pipelines.
Maternal Smoking and Autism Spectrum Disorder: A Meta-Analysis
ERIC Educational Resources Information Center
Rosen, Brittany N.; Lee, Brian K.; Lee, Nora L.; Yang, Yunwen; Burstyn, Igor
2015-01-01
We conducted a meta-analysis of 15 studies on maternal prenatal smoking and ASD risk in offspring. Using a random-effects model, we found no evidence of an association (summary OR 1.02, 95% CI 0.93-1.12). Stratifying by study design, birth year, type of healthcare system, and adjustment for socioeconomic status or psychiatric history did not alter…
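The random-effects pooling behind a summary OR like the one above can be sketched with the DerSimonian-Laird estimator. The odds ratios and confidence intervals below are invented placeholders for illustration, not the 15 studies' data.

```python
import math

# Minimal sketch of DerSimonian-Laird random-effects pooling of log odds
# ratios. The ORs and 95% CIs are made-up illustrations, not study data.

def dersimonian_laird(ors, ci_los, ci_his):
    """Pool odds ratios (with 95% CIs) under a random-effects model.
    Returns the summary OR and its 95% CI."""
    y = [math.log(o) for o in ors]
    # SE on the log scale from the 95% CI width: (log hi - log lo) / (2 * 1.96)
    se = [(math.log(h) - math.log(l)) / (2 * 1.96)
          for l, h in zip(ci_los, ci_his)]
    w = [1 / s**2 for s in se]                       # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-study variance
    wr = [1 / (s**2 + tau2) for s in se]             # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    se_mu = math.sqrt(1 / sum(wr))
    return (math.exp(mu),
            math.exp(mu - 1.96 * se_mu),
            math.exp(mu + 1.96 * se_mu))

or_hat, lo, hi = dersimonian_laird([0.95, 1.10, 1.05],
                                   [0.80, 0.90, 0.85],
                                   [1.15, 1.35, 1.30])
```

A summary OR whose CI straddles 1.0, as in the abstract, indicates no evidence of an association.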
Adolescent sexual victimization: a prospective study on risk factors for first time sexual assault.
Bramsen, Rikke Holm; Lasgaard, Mathias; Koss, Mary P; Elklit, Ask; Banner, Jytte
2012-09-01
The present study set out to investigate predictors of first-time adolescent peer-on-peer sexual victimization (APSV) among 238 female Grade 9 students from 30 schools in Denmark. A prospective research design was utilized to examine the relationship between five potential predictors, measured at baseline, and first-time APSV during a 6-month period. Data were analyzed using binary logistic regression. Number of sexual partners and displaying sexual risk behaviors significantly predicted subsequent first-time peer-on-peer sexual victimization, whereas a history of child sexual abuse, early sexual onset, and failing to signal sexual boundaries did not. The present study identifies specific risk factors for first-time sexual victimization that are potentially changeable. Thus, the results may inform prevention initiatives targeting initial experiences of APSV.
NASA Astrophysics Data System (ADS)
Lee, Junyung; Yi, Kyongsu; Yoo, Hyunjae; Chong, Hyokjin; Ko, Bongchul
2015-06-01
This paper describes a risk management algorithm for rear-side collision avoidance. The proposed risk management algorithm consists of a supervisor and a coordinator. The supervisor is designed to monitor collision risks between the subject vehicle and an approaching vehicle in the adjacent lane. An appropriate criterion of intervention, which achieves high driver acceptance by taking realistic traffic into account, has been determined based on an analysis of the kinematics of the vehicles in the longitudinal and lateral directions. In order to assist the driver actively and increase the driver's safety, a coordinator is designed to combine lateral control, using a steering torque overlay by motor-driven power steering, with differential braking by vehicle stability control. In order to prevent the collision while limiting the actuators' control inputs and the vehicle dynamics to values that assure the driver's comfort, Lyapunov theory and linear matrix inequality based optimisation methods have been used. The proposed risk management algorithm has been evaluated via simulation using CarSim and MATLAB/Simulink.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clayton, Daniel James; Lipinski, Ronald J.; Bechtel, Ryan D.
As compact, lightweight power sources with reliable, long lives, Radioisotope Power Systems (RPSs) have made space missions to explore the solar system possible. Because hazardous material can be released during a launch accident, the potential health risk of an accident must be quantified so that appropriate launch approval decisions can be made. One part of the risk estimation involves modeling the response of the RPS to potential accident environments. Because the full RPS response depends on dynamic variables that are impractical to model deterministically, the evaluation is performed stochastically with a Monte Carlo simulation. The potential consequences can be determined by modeling the transport of the hazardous material in the environment and through human biological pathways. The consequence analysis results are summed and weighted by appropriate likelihood values to give a collection of probabilistic results for estimating the potential health risk. This information is used to guide RPS designs, spacecraft designs, mission architecture, and launch procedures to potentially reduce the risk, as well as to inform decision makers of the potential health risks resulting from the use of RPSs for space missions.
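The likelihood-weighted Monte Carlo roll-up described above can be sketched as follows. The scenario names, probabilities, and consequence model are invented placeholders for illustration, not actual launch-safety data.

```python
import random

# Minimal sketch of a stochastic risk roll-up: sample a consequence for each
# accident environment, then weight the mean consequence by the scenario's
# likelihood. All scenarios, probabilities, and severities are hypothetical.

random.seed(7)

scenarios = {                      # likelihood of each accident environment
    "blast_overpressure": 0.02,
    "fragment_impact": 0.01,
    "propellant_fire": 0.005,
}

def sample_consequence(env):
    """Toy consequence model: released dose depends on environment severity."""
    severity = {"blast_overpressure": 1.0,
                "fragment_impact": 2.5,
                "propellant_fire": 4.0}[env]
    return max(0.0, random.gauss(severity, 0.5))   # notional dose units

def expected_risk(n_trials=10_000):
    """Likelihood-weighted mean consequence over Monte Carlo trials."""
    total = 0.0
    for env, p in scenarios.items():
        mean_c = sum(sample_consequence(env) for _ in range(n_trials)) / n_trials
        total += p * mean_c        # weight consequence by scenario likelihood
    return total

risk = expected_risk()
```

In a real assessment each scenario's consequence would itself come from transport and biological-pathway models; the structure of the sum, however, is the same.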
Extreme Storm Surges in the North Sea
NASA Astrophysics Data System (ADS)
Goennert, G.; Buß, Th.; Mueller, O.; Thumm, S.
2009-04-01
Extreme Storm Surges in the North Sea. Gabriele Gönnert, Olaf Müller, Thomas Buß and Sigrid Thumm. Climate change will cause a rise in sea level and probably more frequent and more violent storm surges. This has serious consequences for the safety of people as well as for their values and assets behind the dikes. It is therefore essential first to assess how sea level rise and an extreme storm surge event could develop; in a second step it is possible to determine the risk for specific locations and develop strategies. The project XtremRisk (Extreme Storm Surges at the North Sea Coast and in Estuaries: Risk Calculation and Risk Strategies), funded by the German Federal Government, will help answer these questions. The "Source-Pathway-Receptor" concept is used as the basis for risk analysis and the development of new strategies. The project offers methods to assess the development of extreme events under present-day conditions; under conditions reflecting climate change, an attempt is made to design an extreme event. For this, three main points are considered: (a) analysis and calculation of each factor that produces a storm surge and its maximum level occurring in the last 100 years, namely the maximum surge level (surge due to the wind), the influence of the tide and the interaction between surge and tide, and the influence of external surges; (b) the hydrodynamics of a storm surge cause nonlinear effects in the interaction of the named factors, and both the factors and these effects are taken into account to calculate the magnitude of the extreme storm surge (this step is very complex and needs additional examination by numerical models); (c) analysis of different scenarios for mean sea level rise and for the increase of wind speed due to climate change. The presentation will introduce the methods and show first results of the analysis of extreme events and of mean sea level rise.
Riccardo, Flavia; Dente, Maria Grazia; Kärki, Tommi; Fabiani, Massimo; Napoli, Christian; Chiarenza, Antonio; Giorgi Rossi, Paolo; Velasco Munoz, Cesar; Noori, Teymur; Declich, Silvia
2015-01-01
There are limitations in our capacity to interpret point estimates and trends of infectious diseases occurring among the diverse migrant populations living in the European Union/European Economic Area (EU/EEA). The aim of this study was to design a data collection framework that could capture information on factors associated with increased risk of infectious diseases in migrant populations in the EU/EEA. The authors defined factors associated with increased risk according to a multi-dimensional framework and performed a systematic literature review to identify whether those factors adequately reflected the reported risk factors for infectious disease in these populations. Following this, the feasibility of applying the framework to relevant available EU/EEA data sources was assessed. The proposed multi-dimensional framework is well suited to capture the complexity and concurrence of these risk factors and is in principle applicable in the EU/EEA. The authors conclude that adopting a multi-dimensional framework to monitor infectious diseases could favor the disaggregated collection and analysis of migrant health data. PMID:26393623
Valdor, Paloma F; Gómez, Aina G; Puente, Araceli
2015-01-15
Diffuse pollution from oil spills is a widespread problem in port areas (as a result of fuel supply, navigation and loading/unloading activities). This article presents a method to assess the environmental risk of oil handling facilities in port areas. The method is based on (i) identification of environmental hazards, (ii) characterization of meteorological and oceanographic conditions, (iii) characterization of environmental risk scenarios, and (iv) assessment of environmental risk. The procedure has been tested by application to the Tarragona harbor. The results show that the method is capable of representing (i) specific local pollution cases (i.e., discriminating between products and quantities released by a discharge source), (ii) oceanographic and meteorological conditions (selecting a representative subset data), and (iii) potentially affected areas in probabilistic terms. Accordingly, it can inform the design of monitoring plans to study and control the environmental impact of these facilities, as well as the design of contingency plans. Copyright © 2014 Elsevier Ltd. All rights reserved.
An Extreme-Value Approach to Anomaly Vulnerability Identification
NASA Technical Reports Server (NTRS)
Everett, Chris; Maggio, Gaspare; Groen, Frank
2010-01-01
The objective of this paper is to present a method for importance analysis in parametric probabilistic modeling where the result of interest is the identification of potential engineering vulnerabilities associated with postulated anomalies in system behavior. In the context of Accident Precursor Analysis (APA), under which this method has been developed, these vulnerabilities, designated as anomaly vulnerabilities, are conditions that produce high risk in the presence of anomalous system behavior. The method defines a parameter-specific Parameter Vulnerability Importance measure (PVI), which identifies anomaly risk-model parameter values that indicate the potential presence of anomaly vulnerabilities, and allows them to be prioritized for further investigation. This entails analyzing each uncertain risk-model parameter over its credible range of values to determine where it produces the maximum risk. A parameter that produces high system risk for a particular range of values suggests that the system is vulnerable to the modeled anomalous conditions, if indeed the true parameter value lies in that range. Thus, PVI analysis provides a means of identifying and prioritizing anomaly-related engineering issues that at the very least warrant improved understanding to reduce uncertainty, such that true vulnerabilities may be identified and proper corrective actions taken.
Lamm, Steven H; Ferdosi, Hamid; Dissen, Elisabeth K; Li, Ji; Ahn, Jaeil
2015-12-07
High levels (> 200 µg/L) of inorganic arsenic in drinking water are known to be a cause of human lung cancer, but the evidence at lower levels is uncertain. We sought the epidemiological studies that have examined the dose-response relationship between arsenic levels in drinking water and the risk of lung cancer over a range that includes both high and low levels of arsenic. Regression analysis, based on six studies identified from an electronic search, examined the relationship between the log of the relative risk and the log of the arsenic exposure over a range of 1-1000 µg/L. The best-fitting continuous meta-regression model was found to be a no-constant linear-quadratic analysis in which both the risk and the exposure had been logarithmically transformed. This yielded a statistically significant positive coefficient for the quadratic term and a statistically significant negative coefficient for the linear term. Sub-analyses by study design yielded similar results for ecological and non-ecological studies. Statistically significant X-intercepts consistently indicated no increased risk below approximately 100-150 µg/L arsenic.
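The no-constant linear-quadratic fit on log-transformed data can be sketched with ordinary least squares via the 2x2 normal equations. The (exposure, relative risk) pairs below are invented to mimic a J-shaped dose-response; they are not the six studies' data, so the fitted X-intercept is illustrative only.

```python
import math

# Minimal sketch of a no-constant linear-quadratic meta-regression:
# log(RR) = b1*log(x) + b2*log(x)^2, fit by ordinary least squares.
# The (exposure in ug/L, relative risk) pairs are hypothetical.

data = [(10, 0.90), (50, 0.85), (100, 1.00), (300, 1.60), (700, 2.80)]

u = [math.log(x) for x, _ in data]        # log exposure
y = [math.log(rr) for _, rr in data]      # log relative risk

# Normal equations for y ~ b1*u + b2*u^2 (no intercept term)
s11 = sum(ui**2 for ui in u)
s12 = sum(ui**3 for ui in u)
s22 = sum(ui**4 for ui in u)
t1 = sum(ui * yi for ui, yi in zip(u, y))
t2 = sum(ui**2 * yi for ui, yi in zip(u, y))
det = s11 * s22 - s12**2
b1 = (t1 * s22 - t2 * s12) / det          # linear coefficient
b2 = (s11 * t2 - s12 * t1) / det          # quadratic coefficient

# Nonzero X-intercept: log(RR) = 0 where b1 + b2*log(x) = 0
x_intercept = math.exp(-b1 / b2)
```

With a negative linear and positive quadratic coefficient, as the abstract reports, the fitted curve dips below RR = 1 at low exposure and crosses it at the X-intercept.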
Climate Risk Assessment: Technical Guidance Manual for DoD Installations and Built Environment
2016-09-06
climate change risks to DoD installations and the built environment. The approach, which we call "decision-scaling," reveals the core sensitivity of… DoD installations to climate change. It is designed to illuminate the sensitivity of installations and their supporting infrastructure systems… including water and energy, to climate changes and other uncertainties without dependence on climate change projections. In this way the analysis and
Lee, Heeyoung; Song, Seungyeon; Oh, Yun-Kyoung; Kang, WonKu; Kim, Eunyoung
2017-04-01
To evaluate the role of gender as a risk factor for developing contrast media-associated adverse drug reactions (CM-ADRs), we compared the incidence of CM-ADR between male and female patients according to study design, ADR type, and computed tomography (CT) examination. We systematically searched three electronic databases for eligible studies. In the included studies (n=18), we assessed effect estimates of the relative incidence of CM-ADR, analysed by experimental design, ADR type, and CT examination; these were calculated using a random effects model where clinical conditions showed heterogeneity, and a fixed effects model otherwise. We identified 10,776 patients administered CM. Studies were classified by design into randomised controlled trials (RCTs) and observational studies. Results were as follows: risk ratio (RR)=1.07 (95% confidence interval (CI): 0.79-1.46, P=0.66) for RCTs, and RR=0.77 (95% CI: 0.58-1.04, P=0.09) for observational studies. Analyses by ADR type and CT examination likewise demonstrated that the incidence of CM-ADR did not differ between males and females. We found no significant difference in the incidence of CM-ADRs between male and female patients according to study design, ADR type, or CT examination. Future studies are needed to determine why gender plays different roles as a risk factor for CM-ADRs and non-CM ADRs. Copyright © 2017 Elsevier B.V. All rights reserved.
Brody, Gene H.; Yu, Tianyi; Chen, Yi-fu; Kogan, Steven M.; Evans, Gary W.; Beach, Steven R. H.; Windle, Michael; Simons, Ronald L.; Gerrard, Meg; Gibbons, Frederick X.; Philibert, Robert A.
2012-01-01
The health disparities literature identified a common pattern among middle-aged African Americans that includes high rates of chronic disease along with low rates of psychiatric disorders despite exposure to high levels of cumulative SES risk. The current study was designed to test hypotheses about the developmental precursors to this pattern. Hypotheses were tested with a representative sample of 443 African American youths living in the rural South. Cumulative SES risk and protective processes were assessed at 11-13 years; psychological adjustment was assessed at ages 14-18 years; genotyping at the 5-HTTLPR was conducted at age 16 years; and allostatic load (AL) was assessed at age 19 years. A Latent Profile Analysis identified 5 profiles that evinced distinct patterns of SES risk, AL, and psychological adjustment, with 2 relatively large profiles designated as focal profiles: a physical health vulnerability profile characterized by high SES risk/high AL/low adjustment problems, and a resilient profile characterized by high SES risk/low AL/low adjustment problems. The physical health vulnerability profile mirrored the pattern found in the adult health disparities literature. Multinomial logistic regression analyses indicated that carrying an s allele at the 5-HTTLPR and receiving less peer support distinguished the physical health vulnerability profile from the resilient profile. Protective parenting and planful self-regulation distinguished both focal profiles from the other 3 profiles. The results suggest the public health importance of preventive interventions that enhance coping and reduce the effects of stress across childhood and adolescence. PMID:22709130
A prioritization and analysis strategy for environmental surveillance results.
Shyr, L J; Herrera, H; Haaker, R
1997-11-01
DOE facilities are required to conduct environmental surveillance to verify that facilities are operated within the approved risk envelope and have not caused undue risk to the public and the environment. Given reduced budgets, a strategy for analyzing environmental surveillance data was developed to set priorities for sampling needs. The radiological and metal data collected at Sandia National Laboratories, New Mexico, were used to demonstrate the analysis strategy. Sampling locations were prioritized for further investigation and for routine sampling needs. The process of data management, analysis, prioritization, and presentation has been automated through a custom-designed computer tool. Data collected over years can be analyzed and summarized in a short table format for prioritization and decision making.
NASA Astrophysics Data System (ADS)
Feng, Nan; Wu, Harris; Li, Minqiang; Wu, Desheng; Chen, Fuzan; Tian, Jin
2016-09-01
Information sharing across organisations is critical to effectively managing the security risks of inter-organisational information systems. Nevertheless, few previous studies on information systems security have focused on inter-organisational information sharing, and none have studied the sharing of inferred beliefs versus factual observations. In this article, a multiagent collaborative model (MACM) is proposed as a practical solution to assess the risk level of each allied organisation's information system and support proactive security treatment by sharing beliefs on event probabilities as well as factual observations. In MACM, for each allied organisation's information system, we design four types of agents: inspection agent, analysis agent, control agent, and communication agent. By sharing soft findings (beliefs) in addition to hard findings (factual observations) among the organisations, each organisation's analysis agent is capable of dynamically predicting its security risk level using a Bayesian network. A real-world implementation illustrates how our model can be used to manage security risks in distributed information systems and that sharing soft findings leads to lower expected loss from security risks.
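The hard-versus-soft-finding distinction can be sketched for a single binary node: a hard finding (an observed event) is conditioned on via Bayes' rule, while a shared soft finding (a peer's belief about an unobserved event) can be folded in with Jeffrey's rule. This is an assumption-laden toy, not the MACM implementation, and all probabilities are placeholders.

```python
# Minimal sketch: updating P(system compromised) with a hard finding
# (our own observed alert) and a soft finding (a peer organisation's
# reported belief). All likelihoods and priors are hypothetical.

def bayes_update(prior, p_obs_given_h1, p_obs_given_h0):
    """Hard finding: condition on an actually observed event."""
    num = prior * p_obs_given_h1
    return num / (num + (1 - prior) * p_obs_given_h0)

def jeffrey_update(post_given_e, post_given_not_e, belief_e):
    """Soft finding: a peer reports belief_e = P(evidence) rather than the
    evidence itself; mix the two conditional posteriors accordingly."""
    return belief_e * post_given_e + (1 - belief_e) * post_given_not_e

p = 0.05                                    # prior P(compromised)
p = bayes_update(p, 0.9, 0.1)               # our IDS raised an alert (hard)
p_if_peer_attacked = bayes_update(p, 0.8, 0.2)
p = jeffrey_update(p_if_peer_attacked, p, 0.7)   # peer is 70% sure (soft)
```

Sharing the peer's 70% belief raises the risk estimate less than a confirmed attack would, which is exactly the point of treating soft and hard findings differently.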
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shah, Anuj; Castleton, Karl J.; Hoopes, Bonnie L.
2004-06-01
The study of the release and effects of chemicals in the environment, and of their associated risks to humans, is central to public and private decision making. FRAMES 1.X (Framework for Risk Analysis in Multimedia Environmental Systems) is a systems modeling software platform, developed by Pacific Northwest National Laboratory (PNNL), that helps scientists study the release and effects of chemicals on a source-to-outcome basis and create environmental models for similar risk assessment and management problems. The unique aspect of FRAMES is its ability to dynamically introduce software modules representing individual components of a risk assessment (e.g., source release of contaminants, fate and transport in various environmental media, exposure, etc.) within a software framework, manipulate their attributes, and run simulations to obtain results. This paper outlines the fundamental constituents of FRAMES 2.X, an enhanced version of FRAMES 1.X that greatly improves the ability of module developers to "plug" their self-developed software modules into the system. The basic design, the underlying principles, and a discussion of the guidelines for module developers are presented.
Predicting urinary incontinence in women in later life: A systematic review.
Troko, Joy; Bach, Fiona; Toozs-Hobson, Philip
2016-12-01
Urinary incontinence (UI) affects 10-40% of the population, and treatment costs in the UK are estimated at £233 million per annum. A systematic review of online medical databases between July 1974 and 2016 was conducted to identify studies that had investigated risk and prediction strategies for UI in later life. Eighteen prospective longitudinal studies fulfilled the search criteria. These were analysed systematically (as per the PRISMA checklist), and the risk of bias arising from study design was minimised where possible during data analysis. One paper proposed a predictive assessment tool called the 'continence index'. It was derived from a secondary analysis of a cohort study, and its predictive threshold had suboptimal sensitivity (79%) and specificity (65%). Seventeen studies identified multiple strong risk factors for UI, but despite a large selection of papers on the topic, no robust risk assessment tool prospectively identified patients at risk of UI in later life; more research in this field is therefore required. Clinicians should be particularly aware of modifiable UI risk factors to help reduce the clinical burden of UI in the long term. Copyright © 2016. Published by Elsevier Ireland Ltd.
NASA Astrophysics Data System (ADS)
Gebert, Niklas; Post, Joachim
2010-05-01
The development of early warning systems is one of the key domains of adaptation to global environmental change and contributes greatly to the development of societal reaction and adaptive capacities for dealing with extreme events. Indonesia in particular is highly exposed to tsunami; on average, a small or medium-sized tsunami occurs in the region every three years, causing damage and death. In the aftermath of the Indian Ocean tsunami of 2004, the German and Indonesian governments agreed on a joint cooperation to develop a people-centered, end-to-end early warning system (GITEWS). The analysis of risk and vulnerability, as an important step in risk (and early warning) governance, is a precondition for the design of effective early warning structures: it delivers the knowledge base for developing institutionalized quick-response mechanisms in the organizations involved in issuing a tsunami warning, and enables exposed populations to react to warnings and to manage evacuation before the first tsunami wave hits. A special challenge for developing countries is therefore the governance of the complex cross-sectoral and cross-scale institutional, social and spatial processes and requirements involved in conceptualizing, implementing and optimizing a people-centered tsunami early warning system. In support of this, the risk and vulnerability assessment of the case study aims at identifying the factors that constitute the causal structure of the (dis)functionality between the technological warning system and the social response system that causes loss of life during an emergency: Which social groups are likely to be less able to receive and respond to an early warning alert? And are people able to evacuate in due time? Only an interdisciplinary research approach can analyze the socio-spatial and environmental conditions of vulnerability and risk and produce valuable results for decision makers and civil society to manage tsunami risk in the early warning context.
This requires the integration of natural/spatial and social science concepts, methods and data. For example, a scenario-based approach to tsunami inundation modeling was developed to give decision makers options for deciding to what level they aim to protect their people and territory; household surveys were conducted for the spatial analysis of the evacuation preparedness of the population as a function of place-specific hazard, risk, warning and evacuation perception; remote sensing was applied for the spatial (land-use) analysis of the socio-physical conditions of a city and region for evacuation; and existing social and population statistics were combined with land-use data for the precise spatial mapping of the population exposed to tsunami risk. Only such a comprehensive assessment approach can generate valuable information for risk governance. The results are mapped using GIS and designed according to the specific needs of different end users, such as public authorities involved in the design of warning dissemination strategies, land-use planners (shelter planning, road network configuration), and NGOs mandated to educate the general public on tsunami risk and evacuation behavior. The case study of the city of Padang, Indonesia (one of the pilot areas of GITEWS) clearly shows that only by intersecting social (vulnerability) and natural hazards research can a comprehensive picture of tsunami risk be provided, with which risk governance in the early warning context can be conducted in a comprehensive, systemic and sustainable manner.
Design, Analysis, and Reporting of Crossover Trials for Inclusion in a Meta-Analysis.
Li, Tianjing; Yu, Tsung; Hawkins, Barbara S; Dickersin, Kay
2015-01-01
To evaluate the characteristics of the design, analysis, and reporting of crossover trials for inclusion in a meta-analysis of treatment for primary open-angle glaucoma and to provide empirical evidence to inform the development of tools to assess the validity of the results from crossover trials and reporting guidelines. We searched MEDLINE, EMBASE, and Cochrane's CENTRAL register for randomized crossover trials for a systematic review and network meta-analysis we are conducting. Two individuals independently screened the search results for eligibility and abstracted data from each included report. We identified 83 crossover trials eligible for inclusion. Issues affecting the risk of bias in crossover trials, such as carryover, period effects, and missing data, were often ignored. Some trials failed to accommodate the within-individual differences in the analysis. For a large proportion of the trials, the authors tabulated the results as if they arose from a parallel design. Precision estimates properly accounting for the paired nature of the design were often unavailable from the study reports; consequently, including trial findings in a meta-analysis would require further manipulation and assumptions. The high proportion of poorly reported analyses and results has the potential to affect whether crossover data should or can be included in a meta-analysis. There is a pressing need for reporting guidelines for crossover trials.
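The pairing problem the review describes can be made concrete: a crossover trial's treatment-effect precision must come from within-subject differences, not from treating the two periods as independent arms. A minimal Python sketch with hypothetical measurements (all numbers invented for illustration):

```python
import math
from statistics import stdev

def paired_se(a, b):
    """SE of the mean difference from within-subject differences (crossover-appropriate)."""
    diffs = [y - x for x, y in zip(a, b)]
    return stdev(diffs) / math.sqrt(len(diffs))

def parallel_se(a, b):
    """SE computed as if the two periods were independent arms (wrong for crossovers)."""
    return math.sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))

# Hypothetical within-subject measurements: the two periods are highly
# correlated, so the paired analysis is far more precise than the naive one.
control   = [20.1, 22.4, 19.8, 24.0, 21.5, 23.2]
treatment = [18.9, 21.1, 18.2, 22.6, 20.3, 21.8]
```

For positively correlated within-subject outcomes the paired standard error is much smaller, which is why tabulating crossover results as if they came from a parallel design misstates the precision a meta-analyst needs.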
Robotic Lunar Lander Development Project Status
NASA Technical Reports Server (NTRS)
Hammond, Monica; Bassler, Julie; Morse, Brian
2010-01-01
This slide presentation reviews the status of the development of a robotic lunar lander. The goal of the project is to perform engineering tests and risk reduction activities to support the development of a small lunar lander for lunar surface science. This includes: (1) risk reduction for the flight of the robotic lander (i.e., testing and analysis in various phases of the project); (2) incremental development of the robotic lander design, to demonstrate autonomous, controlled descent and landing on airless bodies, and design of a thruster configuration for one-sixth of Earth's gravity; (3) flight demonstration testing of a cold gas test article; (4) warm gas testing of the robotic lander design; (5) development and testing of landing algorithms; (6) validation of the algorithms through analysis and test; and (7) tests of the flight propulsion system.
Doing our best: optimization and the management of risk.
Ben-Haim, Yakov
2012-08-01
Tools and concepts of optimization are widespread in decision-making, design, and planning. There is a moral imperative to "do our best." Optimization underlies theories in physics and biology, and economic theories often presume that economic agents are optimizers. We argue that in decisions under uncertainty, what should be optimized is robustness rather than performance. We discuss the equity premium puzzle from financial economics, and explain that the puzzle can be resolved by using the strategy of satisficing rather than optimizing. We discuss design of critical technological infrastructure, showing that satisficing of performance requirements--rather than optimizing them--is a preferable design concept. We explore the need for disaster recovery capability and its methodological dilemma. The disparate domains--economics and engineering--illuminate different aspects of the challenge of uncertainty and of the significance of robust-satisficing. © 2012 Society for Risk Analysis.
Platelet aggregation inhibitors, vitamin K antagonists and risk of subarachnoid hemorrhage.
Risselada, R; Straatman, H; van Kooten, F; Dippel, D W J; van der Lugt, A; Niessen, W J; Firouzian, A; Herings, R M C; Sturkenboom, M C J M
2011-03-01
Use of platelet aggregation inhibitors and vitamin K antagonists has been associated with an increased risk of intracranial hemorrhage (ICH). Whether the use of these antithrombotic drugs is associated with an increased risk of subarachnoid hemorrhage (SAH) remains unclear, especially as confounding by indication might play a role. The aim of the present study was to investigate whether use of platelet aggregation inhibitors or vitamin K antagonists increases the risk of SAH. We applied population-based case-control, case-crossover and case-time-control designs to estimate the risk of SAH while addressing both confounding by indication and time-varying exposure within the PHARMO Record Linkage System database. This system includes drug dispensing records from community pharmacies and hospital discharge records of more than 3 million community-dwelling inhabitants of the Netherlands. Patients were considered cases if they were hospitalized for a first SAH (ICD-9-CM code 430) in the period between 1st January 1998 and 31st December 2006. Controls were selected from the source population, matched on age, gender and date of hospitalization. Conditional logistic regression was used to estimate multivariable adjusted odds ratios (ORs) and 95% confidence intervals (CIs) for the risk of SAH during use of platelet aggregation inhibitors or vitamin K antagonists. In the case-crossover and case-time-control designs we selected 11 control periods preceding the index date in successive steps of 1 month in the past. In all, 1004 cases of SAH were identified. In the case-control analysis the adjusted OR for the risk of SAH in current use of platelet aggregation inhibitors was 1.32 (95% CI: 1.02-1.70) and in current use of vitamin K antagonists 1.29 (95% CI: 0.89-1.87) compared with no use.
In the case-crossover analysis the ORs for the risk of SAH in current use of platelet aggregation inhibitors and vitamin K antagonists were 1.04 (95% CI: 0.56-1.94) and 2.46 (95% CI: 1.04-5.82), respectively. In the case-time-control analysis the OR for platelet aggregation inhibitors was 0.50 (95% CI: 0.26-0.98) and for vitamin K antagonists 1.98 (95% CI: 0.82-4.76). The use of platelet aggregation inhibitors was not associated with an increased SAH risk; the modest increase observed in the case-control analysis could be a result of confounding. The use of vitamin K antagonists seemed to be associated with an increased risk of SAH. The increase was most pronounced in the case-crossover analysis and therefore cannot be explained by unmeasured confounding. © 2011 International Society on Thrombosis and Haemostasis.
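For orientation, an unadjusted odds ratio with a Woolf-type 95% confidence interval can be computed directly from a 2x2 table. The counts below are hypothetical; the study itself used matched conditional logistic regression, which this sketch does not reproduce:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted OR with Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts, not data from the study.
or_, lo, hi = odds_ratio_ci(120, 884, 95, 923)
```

When the interval spans 1.0, as here, the unadjusted association is compatible with no effect, which is why the authors' adjusted and self-controlled designs matter.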
Chronic kidney disease in dogs in UK veterinary practices: prevalence, risk factors, and survival.
O'Neill, D G; Elliott, J; Church, D B; McGreevy, P D; Thomson, P C; Brodbelt, D C
2013-01-01
The reported prevalence of chronic kidney disease (CKD) in dogs varies widely (0.05-3.74%). Identified risk factors include advancing age, specific breeds, small body size, and periodontal disease. To estimate the prevalence and identify risk factors associated with CKD diagnosis and survival in dogs. Purebred dogs were hypothesized to have higher CKD risk and poorer survival characteristics than crossbred dogs. A merged clinical database of 107,214 dogs attending 89 UK veterinary practices over a 2-year period (January 2010-December 2011). A longitudinal study design estimated the apparent prevalence (AP) whereas the true prevalence (TP) was estimated using Bayesian analysis. A nested case-control study design evaluated risk factors. Survival analysis used the Kaplan-Meier survival curve method and multivariable Cox proportional hazards regression modeling. The CKD AP was 0.21% (95% CI: 0.19-0.24%) and TP was 0.37% (95% posterior credibility interval 0.02-1.44%). Significant risk factors included increasing age, being insured, and certain breeds (Cocker Spaniel, Cavalier King Charles Spaniel). Cardiac disease was a significant comorbid disorder. Significant clinical signs included halitosis, weight loss, polyuria/polydipsia, urinary incontinence, vomiting, decreased appetite, lethargy, and diarrhea. The median survival time from diagnosis was 226 days (95% CI 112-326 days). International Renal Interest Society stage and blood urea nitrogen concentration at diagnosis were significantly associated with hazard of death due to CKD. Chronic kidney disease compromises dog welfare. Increased awareness of CKD risk factors and association of blood biochemistry results with survival time should facilitate diagnosis and optimize case management to improve animal survival and welfare. Copyright © 2013 by the American College of Veterinary Internal Medicine.
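The Kaplan-Meier method behind survival estimates like the median of 226 days works by multiplying conditional survival probabilities at each event time; the median survival time is the first time the curve drops to 0.5 or below. A minimal sketch with hypothetical (time, event) data, where censored animals leave the risk set without contributing an event:

```python
def kaplan_meier(times, events):
    """Return [(t, S(t))] for (time, event) pairs; event=1 is death, 0 is censored.
    Assumes distinct times for simplicity."""
    at_risk = len(times)
    surv, curve = 1.0, []
    for t, e in sorted(zip(times, events)):
        if e == 1:
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1  # censored subjects also leave the risk set
    return curve

def median_survival(curve):
    """First event time at which the survival curve reaches 0.5 or below."""
    return next((t for t, s in curve if s <= 0.5), None)

# Hypothetical follow-up: deaths at 2, 3, 8, 13; one censoring at 5.
curve = kaplan_meier([2, 3, 5, 8, 13], [1, 1, 0, 1, 1])
```

Note how the censoring at t=5 shrinks the risk set, so the death at t=8 halves the curve (0.6 to 0.3) rather than dropping it by a fifth.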
Grelier, S; Thetio, M; Quentin, V; Achache, V; Sanchez, N; Leroux, V; Durand, E; Pequignot, R
2011-03-01
The National Hospital of Saint Maurice (HNSM) for Physical Medicine and Rehabilitation aims at strengthening its position as a pivotal rehabilitation and physical therapy center. The opening in 2011 of a new unit for the evaluation and treatment of motor disabilities meets this objective. This project includes several parts: clinical, financial, architectural and organizational aspects, applied clinical research, medical equipment, and the information system. This study focuses on the risk assessment of this future technical unit. The study was conducted by a group of professionals working for the hospital. It started with the design of a functional model to better comprehend the system to be analyzed. Risk assessment consists of confronting this functional model with a list of dangers in order to determine the vulnerable areas of the system. The team then designed scenarios to identify causes, safety barriers and consequences in order to rank the risks. The analysis targeted various dangers, e.g. political, strategic, financial, economic, marketing, clinical and operational. The team identified more than 70 risk scenarios. For 75% of them, the initial criticality level was deemed tolerable under control or unacceptable. The implementation of an action plan for reducing the level of risk before opening the technical unit brought the system down to an acceptable level for 66% of the scenarios. A year prior to opening this technical unit for the evaluation and treatment of motor disabilities, conducting this preliminary risk assessment, with its exhaustive and rigorous methodology, enabled the professionals concerned to work together around an action plan for reducing the risks. 2011 Elsevier Masson SAS. All rights reserved.
Evaluating the Impact of Database Heterogeneity on Observational Study Results
Madigan, David; Ryan, Patrick B.; Schuemie, Martijn; Stang, Paul E.; Overhage, J. Marc; Hartzema, Abraham G.; Suchard, Marc A.; DuMouchel, William; Berlin, Jesse A.
2013-01-01
Clinical studies that use observational databases to evaluate the effects of medical products have become commonplace. Such studies begin by selecting a particular database, a decision that published papers invariably report but do not discuss. Studies of the same issue in different databases, however, can and do generate different results, sometimes with strikingly different clinical implications. In this paper, we systematically study heterogeneity among databases, holding other study methods constant, by exploring relative risk estimates for 53 drug-outcome pairs and 2 widely used study designs (cohort studies and self-controlled case series) across 10 observational databases. When holding the study design constant, our analysis shows that estimated relative risks range from a statistically significant decreased risk to a statistically significant increased risk in 11 of 53 (21%) of drug-outcome pairs that use a cohort design and 19 of 53 (36%) of drug-outcome pairs that use a self-controlled case series design. This exceeds the proportion of pairs that were consistent across databases in both direction and statistical significance, which was 9 of 53 (17%) for cohort studies and 5 of 53 (9%) for self-controlled case series. Our findings show that clinical studies that use observational databases can be sensitive to the choice of database. More attention is needed to consider how the choice of data source may be affecting results. PMID:23648805
Polycystic ovary syndrome (PCOS) and the risk of coronary heart disease (CHD): a meta-analysis.
Zhao, Luqian; Zhu, Zhigang; Lou, Huiling; Zhu, Guodong; Huang, Weimin; Zhang, Shaogang; Liu, Feng
2016-06-07
Some studies reported a significant association between polycystic ovary syndrome (PCOS) and risk of cardiovascular disease (CVD). However, the results are controversial. A systematic search was conducted in the PubMed, Science Direct, EMBASE, and Cochrane Library databases. Five case-control studies and 5 cohort studies were selected, involving a total of 104,392 subjects in this meta-analysis. PCOS was significantly associated with increased risk of CVD (OR = 1.30; 95% CI 1.09 - 1.56; P = 0.004). In the subgroup analysis of study design, both case-control studies and prospective cohort studies showed significant results (OR = 1.79; 95% CI 1.16 - 2.77; P = 0.009; OR = 1.20; 95% CI 1.06 - 1.37; P = 0.005), while retrospective cohort studies did not show a positive result (OR = 0.91; 95% CI 0.60 - 1.40; P = 0.68). In a further stratified analysis by type of CVD, a significant association was found between PCOS and coronary heart disease (CHD) (OR = 1.44; 95% CI 1.13 - 1.84; P = 0.004). However, no significant association was observed between PCOS and myocardial infarction (MI) (OR = 1.01; 95% CI 0.68 - 1.51; P = 0.95). In conclusion, this meta-analysis suggested that PCOS is significantly associated with increased CHD risk.
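Pooled odds ratios of this kind are commonly computed with a DerSimonian-Laird random-effects model on the log scale. The sketch below illustrates the estimator on hypothetical study ORs and variances (the study-level variances are invented; they are not reported in the abstract):

```python
import math

def dersimonian_laird(log_ors, variances):
    """Pool log odds ratios with a DerSimonian-Laird random-effects model."""
    k = len(log_ors)
    w = [1 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2.
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))
    denom = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / denom)
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, log_ors)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return math.exp(pooled), se

# Subgroup ORs from the abstract paired with hypothetical variances.
pooled_or, se = dersimonian_laird(
    [math.log(x) for x in (1.79, 1.20, 0.91, 1.44)],
    [0.05, 0.02, 0.06, 0.03],
)
```

The pooled estimate necessarily lands between the smallest and largest study ORs, with the between-study variance tau^2 pulling weights toward equality when heterogeneity is present.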
Experimental Performance Evaluation of a Supersonic Turbine for Rocket Engine Applications
NASA Technical Reports Server (NTRS)
Snellgrove, Lauren M.; Griffin, Lisa W.; Sieja, James P.; Huber, Frank W.
2003-01-01
In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis and testing of the turbomachinery is necessary. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. These tools were applied to optimize a supersonic turbine design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain increased efficiency. The goal of the demonstration was to increase the total-to-static efficiency of the turbine by eight points over the baseline design. A sub-scale, cold flow test article modeling the final optimized turbine was designed, manufactured, and tested in air at MSFC's Turbine Airflow Facility. Extensive on- and off-design-point performance data, steady-state data, and unsteady blade loading data were collected during testing.
Mital, A
1999-01-01
Manual handling of materials continues to be a hazardous activity, leading to a very significant number of severe overexertion injuries. Designing jobs that are within the physical capabilities of workers is one approach ergonomists have adopted to redress this problem. As a result, several job design procedures have been developed over the years. However, these procedures are limited to designing or evaluating only pure lifting jobs or only the lifting aspect of a materials handling job. This paper describes a general procedure that may be used to design or analyse materials handling jobs that involve several different kinds of activities (e.g. lifting, lowering, carrying and pushing). The job design/analysis procedure utilizes an elemental approach (breaking the job into elements) and relies on databases provided in A Guide to Manual Materials Handling to compute associated risk factors. The use of the procedure is demonstrated with the help of two case studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bloom, R.R.
1996-04-01
The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented VAC*TRAX mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses an indirectly heated, batch vacuum dryer to thermally desorb organic compounds from mixed wastes. This process hazards analysis evaluated 102 potential hazards. The three significant hazards identified involved the inclusion of oxygen in a process that also included an ignition source and fuel. Changes to the design of the MTU were made concurrent with the hazard identification and analysis; all hazards with initial risk rankings of 1 or 2 were reduced to acceptable risk rankings of 3 or 4. The overall risk to any population group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.
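Numeric risk rankings of this kind are characteristic of a severity-likelihood matrix. The abstract does not give the MTU's actual ranking scheme, so the mapping below is purely illustrative of how a design change (e.g. removing an ignition source) can move a hazard from an unacceptable rank to an acceptable one:

```python
# Hypothetical 4-level risk-ranking matrix (1 = highest risk, 4 = lowest);
# the actual VAC*TRAX ranking scheme is not specified in the abstract.
def risk_rank(severity, likelihood):
    """severity and likelihood each on a 1 (worst / most likely) to 4 scale."""
    score = severity + likelihood
    if score <= 3:
        return 1  # unacceptable: mitigate by design change
    elif score <= 5:
        return 2  # undesirable: mitigate or formally accept
    elif score <= 6:
        return 3  # acceptable with controls
    return 4      # acceptable

# A design change that removes the ignition source lowers the likelihood
# from 1 (frequent) to 4 (remote), moving the hazard from rank 1 to rank 3.
before = risk_rank(2, 1)
after = risk_rank(2, 4)
```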
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waldhoff, Stephanie T.; Martinich, Jeremy; Sarofim, Marcus
2015-07-01
The Climate Change Impacts and Risk Analysis (CIRA) modeling exercise is a unique contribution to the scientific literature on climate change impacts, economic damages, and risk analysis that brings together multiple, national-scale models of impacts and damages in an integrated and consistent fashion to estimate climate change impacts, damages, and the benefits of greenhouse gas (GHG) mitigation actions in the United States. The CIRA project uses three consistent socioeconomic, emissions, and climate scenarios across all models to estimate the benefits of GHG mitigation policies: a Business As Usual (BAU) scenario and two policy scenarios with radiative forcing (RF) stabilization targets of 4.5 W/m2 and 3.7 W/m2 in 2100. CIRA was also designed to specifically examine the sensitivity of results to uncertainties around climate sensitivity and differences in model structure. The goals of the CIRA project are to 1) build a multi-model framework to produce estimates of multiple risks and impacts in the U.S., 2) determine to what degree risks and damages across sectors may be lowered from the BAU to the policy scenarios, 3) evaluate key sources of uncertainty along the causal chain, and 4) provide information for multiple audiences and clearly communicate the risks and damages of climate change and the potential benefits of mitigation. This paper describes the motivations, goals, and design of the CIRA modeling exercise and introduces the subsequent papers in this special issue.
Ryan, Patrick B.; Schuemie, Martijn
2013-01-01
Background: Clinical studies that use observational databases, such as administrative claims and electronic health records, to evaluate the effects of medical products have become commonplace. These studies begin by selecting a particular study design, such as a case control, cohort, or self-controlled design, and different authors can and do choose different designs for the same clinical question. Furthermore, published papers invariably report the study design but do not discuss the rationale for the specific choice. Studies of the same clinical question with different designs, however, can generate different results, sometimes with strikingly different implications. Even within a specific study design, authors make many different analytic choices and these too can profoundly impact results. In this paper, we systematically study heterogeneity due to the type of study design and due to analytic choices within study design. Methods and findings: We conducted our analysis in 10 observational healthcare databases but mostly present our results in the context of the GE Centricity EMR database, an electronic health record database containing data for 11.2 million lives. We considered the impact of three different study design choices on estimates of associations between bisphosphonates and four particular health outcomes for which there is no evidence of an association. We show that applying alternative study designs can yield discrepant results, in terms of direction and significance of association. We also highlight that while traditional univariate sensitivity analysis may not show substantial variation, systematic assessment of all analytical choices within a study design can yield inconsistent results ranging from statistically significant decreased risk to statistically significant increased risk. 
Our findings show that clinical studies using observational databases can be sensitive both to study design choices and to specific analytic choices within study design. Conclusion: More attention is needed to consider how design choices may be impacting results and, when possible, investigators should examine a wide array of possible choices to confirm that significant findings are consistently identified. PMID:25083251
Smits, Christel CF; Jaddoe, Vincent WV; Hofman, Albert; Toelsie, Jerry R
2015-01-01
Background Noncommunicable diseases (NCDs) are the leading cause of death in low- and middle-income countries. Therefore, the surveillance of risk factors has become an issue of major importance for planning and implementation of preventive measures. Unfortunately, in these countries data on NCDs and their risk factors are limited. This also prevails in Suriname, a middle-income country of the Caribbean, with a multiethnic/multicultural population living in diverse residential areas. For these reasons, “The Suriname Health Study” was designed. Objective The main objective of this study is to estimate the prevalence of NCD risk factors, including metabolic syndrome, hypertension, and diabetes in Suriname. Differences between specific age groups, sexes, ethnic groups, and geographical areas will be emphasized. In addition, risk groups will be identified and targeted actions will be designed and evaluated. Methods In this study, several methodologies were combined. A stratified multistage cluster sample was used to select the participants of 6 ethnic groups (Hindustani, Creole, Javanese, Maroon, Chinese, Amerindians, and mixed) divided into 5 age groups (between 15 and 65 years) who live in urban/rural areas or the hinterland. A standardized World Health Organization STEPwise approach to surveillance questionnaire was adapted and used to obtain information about demographic characteristics, lifestyle, and risk factors. Physical examinations were performed to measure blood pressure, height, weight, and waist circumference. Biochemical analysis of collected blood samples evaluated the levels of glucose, high-density-lipoprotein cholesterol, total cholesterol, and triglycerides. Statistical analysis will be used to identify the burden of modifiable and unmodifiable risk factors in the aforementioned subgroups. Subsequently, tailor-made interventions will be prepared and their effects will be evaluated. 
Results The data as collected allow for national inference and valid analysis of the age, sex, and ethnicity subgroups in the Surinamese population. A publication of the basic survey results is anticipated in mid-2015. Secondary results on the effect of targeted lifestyle interventions are anticipated in late 2017. Conclusions Using the data collected in this study, the national prevalence of NCD risk factors will be approximated and described in a diverse population. This study is an entry point for formulating the structure of NCD prevention and surveillance. PMID:26085372
Asteroid Deflection Mission Design Considering On-Ground Risks
NASA Astrophysics Data System (ADS)
Rumpf, Clemens; Lewis, Hugh G.; Atkinson, Peter
The deflection of an Earth-threatening asteroid requires high transparency of the mission design process. The goal of such a mission is to move the projected point of impact over the face of Earth until the asteroid is on a miss trajectory. During the course of deflection operations, the projected point of impact will match regions that were less affected before alteration of the asteroid’s trajectory. These regions are at risk of sustaining considerable damage if the deflecting spacecraft becomes non-operational. The projected impact point would remain where the deflection mission put it at the time of mission failure. Hence, all regions that are potentially affected by the deflection campaign need to be informed about this risk and should be involved in the mission design process. A mission design compromise will have to be found that is acceptable to all affected parties (Schweickart, 2004). A software tool that assesses the on-ground risk due to deflection missions is under development. It will allow users to study the accumulated on-ground risk along the path of the projected impact point. The tool will help determine a deflection mission design that minimizes the on-ground casualty and damage risk due to deflection operations. Currently, the tool is capable of simulating asteroid trajectories through the solar system and considers gravitational forces between solar system bodies. A virtual asteroid may be placed at an arbitrary point in the simulation for analysis and manipulation. Furthermore, the tool determines the asteroid’s point of impact and provides an estimate of the population at risk. Validation has been conducted against the solar system ephemeris catalogue HORIZONS by NASA’s Jet Propulsion Laboratory (JPL). Asteroids that are propagated over a period of 15 years show typical position discrepancies of 0.05 Earth radii relative to HORIZONS’ output.
Ultimately, results from this research will aid in the identification of requirements for deflection missions that enable effective, minimum risk asteroid deflection. Schweickart, R. L. (2004). THE REAL DEFLECTION DILEMMA. In 2004 Planetary Defense Conference: Protecting Earth from Asteroids (pp. 1-6). Orange County, California. Retrieved from http://b612foundation.org/wp-content/uploads/2013/02/Real_Deflection_Dilemma.pdf
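The trajectory propagation such a tool performs amounts to numerically integrating the gravitational equations of motion. A minimal sketch, reduced to a planar two-body problem in normalized units (GM = 1) with a leapfrog (kick-drift-kick) integrator; the real tool handles multiple perturbing bodies, which this illustration omits:

```python
import math

def propagate(pos, vel, gm, dt, steps):
    """Leapfrog integration of a body orbiting a central mass at the origin."""
    x, y = pos
    vx, vy = vel

    def accel(x, y):
        r3 = (x * x + y * y) ** 1.5
        return -gm * x / r3, -gm * y / r3

    ax, ay = accel(x, y)
    for _ in range(steps):
        vx += 0.5 * dt * ax   # half kick
        vy += 0.5 * dt * ay
        x += dt * vx          # drift
        y += dt * vy
        ax, ay = accel(x, y)
        vx += 0.5 * dt * ax   # half kick
        vy += 0.5 * dt * ay
    return (x, y), (vx, vy)

# A circular orbit of radius 1 with GM = 1 has period 2*pi; after one period
# the integrated position should return close to the starting point.
dt = 1e-3
pos, vel = propagate((1.0, 0.0), (0.0, 1.0), 1.0, dt, round(2 * math.pi / dt))
```

Leapfrog is a symplectic scheme, so energy drift stays bounded over long propagation spans, which matters for the multi-year integrations the tool validates against HORIZONS.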
Lenters, Virissa; Basinas, Ioannis; Beane-Freeman, Laura; Boffetta, Paolo; Checkoway, Harvey; Coggon, David; Portengen, Lützen; Sim, Malcolm; Wouters, Inge M; Heederik, Dick; Vermeulen, Roel
2010-04-01
To examine the association between exposure to endotoxins and lung cancer risk by conducting a systematic review and meta-analysis of epidemiologic studies of workers in the cotton textile and agricultural industries, which are known for high endotoxin exposure levels. Risk estimates were extracted from studies published before 2009 that met predefined quality criteria, including 8 cohort, 1 case-cohort, and 2 case-control studies of cotton textile industry workers, and 15 cohort and 2 case-control studies of agricultural workers. Summary risk estimates were calculated using random effects meta-analyses. Potential sources of heterogeneity were explored through subgroup analyses. The summary risk of lung cancer was 0.72 (95% CI, 0.57-0.90) for textile workers and 0.62 (0.52-0.75) for agricultural workers. The relative risk of lung cancer was below 1.0 for most subgroups defined according to sex, study design, outcome, smoking adjustment, and geographic area. Two studies provided quantitative estimates of endotoxin exposure and both studies tended to support a dose-dependent protective effect of endotoxins on lung cancer risk. Despite several limitations, this meta-analysis based on high-quality studies adds weight to the hypothesis that occupational exposure to endotoxin in cotton textile production and agriculture is protective against lung cancer.
Su, Bin; Sheng, Hui; Zhang, Manna; Bu, Le; Yang, Peng; Li, Liang; Li, Fei; Sheng, Chunjun; Han, Yuqi; Qu, Shen; Wang, Jiying
2015-02-01
Traditional anti-diabetic drugs may have negative or positive effects on risk of bone fractures. Yet the relationship between the new class of glucagon-like peptide-1 receptor agonists (GLP-1 RA) and the risk of bone fractures has not been established. We performed a meta-analysis including randomized controlled trials (RCT) to study the risk of bone fractures associated with liraglutide or exenatide, compared to placebo or other active drugs. We searched MEDLINE, EMBASE, and clinical trial registration websites for published or unpublished RCTs comparing the effects of liraglutide or exenatide with comparators. Only studies with disclosed bone fracture data were included. Separate pooled analyses were performed for liraglutide and exenatide by calculating the Mantel-Haenszel odds ratio (MH-OR). 16 RCTs were identified including a total of 11,206 patients. Liraglutide treatment was associated with a significantly reduced risk of incident bone fractures (MH-OR=0.38, 95% CI 0.17-0.87); however, exenatide treatment was associated with an elevated risk of incident bone fractures (MH-OR=2.09, 95% CI 1.03-4.21). Publication bias and heterogeneity between studies were not observed. Our study demonstrated a divergent risk of bone fractures associated with different GLP-1 RA treatments. The current findings need to be confirmed by future well-designed prospective studies or RCTs.
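The Mantel-Haenszel pooled odds ratio used here combines 2x2 tables across trials without requiring within-trial effect estimates, which suits sparse fracture counts. A minimal sketch with hypothetical trial counts (not data from the included RCTs):

```python
def mantel_haenszel_or(tables):
    """Pooled MH odds ratio from a list of 2x2 tables (a, b, c, d):
    a = events on drug, b = non-events on drug,
    c = events on comparator, d = non-events on comparator."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Hypothetical fracture counts from three trials.
trials = [(2, 198, 5, 195), (1, 99, 3, 97), (4, 396, 6, 394)]
mh_or = mantel_haenszel_or(trials)
```

Because each stratum's contribution is weighted by its size, large trials with few events dominate the pooled estimate, which is exactly the behavior wanted for rare outcomes like fractures.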
Steele, Jennifer A; Richter, Carsten H; Echaubard, Pierre; Saenna, Parichat; Stout, Virginia; Sithithaworn, Paiboon; Wilcox, Bruce A
2018-05-17
Cholangiocarcinoma (CCA) is a fatal bile duct cancer associated with infection by the liver fluke, Opisthorchis viverrini, in the lower Mekong region. Numerous public health interventions have focused on reducing exposure to O. viverrini, but incidence of CCA in the region remains high. While this may indicate the inefficacy of public health interventions due to complex social and cultural factors, it may further indicate other risk factors or interactions with the parasite are important in pathogenesis of CCA. This systematic review aims to provide a comprehensive analysis of described risk factors for CCA in addition to O. viverrini to guide future integrative interventions. We searched five international and seven Thai research databases to identify studies relevant to risk factors for CCA in the lower Mekong region. Selected studies were assessed for risk of bias and quality in terms of study design, population, CCA diagnostic methods, and statistical methods. The final 18 included studies reported numerous risk factors which were grouped into behaviors, socioeconomics, diet, genetics, gender, immune response, other infections, and treatment for O. viverrini. Seventeen risk factors were reported by two or more studies and were assessed with random effects models during meta-analysis. This meta-analysis indicates that the combination of alcohol and smoking (OR = 11.1, 95% CI: 5.63-21.92, P < 0.0001) is most significantly associated with increased risk for CCA and is an even greater risk factor than O. viverrini exposure. This analysis also suggests that family history of cancer, consumption of raw cyprinoid fish, consumption of high nitrate foods, and praziquantel treatment are associated with significantly increased risk. These risk factors may have complex relationships with the host, parasite, or pathogenesis of CCA, and many of these risk factors were found to interact with each other in one or more studies. 
Our findings suggest that a complex variety of risk factors in addition to O. viverrini infection should be addressed in future public health interventions to reduce CCA in affected regions. In particular, smoking and alcohol use, dietary patterns, and socioeconomic factors should be considered when developing intervention programs to reduce CCA.
The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles
NASA Technical Reports Server (NTRS)
Latimer, John A.
2009-01-01
This presentation provides a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodology of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.
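The abstract does not give the exact formula for the reliability impact indicator. One plausible reading, shown purely as an illustration and not as the RRET formula itself, is the fractional reduction of baseline reliability once independent residual-risk survival probabilities are folded in:

```python
# Hypothetical interpretation of a reliability impact indicator (the actual
# RRET formula is not given in the abstract): the fractional drop in mission
# reliability when residual-risk failure probabilities are combined with the
# baseline, assuming independence of the residual risks.
def reliability_impact(baseline, residual_risk_reliabilities):
    """baseline: baseline mission reliability (0..1);
    residual_risk_reliabilities: per-risk survival probabilities."""
    combined = baseline
    for r in residual_risk_reliabilities:
        combined *= r
    return (baseline - combined) / baseline

# Example: three residual risks, each with a small failure probability.
impact = reliability_impact(0.98, [0.999, 0.995, 0.990])
```

Under the independence assumption the indicator reduces to 1 minus the product of the residual-risk reliabilities, i.e. the extra failure probability the residual risks introduce on top of the baseline.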
Risk assessment of watershed erosion at Naesung Stream, South Korea.
Ji, Un; Velleux, Mark; Julien, Pierre Y; Hwang, Manha
2014-04-01
A three-tiered approach was used to assess erosion risks within the Nakdong River Basin in South Korea and included: (1) a screening based on topography and land use; (2) a lumped parameter analysis using RUSLE; and (3) a detailed analysis using TREX, a fully distributed watershed model. These tiers span a range of spatial and temporal scales, with each tier providing increasing detail and resolution. The first two tiers were applied to the entire Nakdong River Basin and the Naesung Stream watershed was identified as having the highest soil erosion risk and potential for sedimentation problems. For the third tier, the TREX watershed model simulated runoff, channel flow, soil erosion, and stream sediment transport in the Naesung Stream watershed at very high resolution. TREX was calibrated for surface flows and sediment transport, and was used to simulate conditions for a large design storm. Highly erosive areas were identified along ridgelines in several headwater areas, with the northeast area of Songriwon having a particularly high erosion potential. Design storm simulations also indicated that sediment deposition of up to 55 cm could occur. Copyright © 2014 Elsevier Ltd. All rights reserved.
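The tier-2 lumped-parameter step uses RUSLE, whose standard form is a product of five factors, A = R * K * LS * C * P. The factor values below are illustrative placeholders, not values from the study:

```python
def rusle_soil_loss(r, k, ls, c, p):
    """Average annual soil loss A (t/ha/yr) from the Revised Universal
    Soil Loss Equation: A = R * K * LS * C * P."""
    return r * k * ls * c * p

# Illustrative factor values (not from the Nakdong River Basin analysis):
a = rusle_soil_loss(r=4500.0,  # rainfall erosivity (MJ mm / ha h yr)
                    k=0.03,    # soil erodibility (t ha h / ha MJ mm)
                    ls=1.2,    # slope length-steepness factor
                    c=0.05,    # cover management factor
                    p=1.0)     # support practice factor
```

A tier-1 screen can rank subwatersheds by evaluating this product from mapped topography (LS) and land use (C, P) alone.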
Finite element based damage assessment of composite tidal turbine blades
NASA Astrophysics Data System (ADS)
Fagan, Edward M.; Leen, Sean B.; Kennedy, Ciaran R.; Goggins, Jamie
2015-07-01
With significant interest growing in the ocean renewables sector, horizontal axis tidal current turbines are in a position to dominate the marketplace. The test devices that have been placed in operation so far have suffered from premature failures, caused by difficulties with structural strength prediction. The goal of this work is to develop methods of predicting the damage level in tidal turbines under their maximum operating tidal velocity. The analysis was conducted using the finite element software package Abaqus; shell models of three representative tidal turbine blades were produced. Different construction methods will affect the damage level in the blade, and for this study, models were developed with varying hydrofoil profiles. In order to determine the risk of failure, a user material subroutine (UMAT) was created. The UMAT uses the failure criteria designed by Alfred Puck to calculate the risk of fibre and inter-fibre failure in the blades. The results show that degradation of the stiffness is predicted for the operating conditions, having an effect on the overall tip deflection. The failure criteria applied via the UMAT form a useful tool for analysis of high risk regions within the blade designs investigated.
How teen mothers describe dating violence.
Herrman, Judith W
2013-07-01
To present voices of young women who were pregnant or new parents as they shared their thoughts on the risks, behaviors, and prevention of teen dating violence (TDV). Descriptive, qualitative analysis. Focus groups in schools designed expressly for the needs of young women who are pregnant and parenting. Twenty-six young mothers participated in one of three focus groups. Focus groups explored perceptions of several dimensions of intimate partner violence within the context of pregnancy and parenting. A semi-structured interview guide provided the medium to delve into young mothers' thoughts and perspectives about TDV. Data were organized in four major typologies: describing TDV, increasing the risk, why violence, and prevention of TDV. This analysis provides important insights into young mothers' lives when teen parenting and violence intersect. Rich data provide the foundation to expand awareness and inform programs and policies designed to address TDV in young women who are pregnant and parenting. Findings may be used by nurses to assess risk, identify teens in violent relationships, and provide for understanding of TDV from the perspectives of young women who are pregnant and parenting. © 2013 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses.
Risk Evaluation in the Pre-Phase A Conceptual Design of Spacecraft
NASA Technical Reports Server (NTRS)
Fabisinski, Leo L., III; Maples, Charlotte Dauphne
2010-01-01
Typically, the most important decisions in the design of a spacecraft are made in the earliest stages of its conceptual design: the Pre-Phase A stages. It is in these stages that the greatest number of design alternatives is considered, and the greatest number of alternatives is rejected. The focus of Pre-Phase A conceptual development is on the evaluation and comparison of whole concepts and the larger-scale systems comprising those concepts. This comparison typically uses general Figures of Merit (FOMs) to quantify the comparative benefits of designs and alternative design features. Along with mass, performance, and cost, risk should be one of the major FOMs in evaluating design decisions during the conceptual design phases. However, risk is often given inadequate consideration in conceptual design practice. The reasons frequently given for this lack of attention to risk include: inadequate mission definition, lack of rigorous design requirements in early concept phases, lack of fidelity in risk assessment methods, and under-evaluation of risk as a viable FOM for design evaluation. In this paper, the role of risk evaluation in early conceptual design is discussed. The various requirements of a viable risk evaluation tool at the Pre-Phase A level are considered in light of the needs of a typical spacecraft design study. A technique for risk identification and evaluation is presented. The application of the risk identification and evaluation approach to the conceptual design process is discussed. Finally, a computational tool for risk profiling is presented and applied to assess the risk for an existing Pre-Phase A proposal. The resulting profile is compared to the risks identified for the proposal by other means.
NASA Astrophysics Data System (ADS)
Modgil, Girish A.
Gas turbine engines for aerospace applications have evolved dramatically over the last 50 years through the constant pursuit of better specific fuel consumption, higher thrust-to-weight ratio, and lower noise and emissions, all while maintaining reliability and affordability. An important step in enabling these improvements is a forced response aeromechanics analysis involving structural dynamics and aerodynamics of the turbine. It is well documented that forced response vibration is a very critical problem in aircraft engine design, causing High Cycle Fatigue (HCF). Pushing the envelope on engine design has led to increased forced response problems and subsequently an increased risk of HCF failure. Forced response analysis is used to assess design feasibility of turbine blades for HCF using a material limit boundary set by the Goodman Diagram envelope that combines the effects of steady and vibratory stresses. Forced response analysis is computationally expensive, time consuming, and requires multi-domain experts to finalize a result. As a consequence, high-fidelity aeromechanics analysis is performed deterministically and is usually done at the end of the blade design process, when it is very costly to make significant changes to geometry or aerodynamic design. To address uncertainties in the system (engine operating point, temperature distribution, mistuning, etc.) and variability in material properties, designers apply conservative safety factors in the traditional deterministic approach, which leads to bulky designs. Moreover, using a deterministic approach does not provide a calculated risk of HCF failure. This thesis describes a process that begins with the optimal aerodynamic design of a turbomachinery blade developed using surrogate models of high-fidelity analyses. The resulting optimal blade undergoes probabilistic evaluation to generate aeromechanics results that provide a calculated likelihood of failure from HCF.
An existing Rolls-Royce High Work Single Stage (HWSS) turbine blisk provides a baseline to demonstrate the process. The generalized polynomial chaos (gPC) toolbox developed for this work includes sampling methods and constructs polynomial approximations. The toolbox provides not only the means for uncertainty quantification of the final blade design, but also facilitates construction of the surrogate models used for the blade optimization. This work shows that gPC, with a small number of samples, achieves very fast rates of convergence and high accuracy in describing probability distributions without loss of detail in the tails. First, an optimization problem maximizes stage efficiency using turbine aerodynamic design rules as constraints; the function evaluations for this optimization are surrogate models from detailed 3D steady Computational Fluid Dynamics (CFD) analyses. The resulting optimal shape provides a starting point for the 3D high-fidelity aeromechanics (unsteady CFD and 3D Finite Element Analysis (FEA)) UQ study assuming three uncertain input parameters. This investigation seeks to find the steady and vibratory stresses associated with the first torsion mode for the HWSS turbine blisk near the maximum operating speed of the engine. Using gPC to provide uncertainty estimates of the steady and vibratory stresses enables the creation of a Probabilistic Goodman Diagram, which - to the author's best knowledge - is the first of its kind using high fidelity aeromechanics for turbomachinery blades. The Probabilistic Goodman Diagram enables turbine blade designers to make more informed design decisions, and it allows the aeromechanics expert to assess quantitatively the risk associated with HCF for any mode crossing based on high fidelity simulations.
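As a rough illustration of the gPC idea (not the thesis toolbox), the sketch below projects a function of a single standard normal input onto probabilists' Hermite polynomials using Gauss-Hermite quadrature, then reads the mean and variance directly off the expansion coefficients:

```python
import math
import numpy as np

def hermite_pce_coeffs(f, order, n_quad=20):
    """Project f(x), x ~ N(0,1), onto probabilists' Hermite polynomials
    He_k via Gauss-Hermite quadrature (gPC for a single Gaussian input)."""
    t, w = np.polynomial.hermite.hermgauss(n_quad)  # physicists' nodes/weights
    x = np.sqrt(2.0) * t                            # change of variables to N(0,1)
    w = w / np.sqrt(np.pi)
    coeffs = []
    for k in range(order + 1):
        he_k = np.polynomial.hermite_e.hermeval(x, [0.0] * k + [1.0])
        coeffs.append(np.sum(w * f(x) * he_k) / math.factorial(k))  # E[He_k^2] = k!
    return np.array(coeffs)

# f(x) = x^2 + x = He_2(x) + He_1(x) + 1, so the exact coefficients are [1, 1, 1, 0, ...]
c = hermite_pce_coeffs(lambda x: x ** 2 + x, order=4)
mean = c[0]                                                          # PCE mean = c_0
var = sum(c[k] ** 2 * math.factorial(k) for k in range(1, len(c)))   # PCE variance
```

For a smooth response the coefficients decay rapidly, which is the source of the fast convergence claimed for gPC relative to plain Monte Carlo sampling.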
Energy efficient engine: Preliminary design and integration studies
NASA Technical Reports Server (NTRS)
Johnston, R. P.; Hirschkron, R.; Koch, C. C.; Neitzel, R. E.; Vinson, P. W.
1978-01-01
Parametric design and mission evaluations of advanced turbofan configurations were conducted for future transport aircraft application. Economics, environmental suitability, and fuel efficiency were investigated and compared with goals set by NASA. Of the candidate engines, which included mixed- and separate-flow, direct-drive and geared configurations, an advanced mixed-flow direct-drive configuration was selected for further design and evaluation. All goals were judged to have been met except the acoustic goal. Also conducted were a performance risk analysis and a preliminary aerodynamic design of the 10-stage, 23:1 pressure ratio compressor used in the study engines.
Kodama, Satoru; Fujihara, Kazuya; Ishiguro, Hajime; Horikawa, Chika; Ohara, Nobumasa; Yachi, Yoko; Tanaka, Shiro; Shimano, Hitoshi; Kato, Kiminori; Hanyu, Osamu; Sone, Hirohito
2018-01-05
Many epidemiological studies have assessed the genetic risk of having undiagnosed or of developing type 2 diabetes mellitus (T2DM) using several single nucleotide polymorphisms (SNPs) based on findings of genome-wide association studies (GWAS). However, the quantitative association of cumulative risk alleles (RAs) of such SNPs with T2DM risk has been unclear. The aim of this meta-analysis is to review the strength of the association between cumulative RAs and T2DM risk. Systematic literature searches were conducted for cross-sectional or longitudinal studies that examined odds ratios (ORs) for T2DM in relation to genetic profiles. Logarithm of the estimated OR (log OR) of T2DM for 1 increment in RAs carried (1-ΔRA) in each study was pooled using a random-effects model. There were 46 eligible studies that included 74,880 cases among 249,365 participants. In 32 studies with a cross-sectional design, the pooled OR for T2DM morbidity for 1-ΔRA was 1.16 (95% confidence interval [CI], 1.13-1.19). In 15 studies that had a longitudinal design, the OR for incident T2DM was 1.10 (95% CI, 1.08-1.13). There was large heterogeneity in the magnitude of log OR (P < 0.001 for both cross-sectional studies and longitudinal studies). The top 10 commonly used genes significantly explained the variance in the log OR (P = 0.04 for cross-sectional studies; P = 0.006 for longitudinal studies). The current meta-analysis indicated that carrying 1-ΔRA in T2DM-associated SNPs was associated with a modest risk of prevalent or incident T2DM, although the heterogeneity in the used genes among studies requires us to interpret the results with caution. PMID:29093303
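Under the log-additive model implicit in an "OR per risk-allele increment", cumulative risk compounds multiplicatively across alleles. A tiny sketch using the pooled cross-sectional estimate of 1.16 (purely illustrative; real genetic risk scores also weight each SNP by its own effect size):

```python
def cumulative_or(per_allele_or, extra_alleles):
    """Log-additive model: OR for carrying `extra_alleles` more risk
    alleles than a referent, given a single pooled per-allele OR."""
    return per_allele_or ** extra_alleles

# e.g. 10 additional risk alleles at the pooled cross-sectional OR of 1.16
or_10 = cumulative_or(1.16, 10)
```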
Ju, Woong; Oh, Seung-Won; Park, Sang Min; Koo, Bon-Kwon; Park, Byung-Joo
2013-01-01
Objective To assess the efficacy of vitamin and antioxidant supplements in the prevention of cardiovascular diseases. Design Meta-analysis of randomised controlled trials. Data sources and study selection PubMed, EMBASE, the Cochrane Library, Scopus, CINAHL, and ClinicalTrials.gov searched in June and November 2012. Two authors independently reviewed and selected eligible randomised controlled trials, based on predetermined selection criteria. Results Out of 2240 articles retrieved from databases and relevant bibliographies, 50 randomised controlled trials with 294 478 participants (156 663 in intervention groups and 137 815 in control groups) were included in the final analyses. In a fixed effect meta-analysis of the 50 trials, supplementation with vitamins and antioxidants was not associated with reductions in the risk of major cardiovascular events (relative risk 1.00, 95% confidence interval 0.98 to 1.02; I2=42%). Overall, there was no beneficial effect of these supplements in the subgroup meta-analyses by type of prevention, type of vitamins and antioxidants, type of cardiovascular outcomes, study design, methodological quality, duration of treatment, funding source, provider of supplements, type of control, number of participants in each trial, and supplements given singly or in combination with other supplements. Among the subgroup meta-analyses by type of cardiovascular outcomes, vitamin and antioxidant supplementation was associated with a marginally increased risk of angina pectoris, while low dose vitamin B6 supplementation was associated with a slightly decreased risk of major cardiovascular events. Those beneficial or harmful effects disappeared in subgroup meta-analysis of high quality randomised controlled trials within each category. 
Also, even though supplementation with vitamin B6 was associated with a decreased risk of cardiovascular death in high quality trials, and vitamin E supplementation with a decreased risk of myocardial infarction, those beneficial effects were seen only in randomised controlled trials in which the supplements were supplied by the pharmaceutical industry. Conclusion There is no evidence to support the use of vitamin and antioxidant supplements for prevention of cardiovascular diseases. PMID:23335472
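The I2 = 42% heterogeneity figure quoted above is a standard function of Cochran's Q. A generic sketch with made-up study effects (log relative risks) and variances, not the trial data:

```python
def i_squared(effects, variances):
    """I^2 heterogeneity statistic (%) from study effects (e.g. log RRs)
    and their variances, via Cochran's Q with inverse-variance weights."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - pooled) ** 2 for wi, y in zip(w, effects))
    df = len(effects) - 1
    return max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

# Illustrative: three studies with visible between-study spread
i2 = i_squared([0.0, 0.3, -0.2], [0.01, 0.01, 0.01])
```

I^2 estimates the share of observed variation due to genuine between-study differences rather than sampling error; values near 42%, as here, are conventionally read as moderate heterogeneity.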
Limitations of self-care in reducing the risk of lymphedema: supportive-educative systems.
Armer, Jane M; Brooks, Constance W; Stewart, Bob R
2011-01-01
The purpose of this study was to examine patient perceptions of limitations related to self-care measures to reduce lymphedema risk following breast cancer surgery. Secondary analysis of survey data from a companion study to a study piloting a behavioral-educational intervention was conducted to examine the specific limitations in performing lymphedema risk-reduction self-care measures. Findings suggest a more comprehensive approach is needed if patients are to engage in self-care actions to reduce lymphedema risk. Understanding the concepts of self-care and personal support interventions that include motivational interviewing can help nurses design supportive-educative care systems that assist patients in overcoming limitations in the estimative, transitional, and productive phases of self-care necessary to reduce lymphedema risk.
Mühlbacher, Axel; Bethge, Susanne
2016-12-01
The aim of this empirical study is to evaluate patient preferences for different characteristics of oral type 2 diabetes mellitus (T2DM) treatment. As T2DM treatment requires strict adherence, patient needs and preferences should be taken into consideration. Based on a qualitative and quantitative analysis, a discrete choice experiment (DCE) was applied to identify patient preferences. In addition to six attributes common to both survey versions (adjustment of glycated hemoglobin [HbA1c], prevention of hypoglycemia, risk of genital infection, risk of gastrointestinal problems, risk of urinary tract infection, and weight change), one continuous attribute, either "additional healthy life years" (AHY) or "additional costs" (AC), was included. The DCE was conducted using a fractional factorial design, and the statistical data analysis used random effects logit models. In total, N = 626 (N = 318 AHY + N = 308 AC) T2DM patients participated in the survey. The estimation revealed a clear dominance of prevention of hypoglycemia (coefficient 0.937) and adjustment of HbA1c (coefficient 0.541). The attributes "additional healthy life years" (coefficient 0.458) and "additional costs" (coefficient 0.420) ranked in the middle, and both had a significant impact. The side effects, risk of genital infection (coefficient 0.301), risk of gastrointestinal problems (coefficient 0.296), and risk of urinary tract infection (coefficient 0.241), followed in that order. Possible weight change (coefficient 0.047) was of least importance (last rank) to the patients in this evaluation. These survey results demonstrate how much a (hypothetical) T2DM oral treatment characteristic affects the treatment decision. The preference data can be used for risk-benefit assessment, cost-benefit assessment, and the establishment of patient-oriented evidence.
Understanding how patients perceive and value different aspects of diabetes oral treatment is vital to the optimal design and evaluation of treatment options. The present results can be an additional source of information for design, assessment, and decision in T2DM treatment regimes. As such, more effective and efficient care of patients can be achieved, thereby increasing adherence.
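The DCE coefficients above enter a logit choice model. A minimal sketch of multinomial logit choice probabilities, scoring two hypothetical single-attribute treatment profiles with the reported coefficients (a real DCE profile would sum the coefficients of all its attribute levels):

```python
import math

def choice_probabilities(utilities):
    """Multinomial logit: P_j = exp(V_j) / sum_k exp(V_k)."""
    m = max(utilities)                       # subtract max for numerical stability
    exps = [math.exp(v - m) for v in utilities]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical profiles: A offers only prevention of hypoglycemia (0.937),
# B offers only adjustment of HbA1c (0.541)
p = choice_probabilities([0.937, 0.541])
```

With these utilities the model predicts profile A is chosen roughly 60% of the time, mirroring the dominance of hypoglycemia prevention in the estimated preferences.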
Karnpean, Rossarin; Fucharoen, Goonnapa; Pansuwan, Anupong; Changtrakul, Duangrudee; Fucharoen, Supan
2013-06-01
No external quality assessment program for hemoglobin (Hb) analysis in the prevention and control of thalassemia has been established in Thailand. To improve the first line provisional diagnostics, the first proficiency testing (PT) program has been established. External Hb controls prepared at our center were sent to Hb analysis laboratories all over the country. Three cycles per year were performed in 2010 and 2011. In each cycle, two control samples with corresponding hematological parameters, designated as husband and his pregnant wife, were supplied for Hb analysis. Each member analyzed the control samples in their routine practices. The results of Hb analysis, laboratory interpretation, and risk assessment of the expected fetus for severe thalassemia diseases targeted for prevention and control were entered into the report form and sent back to our center. Participants' reports were analyzed and classified into four different quality groups: Excellent (when all three parameters are correct), Good (correct Hb analysis and interpretation but incorrect risk assessment), Fair (correct Hb analysis but incorrect interpretation and risk assessment), and Needs improvement (incorrect Hb analysis). It was found that most participants could report correct Hb types and quantifications, but some misinterpretations and incorrect risk assessments were noted. These were clearly seen when control samples with more complexity were supplied. These results indicate that further improvement is required in the laboratory interpretation and knowledge of the laboratory diagnosis of thalassemia. The established system should facilitate the prevention and control program of thalassemia in the region.