Sample records for safety analysis computer

  1. European Workshop on Industrial Computer Systems approach to design for safety

    NASA Technical Reports Server (NTRS)

    Zalewski, Janusz

    1992-01-01

    This paper presents guidelines on designing systems for safety, developed by the Technical Committee 7 on Reliability and Safety of the European Workshop on Industrial Computer Systems. The focus is on complementing the traditional development process by adding the following four steps: (1) overall safety analysis; (2) analysis of the functional specifications; (3) designing for safety; (4) validation of design. Quantitative assessment of safety is possible by means of a modular questionnaire covering various aspects of the major stages of system development.

  2. Investigation of safety analysis methods using computer vision techniques

    NASA Astrophysics Data System (ADS)

    Shirazi, Mohammad Shokrolah; Morris, Brendan Tran

    2017-09-01

    This work investigates safety analysis methods using computer vision techniques. The vision-based tracking system is developed to provide the trajectory of road users including vehicles and pedestrians. Safety analysis methods are developed to estimate time to collision (TTC) and postencroachment time (PET) that are two important safety measurements. Corresponding algorithms are presented and their advantages and drawbacks are shown through their success in capturing the conflict events in real time. The performance of the tracking system is evaluated first, and probability density estimation of TTC and PET are shown for 1-h monitoring of a Las Vegas intersection. Finally, an idea of an intersection safety map is introduced, and TTC values of two different intersections are estimated for 1 day from 8:00 a.m. to 6:00 p.m.
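
    The two surrogate safety measures above have standard constant-velocity definitions; the following is a minimal sketch (function names and the car-following form of TTC are my assumptions, not taken from the paper):

```python
import math

def time_to_collision(gap_m, v_follower_ms, v_leader_ms):
    """TTC (s): time until collision if both road users keep their
    current speeds. Only defined while the follower is closing the gap."""
    closing_speed = v_follower_ms - v_leader_ms
    if closing_speed <= 0:
        return math.inf  # not on a collision course
    return gap_m / closing_speed

def post_encroachment_time(t_first_leaves_s, t_second_arrives_s):
    """PET (s): time between the first road user leaving the conflict
    area and the second road user arriving at it."""
    return t_second_arrives_s - t_first_leaves_s
```

    For example, a 20 m gap closed at 5 m/s gives a TTC of 4 s; small TTC or PET values flag conflict events.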

  3. Derivation of improved load transformation matrices for launchers-spacecraft coupled analysis, and direct computation of margins of safety

    NASA Technical Reports Server (NTRS)

    Klein, M.; Reynolds, J.; Ricks, E.

    1989-01-01

    Load and stress recovery from transient dynamic studies are improved upon using an extended acceleration vector in the modal acceleration technique applied to structural analysis. Extension of the normal LTM (load transformation matrices) stress recovery to automatically compute margins of safety is presented with an application to the Hubble space telescope.

  4. The Range Safety Debris Catalog Analysis in Preparation for the Pad Abort One Flight Test

    NASA Technical Reports Server (NTRS)

    Kutty, Prasad; Pratt, William

    2010-01-01

    With each flight test, a Range Safety Data Package is assembled to understand the potential consequences of various failure scenarios. The debris catalog analysis considers an overpressure failure of the Abort Motor and the resulting debris field, and proceeds in three steps: (1) characterize the debris fragments generated by the failure (weight, shape, and area); (2) compute fragment ballistic coefficients; (3) compute fragment ejection velocities.
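
    Step 2 above uses the standard ballistic coefficient, beta = W / (Cd * A); a one-line sketch (the function name and units are my choice, not the report's):

```python
def ballistic_coefficient(weight_lb, drag_coefficient, area_ft2):
    """Beta = W / (Cd * A), in lb/ft^2. Fragments with a high beta
    (heavy, low-drag) decelerate less and travel farther downrange."""
    return weight_lb / (drag_coefficient * area_ft2)
```

    A 10 lb fragment with Cd = 1.2 and a 0.05 ft^2 reference area has beta of roughly 167 lb/ft^2.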

  5. Safety analysis of interchanges

    DOT National Transportation Integrated Search

    2007-06-01

    The objectives of this research are to synthesize the current state of knowledge concerning the safety assessment of new or modified interchanges; develop a spreadsheet-based computational tool for performing safety assessments of interchanges; and i...

  6. Crash Simulation and Animation: 'A New Approach for Traffic Safety Analysis'

    DOT National Transportation Integrated Search

    2001-02-01

    This research's objective is to present a methodology to supplement conventional traffic safety analysis techniques. This methodology aims at using computer simulation to animate and visualize crash occurrence at high-risk locations. This methodol...

  7. Development of Flight Safety Prediction Methodology for U. S. Naval Safety Center. Revision 1

    DTIC Science & Technology

    1970-02-01

    Developed for the U.S. Naval Safety Center, the methodology encompassed functional analysis of the F-4J aircraft and assessment of the importance of safety-sensitive functions. The report's model implementation covers functional analysis, major function sensitivity assignment, link dependency assignment, and a computer program for sensitivity analysis.

  8. Human performance cognitive-behavioral modeling: a benefit for occupational safety.

    PubMed

    Gore, Brian F

    2002-01-01

    Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.

  9. Human performance cognitive-behavioral modeling: a benefit for occupational safety

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2002-01-01

    Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.

  10. Computer vision-based analysis of foods: a non-destructive colour measurement tool to monitor quality and safety.

    PubMed

    Mogol, Burçe Ataç; Gökmen, Vural

    2014-05-01

    Computer vision-based image analysis has been widely used in food industry to monitor food quality. It allows low-cost and non-contact measurements of colour to be performed. In this paper, two computer vision-based image analysis approaches are discussed to extract mean colour or featured colour information from the digital images of foods. These types of information may be of particular importance as colour indicates certain chemical changes or physical properties in foods. As exemplified here, the mean CIE a* value or browning ratio determined by means of computer vision-based image analysis algorithms can be correlated with acrylamide content of potato chips or cookies. Or, porosity index as an important physical property of breadcrumb can be calculated easily. In this respect, computer vision-based image analysis provides a useful tool for automatic inspection of food products in a manufacturing line, and it can be actively involved in the decision-making process where rapid quality/safety evaluation is needed. © 2013 Society of Chemical Industry.
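
    The mean-colour and featured-colour extraction described above can be sketched on a plain RGB pixel list; note that the "brown" band used below is an illustrative placeholder, not the paper's calibrated criterion:

```python
def mean_colour(pixels):
    """Mean R, G, B over a list of (r, g, b) tuples."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def browning_ratio(pixels, band=((90, 160), (40, 110), (10, 70))):
    """Fraction of pixels whose channels all fall inside a 'brown' band.
    The default band is a hypothetical placeholder for illustration."""
    def is_brown(p):
        return all(lo <= c <= hi for c, (lo, hi) in zip(p, band))
    return sum(is_brown(p) for p in pixels) / len(pixels)
```

    In practice the paper correlates CIE a* (a colour-opponent coordinate, requiring an sRGB-to-CIELAB conversion) with acrylamide content; the ratio above only illustrates the thresholding idea.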

  11. Reliability/safety analysis of a fly-by-wire system

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Goodman, H. A.

    1980-01-01

    An analysis technique has been developed to estimate the reliability of a very complex, safety-critical system by constructing a diagram of the reliability equations for the total system. This diagram has many of the characteristics of a fault-tree or success-path diagram, but is much easier to construct for complex redundant systems. The diagram provides insight into system failure characteristics and identifies the most likely failure modes. A computer program aids in the construction of the diagram and the computation of reliability. Analysis of the NASA F-8 Digital Fly-by-Wire Flight Control System is used to illustrate the technique.
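
    The abstract does not give the diagram's equations, but reliability diagrams for redundant systems compose standard building blocks such as k-of-n voting; a minimal sketch of that building block (my illustration, not the report's program):

```python
from math import comb

def k_of_n_reliability(k, n, r):
    """Probability that at least k of n identical, independent channels
    survive, each with reliability r (e.g. 2-of-3 voting redundancy)."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))
```

    For example, a 2-of-3 voted channel set with per-channel reliability 0.9 yields 3(0.9)^2(0.1) + (0.9)^3 = 0.972.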

  12. A Study on Urban Road Traffic Safety Based on Matter Element Analysis

    PubMed Central

    Hu, Qizhou; Zhou, Zhuping; Sun, Xu

    2014-01-01

    This paper examines a new evaluation of urban road traffic safety based on matter element analysis, avoiding the difficulties found in other traffic safety evaluations. The issue of urban road traffic safety has been investigated through matter element analysis theory. The chief aim of the present work is to investigate the features of urban road traffic safety. Emphasis was placed on the construction of a criterion function by which traffic safety is evaluated against a hierarchical system of objectives. Matter element analysis theory was used to create a comprehensive appraisal model of urban road traffic safety, employing a newly developed and versatile matter element analysis algorithm. The matter element matrix resolves the uncertainty and incompatibility of the factors used to assess urban road traffic safety. The application results showed the superiority of the evaluation model, and a didactic example is included to illustrate the computational procedure. PMID:25587267

  13. Knowledge management: Role of the Radiation Safety Information Computational Center (RSICC)

    NASA Astrophysics Data System (ADS)

    Valentine, Timothy

    2017-09-01

    The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.

  14. Safety Guided Design Based on Stamp/STPA for Manned Vehicle in Concept Design Phase

    NASA Astrophysics Data System (ADS)

    Ujiie, Ryo; Katahira, Masafumi; Miyamoto, Yuko; Umeda, Hiroki; Leveson, Nancy; Hoshino, Nobuyuki

    2013-09-01

    In manned vehicles such as the Soyuz and the Space Shuttle, the crew and computer system cooperate to return safely to Earth. While computers increase the functionality of the system, they also increase the complexity of the interaction between the controllers (human and computer) and the target dynamics. In some cases, that complexity can produce a serious accident. To prevent such losses, traditional hazard analyses such as FTA have been applied to system development; however, they can only be used after a detailed system has been created, because they focus on detailed component failures. As a result, it is more difficult to eliminate hazard causes early in the process, when doing so is most feasible. STAMP/STPA is a new hazard analysis technique that can be applied from the early development phase, with the analysis being refined as more detailed decisions are made. In essence, the analysis and design decisions are intertwined and go hand in hand. We have applied STAMP/STPA to a concept design of a new JAXA manned vehicle and attempted safety-guided design of the vehicle. This trial showed that STAMP/STPA can be accepted easily by system engineers, and the design was made more sophisticated from a safety viewpoint. The results also show that the consequences of human error on system safety can be analysed in the early development phase and the system designed to prevent them. Finally, the paper discusses an effective way to harmonize this safety-guided design approach with the systems engineering process, based on the experience gained in this project.

  15. A probability-based approach for assessment of roadway safety hardware.

    DOT National Transportation Integrated Search

    2017-03-14

    This report presents a general probability-based approach for assessment of roadway safety hardware (RSH). It was achieved using a reliability analysis method and computational techniques. With the development of high-fidelity finite element (FE) m...

  16. Annotation analysis for testing drug safety signals using unstructured clinical notes

    PubMed Central

    2012-01-01

    Background The electronic surveillance for adverse drug events is largely based upon the analysis of coded data from reporting systems. Yet, the vast majority of electronic health data lies embedded within the free text of clinical notes and is not gathered into centralized repositories. With the increasing access to large volumes of electronic medical data—in particular the clinical notes—it may be possible to computationally encode and to test drug safety signals in an active manner. Results We describe the application of simple annotation tools on clinical text and the mining of the resulting annotations to compute the risk of getting a myocardial infarction for patients with rheumatoid arthritis that take Vioxx. Our analysis clearly reveals elevated risks for myocardial infarction in rheumatoid arthritis patients taking Vioxx (odds ratio 2.06) before 2005. Conclusions Our results show that it is possible to apply annotation analysis methods for testing hypotheses about drug safety using electronic medical records. PMID:22541596
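
    The risk figure above is an odds ratio from a 2x2 contingency table (exposed/unexposed by case/control); a minimal sketch with illustrative counts, not the study's actual data:

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """OR = (a/b) / (c/d) = (a*d) / (b*c) for a 2x2 contingency table
    with a = exposed cases, b = exposed controls, c = unexposed cases,
    d = unexposed controls."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)
```

    With hypothetical counts of 30 exposed cases, 10 exposed controls, 20 unexposed cases, and 15 unexposed controls, the odds ratio is 2.25; an OR above 1 suggests elevated risk among the exposed.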

  17. Reactor Operations Monitoring System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, M.M.

    1989-01-01

    The Reactor Operations Monitoring System (ROMS) is a VME based, parallel processor data acquisition and safety action system designed by the Equipment Engineering Section and Reactor Engineering Department of the Savannah River Site. The ROMS will be analyzing over 8 million signal samples per minute. Sixty-eight microprocessors are used in the ROMS in order to achieve real-time data analysis. The ROMS is composed of multiple computer subsystems. Four redundant computer subsystems monitor 600 temperatures with 2400 thermocouples. Two computer subsystems share the monitoring of 600 reactor coolant flows. Additional computer subsystems are dedicated to monitoring 400 signals from assorted process sensors. Data from these computer subsystems are transferred to two redundant process display computer subsystems which present process information to reactor operators and to reactor control computers. The ROMS is also designed to carry out safety functions based on its analysis of process data. The safety functions include initiating a reactor scram (shutdown), the injection of neutron poison, and the loadshed of selected equipment. A complete development Reactor Operations Monitoring System has been built. It is located in the Program Development Center at the Savannah River Site and is currently being used by the Reactor Engineering Department in software development. The Equipment Engineering Section is designing and fabricating the process interface hardware. Upon proof of hardware and design concept, orders will be placed for the final five systems located in the three reactor areas, the reactor training simulator, and the hardware maintenance center.

  18. Numerical Computation of Homogeneous Slope Stability

    PubMed Central

    Xiao, Shuangshuang; Li, Kemin; Ding, Xiaohua; Liu, Tong

    2015-01-01

    To simplify the computational process of homogeneous slope stability, improve computational accuracy, and find multiple potential slip surfaces of a complex geometric slope, this study utilized the limit equilibrium method to derive expression equations of overall and partial factors of safety. This study transformed the solution of the minimum factor of safety (FOS) to solving of a constrained nonlinear programming problem and applied an exhaustive method (EM) and particle swarm optimization algorithm (PSO) to this problem. In simple slope examples, the computational results using an EM and PSO were close to those obtained using other methods. Compared to the EM, the PSO had a small computation error and a significantly shorter computation time. As a result, the PSO could precisely calculate the slope FOS with high efficiency. The example of the multistage slope analysis indicated that this slope had two potential slip surfaces. The factors of safety were 1.1182 and 1.1560, respectively. The differences between these and the minimum FOS (1.0759) were small, but the positions of the slip surfaces were completely different than the critical slip surface (CSS). PMID:25784927

  19. Numerical computation of homogeneous slope stability.

    PubMed

    Xiao, Shuangshuang; Li, Kemin; Ding, Xiaohua; Liu, Tong

    2015-01-01

    To simplify the computational process of homogeneous slope stability, improve computational accuracy, and find multiple potential slip surfaces of a complex geometric slope, this study utilized the limit equilibrium method to derive expression equations of overall and partial factors of safety. This study transformed the solution of the minimum factor of safety (FOS) to solving of a constrained nonlinear programming problem and applied an exhaustive method (EM) and particle swarm optimization algorithm (PSO) to this problem. In simple slope examples, the computational results using an EM and PSO were close to those obtained using other methods. Compared to the EM, the PSO had a small computation error and a significantly shorter computation time. As a result, the PSO could precisely calculate the slope FOS with high efficiency. The example of the multistage slope analysis indicated that this slope had two potential slip surfaces. The factors of safety were 1.1182 and 1.1560, respectively. The differences between these and the minimum FOS (1.0759) were small, but the positions of the slip surfaces were completely different than the critical slip surface (CSS).
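
    The FOS search above treats the slip surface as a constrained nonlinear programming problem solved by PSO. A minimal PSO sketch on a stand-in objective (in the real application the objective would be the limit-equilibrium FOS of a candidate slip surface; all parameter values here are conventional defaults, not the paper's):

```python
import random

def pso_minimise(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimiser over a box-bounded search space.
    Returns the best position found and its objective value."""
    rng = random.Random(seed)
    dim = len(bounds)
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]          # personal best positions
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                # clamp each coordinate to the search box
                xs[i][d] = min(max(xs[i][d] + vs[i][d], bounds[d][0]), bounds[d][1])
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = xs[i][:], fx
    return gbest, gbest_f
```

    On a smooth two-parameter surrogate with minimum value 1.05, the swarm converges close to the optimum, mirroring how the paper's PSO locates the minimum FOS faster than exhaustive search.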

  20. An analysis of electronic health record-related patient safety incidents.

    PubMed

    Palojoki, Sari; Mäkelä, Matti; Lehtonen, Lasse; Saranto, Kaija

    2017-06-01

    The aim of this study was to analyse electronic health record-related patient safety incidents in the patient safety incident reporting database in fully digital hospitals in Finland. We compare Finnish data to similar international data and discuss their content with regard to the literature. We analysed the types of electronic health record-related patient safety incidents that occurred at 23 hospitals during a 2-year period. A procedure of taxonomy mapping served to allow comparisons. This study represents a rare examination of patient safety risks in a fully digital environment. The proportion of electronic health record-related incidents was markedly higher in our study than in previous studies with similar data. Human-computer interaction problems were the most frequently reported. The results show the possibility of error arising from the complex interaction between clinicians and computers.

  1. A method for identifying EMI critical circuits during development of a large C3

    NASA Astrophysics Data System (ADS)

    Barr, Douglas H.

    The circuit analysis methods and process Boeing Aerospace used on a large, ground-based military command, control, and communications (C3) system are described. This analysis was designed to help identify electromagnetic interference (EMI) critical circuits. The methodology used the MIL-E-6051 equipment criticality categories as the basis for defining critical circuits, relational database technology to help sort through and account for all of the approximately 5000 system signal cables, and Macintosh Plus personal computers to predict critical circuits based on safety margin analysis. The EMI circuit analysis process systematically examined all system circuits to identify which ones were likely to be EMI critical. The process used two separate, sequential safety margin analyses to identify critical circuits (conservative safety margin analysis, and detailed safety margin analysis). These analyses used field-to-wire and wire-to-wire coupling models using both worst-case and detailed circuit parameters (physical and electrical) to predict circuit safety margins. This process identified the predicted critical circuits that could then be verified by test.
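
    The criticality screening above reduces to a safety margin comparison per circuit; a minimal sketch (the 6 dB and 20 dB figures follow MIL-E-6051's usual safety margin provisions, and the function names are my own):

```python
def emi_safety_margin_db(susceptibility_threshold_db, predicted_level_db):
    """Safety margin (dB): circuit susceptibility threshold minus the
    predicted coupled interference level (same reference units)."""
    return susceptibility_threshold_db - predicted_level_db

def is_emi_critical(margin_db, required_margin_db=6.0):
    """MIL-E-6051-style criterion: a circuit whose margin falls below the
    required value (6 dB typical; 20 dB for ordnance circuits) is flagged
    as EMI critical and carried forward for test verification."""
    return margin_db < required_margin_db
```

    A circuit with a -20 dBm threshold and a predicted -30 dBm coupled level has a 10 dB margin and passes; a predicted -18 dBm level yields a -2 dB margin and is flagged.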

  2. OECD/NEA expert group on uncertainty analysis for criticality safety assessment: Results of benchmark on sensitivity calculation (phase III)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanova, T.; Laville, C.; Dyrda, J.

    2012-07-01

    The sensitivities of the k_eff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods. (authors)

  3. Fire safety distances for open pool fires

    NASA Astrophysics Data System (ADS)

    Sudheer, S.; Kumar, Lokendra; Manjunath, B. S.; Pasi, Amit; Meenakshi, G.; Prabhu, S. V.

    2013-11-01

    Fire accidents that carry huge losses have increased more in the previous two decades than at any time in history. Hence, there is a need to understand the safety distances from different fires with different fuels. Fire safety distances are computed for different open pool fires. Diesel, gasoline and hexane are used as fuels for circular pool diameters of 0.5 m, 0.7 m and 1.0 m. A large 4 m × 4 m square pool fire is also conducted with diesel as the fuel. All the prescribed distances in this study are based purely on thermal analysis. An IR camera is used to obtain thermal images of the pool fires, and thereby the irradiance at different locations is computed. The computed irradiance is presented together with the threshold heat flux limits for human beings.
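
    The study above measures irradiance with an IR camera; a common hand-calculation alternative (my assumption, not the paper's method) is the point-source model, where the safety distance for a threshold flux q_thr follows from q = chi_r * Q / (4 * pi * R^2):

```python
import math

def point_source_safety_distance_m(hrr_kw, radiative_fraction, threshold_kw_m2):
    """Distance R at which the point-source incident flux
    q = chi_r * Q / (4*pi*R^2) drops to the threshold value, i.e.
    R = sqrt(chi_r * Q / (4*pi*q_thr)). Q is total heat release rate."""
    return math.sqrt(radiative_fraction * hrr_kw / (4.0 * math.pi * threshold_kw_m2))
```

    For a 1 MW fire with a radiative fraction of 0.3 and a 1.6 kW/m^2 threshold (a commonly cited no-harm limit for people), the model gives a distance of roughly 3.9 m; real pool-fire assessments, as in the paper, rely on measured irradiance instead.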

  4. Probabilistic design of fibre concrete structures

    NASA Astrophysics Data System (ADS)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation is now a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing or shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness and ductility are described in the paper. Since the variability of fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and for evaluation of structural performance, reliability and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety, characterized by failure probability or by reliability index. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the utilized approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty of the material properties, or their randomness obtained from material tests, is accounted for in the random distributions. Furthermore, degradation of reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for design of fibre concrete engineering structures based on advanced nonlinear computer analysis.
    The presented methodology is illustrated on results from two probabilistic studies with different types of concrete structures related to practical applications and made from various materials (with the parameters obtained from real material tests).
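
    The failure probability / reliability index pair above is related by beta = -Phi^{-1}(p_f). A minimal Monte Carlo sketch for independent normal resistance R and load effect S (the distributions below are illustrative, not from the paper):

```python
import random
from statistics import NormalDist

def failure_probability_mc(mu_r, sd_r, mu_s, sd_s, n=200_000, seed=1):
    """Monte Carlo estimate of p_f = P(resistance R < load effect S)
    for independent normally distributed R and S."""
    rng = random.Random(seed)
    fails = sum(rng.gauss(mu_r, sd_r) < rng.gauss(mu_s, sd_s) for _ in range(n))
    return fails / n

def reliability_index(p_f):
    """beta = -Phi^{-1}(p_f), the reliability index paired with p_f."""
    return -NormalDist().inv_cdf(p_f)
```

    With R ~ N(30, 4) and S ~ N(20, 3), the safety margin R - S is N(10, 5), so the exact values are beta = 2 and p_f = Phi(-2) = 0.0228; the Monte Carlo estimate lands close to both. Randomizing a nonlinear FE model, as the paper does, replaces the closed-form margin with simulation output but keeps the same p_f / beta reporting.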

  5. Selecting an Architecture for a Safety-Critical Distributed Computer System with Power, Weight and Cost Considerations

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
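
    The "overall value" selection step is commonly realised as a weighted sum over normalised criteria; a minimal sketch with hypothetical scores and weights (not the report's actual alternatives or numbers):

```python
def overall_value(scores, weights):
    """Weighted-sum overall value of one alternative. Criteria scores are
    on a common 0-1 scale (higher is better); weights sum to 1."""
    return sum(w * s for w, s in zip(weights, scores))

def select_architecture(alternatives, weights):
    """Return the name of the alternative with the highest overall value.
    `alternatives` maps a name to its [power, weight, cost] scores."""
    return max(alternatives,
               key=lambda name: overall_value(alternatives[name], weights))
```

    With hypothetical scores {"A": [0.9, 0.6, 0.4], "B": [0.6, 0.8, 0.9]} and weights [0.5, 0.25, 0.25], alternative B wins at 0.725 versus A's 0.700; re-running the selection under different weight vectors is one way to probe sensitivity to the relative importance of cost, as the report does iteratively.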

  6. Current implementation and future plans on new code architecture, programming language and user interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brun, B.

    1997-07-01

    Computer technology has improved tremendously in recent years, with larger media capacity, more memory and more computational power. Visual computing with high-performance graphic interfaces and desktop computational power has changed the way engineers accomplish everyday tasks, development, and safety analysis studies. The emergence of parallel computing will permit simulation over larger domains. In addition, new development methods, languages and tools have appeared in the last several years.

  7. Navier-Stokes flow field analysis of compressible flow in a high pressure safety relief valve

    NASA Technical Reports Server (NTRS)

    Vu, Bruce; Wang, Ten-See; Shih, Ming-Hsin; Soni, Bharat

    1993-01-01

    The objective of this study is to investigate the complex three-dimensional flowfield of an oxygen safety pressure relief valve during an incident with a computational fluid dynamics (CFD) analysis. Specifically, the analysis provides a flow pattern that helps explain the eventual erosion pattern of the hardware, so that it can be combined with other findings to piece together the most likely scenario for the investigation. The CFD model is a pressure-based solver. An adaptive upwind difference scheme is employed for the spatial discretization, and a predictor/multiple-corrector method is used for the velocity-pressure coupling. The computational results indicated vortex formation near the opening of the valve, which matched the erosion pattern of the damaged hardware.

  8. Consumer product safety: A systems problem

    NASA Technical Reports Server (NTRS)

    Clark, C. C.

    1971-01-01

    The manufacturer, tester, retailer, consumer, repairer, disposer, trade and professional associations, national and international standards bodies, and governments in several roles are all involved in consumer product safety. A preliminary analysis, drawing on system safety techniques, is utilized to distinguish the inter-relations of these many groups and the responsibilities that they take or could take for product safety, including the slow accident hazards as well as the more commonly discussed fast accident hazards. The importance of interactive computer-aided information flow among these groups is particularly stressed.

  9. Computational modelling of ovine critical-sized tibial defects with implanted scaffolds and prediction of the safety of fixator removal.

    PubMed

    Doyle, Heather; Lohfeld, Stefan; Dürselen, Lutz; McHugh, Peter

    2015-04-01

    Computational model geometries of tibial defects with two types of implanted tissue engineering scaffolds, β-tricalcium phosphate (β-TCP) and poly-ε-caprolactone (PCL)/β-TCP, are constructed from µ-CT scan images of the real in vivo defects. Simulations of each defect under four-point bending and under simulated in vivo axial compressive loading are performed. The mechanical stability of each defect is analysed using stress distribution analysis. The results of this analysis highlights the influence of callus volume, and both scaffold volume and stiffness, on the load-bearing abilities of these defects. Clinically-used image-based methods to predict the safety of removing external fixation are evaluated for each defect. Comparison of these measures with the results of computational analyses indicates that care must be taken in the interpretation of these measures. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Safety Network to Detect Performance Degradation and Pilot Incapacitation (Reseau de securite pour detecter la degradation des performances et la defaillance du pilote)

    DTIC Science & Technology

    1990-09-01

    Military pilot acceptance of a safety network system would be based, as always, on the following: a. Do I really need such a system, and will it be a... The system infers pilot state based on computer analysis of pilot control inputs (or lack thereof). Having decided that the pilot is incapacitated, PMAS would alert... The advances being made in neural network computing machinery have necessitated a complete re-thinking of the conventional serial von Neumann machine.

  11. Modeling and Analysis of Mixed Synchronous/Asynchronous Systems

    NASA Technical Reports Server (NTRS)

    Driscoll, Kevin R.; Madl, Gabor; Hall, Brendan

    2012-01-01

    Practical safety-critical distributed systems must integrate safety-critical and non-critical data on a common platform. Safety-critical systems almost always consist of isochronous components that have synchronous or asynchronous interfaces with other components. Many of these systems also support a mix of synchronous and asynchronous interfaces. This report presents a study on the modeling and analysis of asynchronous, synchronous, and mixed synchronous/asynchronous systems. We build on the SAE Architecture Analysis and Design Language (AADL) to capture architectures for analysis. We present preliminary work targeted at capturing mixed low- and high-criticality data, as well as real-time properties, in a common Model of Computation (MoC). An abstract but representative test specimen system was created as the system to be modeled.

  12. The feasibility of implementing the data analysis and reporting techniques (DART) package in Virginia.

    DOT National Transportation Integrated Search

    1980-01-01

    This project was undertaken for the Virginia Department of Transportation Safety to assess the feasibility of implementing the Data Analysis and Reporting Techniques (DART) computer software system in Virginia. Following a review of available literat...

  13. 78 FR 47804 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ..., ``Configuration Management Plans for Digital Computer Software used in Safety Systems of Nuclear Power Plants... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., Reviews, and Audits for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This...

  14. Program Multi - A Multi-purpose Program for Computing and Graphing Roots and Values for Any Real Function -- Users/Programmers Manual

    DOT National Transportation Integrated Search

    1976-05-01

    As part of its activity under the Rail Equipment Safety Project, computer programs for track/train dynamics analysis are being developed and modified. As part of this effort, derailment behavior of trains negotiating curves under buff or draft has be...

  15. Research and technology at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Cryogenic engineering, hypergolic engineering, hazardous warning, structures and mechanics, computer sciences, communications, meteorology, technology applications, safety engineering, materials analysis, biomedicine, and engineering management and training aids research are reviewed.

  16. Parallel computation safety analysis irradiation targets fission product molybdenum in neutronic aspect using the successive over-relaxation algorithm

    NASA Astrophysics Data System (ADS)

    Susmikanti, Mike; Dewayatna, Winter; Sulistyo, Yos

    2014-09-01

    One of the research activities in support of the commercial radioisotope production program is safety research on FPM (Fission Product Molybdenum) target irradiation. An FPM target is a stainless-steel tube containing nuclear-grade high-enrichment uranium, and the tube is irradiated to obtain fission products. Fission products such as Mo-99 are widely used in the form of kits in the medical world. The Mo-99 half-life of about 3 days (66 hours) is long enough that delivery of the radioisotope to consumer centers and storage is possible, though still limited, and production of this isotope potentially gives significant economic value. The neutronics problem is solved using first-order perturbation theory derived from the four-group diffusion equation. The criticality and flux in the multigroup diffusion model were calculated for various irradiation positions and uranium contents. This model involves complex computation with a large, sparse matrix system, and several parallel algorithms have been developed for solving such systems. In this paper, a successive over-relaxation (SOR) algorithm was implemented so that the calculation of reactivity coefficients can be done in parallel; previous work performed reactivity calculations serially with Gauss-Seidel iteration. The parallel method can be used to solve the multigroup diffusion equation system and calculate the criticality and reactivity coefficients. In this research a computer code was developed that exploits parallel processing on a multicore computer system to perform the reactivity calculations used in safety analysis more quickly. The code was applied to calculate the safety limits of irradiated FPM targets containing highly enriched uranium. The neutron calculation results show that for uranium contents of 1.7676 g and 6.1866 g (× 10^6 cm^-1) in a tube, the delta reactivities are still within safety limits; for 7.9542 g and 8.838 g (× 10^6 cm^-1), the limits were exceeded.
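    The SOR iteration named in the abstract can be sketched in a few lines. The matrix below is a small diagonally dominant stand-in, not the actual multigroup diffusion system, and the relaxation factor omega is an illustrative choice.

```python
import numpy as np

def sor_solve(A, b, omega=1.5, tol=1e-10, max_iter=10_000):
    """Solve A x = b by successive over-relaxation (SOR).

    Works for diagonally dominant (or SPD) systems with 0 < omega < 2;
    omega = 1 reduces to Gauss-Seidel, the serial baseline the abstract
    mentions.
    """
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # Use already-updated entries x[:i] and old entries x_old[i+1:].
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
        if np.linalg.norm(x - x_old, ord=np.inf) < tol:
            break
    return x

# Small diagonally dominant test system (illustrative only).
A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
b = np.array([15.0, 10.0, 10.0])
x = sor_solve(A, b)
```

    In the parallel setting described by the paper, the sweep is typically reorganized (e.g. red-black ordering) so that independent unknowns can be relaxed concurrently.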

  17. Application verification research of cloud computing technology in the field of real time aerospace experiment

    NASA Astrophysics Data System (ADS)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    According to the requirements of real-time operation, reliability and safety for aerospace experiments, a single-center cloud computing technology application verification platform is constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments is tested and verified. Based on analysis of the test results, a preliminary conclusion is obtained: a cloud computing platform can be applied to compute-intensive aerospace experiment workloads, while for I/O-intensive workloads the traditional physical machine is recommended.

  18. 29 CFR 1960.59 - Training of employees and employee representatives.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... specialized job safety and health training appropriate to the work performed by the employee, for example: Clerical; printing; welding; crane operation; chemical analysis, and computer operations. Such training...

  19. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be weighed to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, the paper also presents detailed computing processes and steps, including selecting the indexes of each order, establishing the index matrix, computing score values for each index, computing the synthesis score, sorting the selected schemes, and making the analysis and decision. The presented method can offer a valuable reference for risk computation in building construction projects.
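    The entropy step of such an evaluation can be sketched as follows; the index names and scores are hypothetical, not taken from the paper's case study. Indexes whose scores vary little across schemes carry high entropy and therefore receive low weight.

```python
import numpy as np

def entropy_weights(X):
    """Entropy-weight method: rows are candidate schemes, columns are
    evaluation indexes (here assumed benefit-type, larger is better)."""
    P = X / X.sum(axis=0)                  # normalize each index column
    k = 1.0 / np.log(X.shape[0])
    # Treat 0 * log(0) as 0 to avoid NaNs for zero entries.
    E = -k * np.where(P > 0, P * np.log(P), 0.0).sum(axis=0)
    return (1 - E) / (1 - E).sum()         # weights sum to 1

# Hypothetical scaled scores of 3 schemes on cost, progress, quality, safety.
X = np.array([[0.8, 0.6, 0.9, 0.7],
              [0.6, 0.8, 0.7, 0.9],
              [0.9, 0.7, 0.8, 0.8]])
w = entropy_weights(X)
synthesis = X @ w                  # synthesis score per scheme
ranking = np.argsort(-synthesis)   # best scheme first
```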

  20. Analyses of track shift under high-speed vehicle-track interaction : safety of high speed ground transportation systems

    DOT National Transportation Integrated Search

    1997-06-01

    This report describes analysis tools to predict shift under high-speed vehicle- : track interaction. The analysis approach is based on two fundamental models : developed (as part of this research); the first model computes the track lateral : residua...

  1. CAD/CAE-technologies application for assessment of passenger safety on railway transport in emergency

    NASA Astrophysics Data System (ADS)

    Antipin, D. Ya; Shorokhov, S. G.; Bondarenko, O. I.

    2018-03-01

    A possibility of using current software products realizing CAD/CAE-technologies for the assessment of passenger safety in emergency cases on railway transport has been analyzed. On the basis of the developed solid computer model of an anthropometric dummy, the authors carried out an analysis of possible levels of passenger injury during accident collision of a train with an obstacle.

  2. Inherent Conservatism in Deterministic Quasi-Static Structural Analysis

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1997-01-01

    The cause of the long-suspected excessive conservatism in the prevailing structural deterministic safety factor has been identified as an inherent violation of the error propagation laws when reducing statistical data to deterministic values and then combining them algebraically through successive structural computational processes. These errors are restricted to the applied stress computations, and because the means and variations of the tolerance-limit format are added, the errors are positive, serially cumulative, and excessively conservative. Reliability methods circumvent these errors and provide more efficient and uniformly safe structures. The document is a tutorial on the deficiencies and nature of the current safety factor and on its improvement and transition to absolute reliability.
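    The conservatism the abstract describes can be illustrated numerically: converting each independent term to its tolerance limit and then adding always exceeds the limit obtained by adding means and root-sum-squaring the deviations, which is what the error-propagation laws prescribe for independent terms. The load numbers below are invented for illustration.

```python
import math

# Hypothetical: three independent stress contributions, each with mean m_i
# and standard deviation s_i, combined into one applied stress.
means = [100.0, 60.0, 40.0]
sigmas = [9.0, 6.0, 4.0]
k = 3.0  # tolerance-limit factor (roughly a 3-sigma one-sided limit)

# Deterministic practice criticized in the abstract: take each term's
# tolerance limit first, then add -- the deviations accumulate serially.
serial_limit = sum(m + k * s for m, s in zip(means, sigmas))

# Error-propagation-consistent combination: add the means, then
# root-sum-square the deviations of independent terms.
rss_limit = sum(means) + k * math.sqrt(sum(s * s for s in sigmas))

conservatism = serial_limit - rss_limit  # non-negative by construction
```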

  3. Requirements for a multifunctional code architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tiihonen, O.; Juslin, K.

    1997-07-01

    The present paper studies a set of requirements for a multifunctional simulation software architecture in the light of experience gained in developing and using the APROS simulation environment. The huge steps taken in the development of computer hardware and software during the last ten years are changing the status of traditional nuclear safety analysis software. The affordable computing power on the safety analyst's table by far exceeds the possibilities offered ten years ago. At the same time, the features of everyday office software tend to set standards for the way input data and calculational results are managed.

  4. Implementing Software Safety in the NASA Environment

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer-related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies and sound software and assurance practices to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of the system to be built. Shortly thereafter, as the system requirements are being defined, the second iteration of hazard analyses takes place, the systems hazard analysis (SHA). During the systems requirements phase, decisions are made as to which functions of the system will be the responsibility of software. This is the most critical time to affect the safety of the software. From this point, software safety analyses as well as software engineering practices are the main focus for assuring safe software. While many of the steps proposed in this paper seem like just sound engineering practices, they are the best technical and most cost-effective means to assure safe software within a safe system.

  5. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Baggs, Rhoda

    2007-01-01

    Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally ascertaining a software safety risk assessment that provides measurements of software safety for legacy systems which may or may not have the suite of software engineering documentation that is now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.

  6. Analysis on the security of cloud computing

    NASA Astrophysics Data System (ADS)

    He, Zhonglin; He, Yuhua

    2011-02-01

    Cloud computing is a new technology, a fusion of computer technology and Internet development, that will lead the revolution of the IT and information fields. However, in cloud computing, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy; the resulting safety problems are the difficult point in improving the quality of cloud services. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspectives of both cloud computing users and service providers.

  7. Overview of Risk Mitigation for Safety-Critical Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This report presents a high-level overview of a general strategy to mitigate the risks from threats to safety-critical computer-based systems. In this context, a safety threat is a process or phenomenon that can cause operational safety hazards in the form of computational system failures. This report is intended to provide insight into the safety-risk mitigation problem and the characteristics of potential solutions. The limitations of the general risk mitigation strategy are discussed and some options to overcome these limitations are provided. This work is part of an ongoing effort to enable well-founded assurance of safety-related properties of complex safety-critical computer-based aircraft systems by developing an effective capability to model and reason about the safety implications of system requirements and design.

  8. The Operation of a Specialized Scientific Information and Data Analysis Center With Computer Base and Associated Communications Network.

    ERIC Educational Resources Information Center

    Cottrell, William B.; And Others

    The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…

  9. Traffic information computing platform for big data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin

    The big data environment creates the data conditions for improving the quality of traffic information services. The target of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technical characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users.

  10. Safety Verification of the Small Aircraft Transportation System Concept of Operations

    NASA Technical Reports Server (NTRS)

    Carreno, Victor; Munoz, Cesar

    2005-01-01

    A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is assured through a rigorous process that involves human factors, low- and high-fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from it are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state-exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operations complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), which is a computer-based specification language and theorem-proving assistant.
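    The exhaustive state exploration that distinguishes formal methods from simulation can be sketched as a reachability search that flags deadlocked states. The two-aircraft toy model below is purely illustrative; it is not the SATS ConOps model, and PVS itself is a theorem prover, not an enumerator like this.

```python
from collections import deque

def explore(initial, transitions):
    """Enumerate every reachable state (breadth-first) and flag
    deadlocks: states with no outgoing transition.
    `transitions` maps a state to an iterable of successor states."""
    seen = {initial}
    deadlocks = []
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        succs = transitions(s)
        if not succs:
            deadlocks.append(s)
        for t in succs:
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return seen, deadlocks

# Hypothetical model: two aircraft at slots 0..3 on an approach path;
# either may advance one slot if at least one slot of separation remains.
def moves(state):
    a, b = state
    out = []
    if a + 1 < b:       # trailing aircraft keeps separation
        out.append((a + 1, b))
    if b + 1 <= 3:      # leading aircraft has not reached the last slot
        out.append((a, b + 1))
    return out

states, deadlocks = explore((0, 2), moves)
# The search finds the state (2, 3), where neither aircraft can move:
# exactly the kind of corner case exhaustive exploration is meant to expose.
```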

  11. Computer-based training for safety: comparing methods with older and younger workers.

    PubMed

    Wallen, Erik S; Mulloy, Karen B

    2006-01-01

    Computer-based safety training is becoming more common and is being delivered to an increasingly aging workforce. Aging results in a number of changes that make it more difficult to learn from certain types of computer-based training. Instructional designs derived from cognitive learning theories may overcome some of these difficulties. Three versions of computer-based respiratory safety training were shown to older and younger workers who then took a high and a low level learning test. Younger workers did better overall. Both older and younger workers did best with the version containing text with pictures and audio narration. Computer-based training with pictures and audio narration may be beneficial for workers over 45 years of age. Computer-based safety training has advantages but workers of different ages may benefit differently. Computer-based safety programs should be designed and selected based on their ability to effectively train older as well as younger learners.

  12. Preliminary Results Obtained in Integrated Safety Analysis of NASA Aviation Safety Program Technologies

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.

    2003-01-01

    The goal of the NASA Aviation Safety Program (AvSP) is to develop and demonstrate technologies that contribute to a reduction in the aviation fatal accident rate by a factor of 5 by the year 2007 and by a factor of 10 by the year 2022. Integrated safety analysis of day-to-day operations and risks within those operations will provide an understanding of the Aviation Safety Program portfolio. Safety benefits analyses are currently being conducted. Preliminary results for the Synthetic Vision Systems (SVS) and Weather Accident Prevention (WxAP) projects of the AvSP have been completed by the Logistics Management Institute under a contract with the NASA Glenn Research Center. These analyses include both a reliability analysis and a computer simulation model. The integrated safety analysis method comprises two principal components: a reliability model and a simulation model. In the reliability model, the results indicate how different technologies and systems will perform in normal, degraded, and failed modes of operation. In the simulation, an operational scenario is modeled. The primary purpose of the SVS project is to improve safety by providing visual-flightlike situation awareness during instrument conditions. The current analyses are an estimate of the benefits of SVS in avoiding controlled flight into terrain. The scenario modeled has an aircraft flying directly toward a terrain feature. When the flight crew determines that the aircraft is headed toward an obstruction, the aircraft executes a level turn at speed. The simulation is ended when the aircraft completes the turn.

  13. Benchmark On Sensitivity Calculation (Phase III)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanova, Tatiana; Laville, Cedric; Dyrda, James

    2012-01-01

    The sensitivities of the keff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods.

  14. Towards An Engineering Discipline of Computational Security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mili, Ali; Sheldon, Frederick T; Jilani, Lamia Labed

    2007-01-01

    George Boole ushered in the era of modern logic by arguing that logical reasoning does not fall in the realm of philosophy, as it was considered up to his time, but in the realm of mathematics. As such, logical propositions and logical arguments are modeled using algebraic structures. Likewise, we submit that security attributes must be modeled as formal mathematical propositions that are subject to mathematical analysis. In this paper, we approach this problem by attempting to model security attributes in a refinement-like framework that has traditionally been used to represent reliability and safety claims. Keywords: Computable security attributes, survivability, integrity, dependability, reliability, safety, security, verification, testing, fault tolerance.

  15. Overview of Design, Lifecycle, and Safety for Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This document describes the need and justification for the development of a design guide for safety-relevant computer-based systems. This document also makes a contribution toward the design guide by presenting an overview of computer-based systems design, lifecycle, and safety.

  16. Older People's Perceptions of Pedestrian Friendliness and Traffic Safety: An Experiment Using Computer-Simulated Walking Environments.

    PubMed

    Kahlert, Daniela; Schlicht, Wolfgang

    2015-08-21

    Traffic safety and pedestrian friendliness are considered to be important conditions for older people's motivation to walk through their environment. This study uses an experimental design with computer-simulated living environments to investigate the effect of micro-scale environmental factors (parking spaces and green verges with trees) on older people's perceptions of both motivational antecedents (dependent variables). Seventy-four consecutively recruited older people were randomly assigned to watch one of two scenarios (independent variable) on a computer screen. The scenarios simulated a stroll on a sidewalk typical of a German city. In version 'A', the subjects take a fictive walk on a sidewalk on which a number of cars are partially parked. In version 'B', cars are in parking spaces separated from the sidewalk by grass verges and trees. Subjects assessed their impressions of both dependent variables. A multivariate analysis of covariance showed that subjects' ratings of perceived traffic safety and pedestrian friendliness were higher for version 'B' than for version 'A'. Cohen's d indicates medium (d = 0.73) and large (d = 1.23) effect sizes for traffic safety and pedestrian friendliness, respectively. The study suggests that elements of the built environment might affect motivational antecedents of older people's walking behavior.
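    Cohen's d, the effect-size measure reported above, is the difference in group means divided by the pooled standard deviation. A minimal sketch with invented ratings (not the study's data):

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d for two independent groups, using the pooled SD.
    By convention, |d| around 0.5 is a medium effect and 0.8+ is large."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (mb - ma) / pooled_sd

# Hypothetical 1-5 ratings for scenario 'A' vs scenario 'B'.
a = [2.8, 3.1, 2.5, 3.0, 2.7]
b = [3.6, 3.9, 3.4, 4.0, 3.5]
d = cohens_d(a, b)  # positive d: scenario 'B' rated higher
```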

  17. A fully Bayesian before-after analysis of permeable friction course (PFC) pavement wet weather safety.

    PubMed

    Buddhavarapu, Prasad; Smit, Andre F; Prozzi, Jorge A

    2015-07-01

    Permeable friction course (PFC), a porous hot-mix asphalt, is typically applied to improve wet weather safety on high-speed roadways in Texas. In order to warrant expensive PFC construction, a statistical evaluation of its safety benefits is essential. Generally, the literature on the effectiveness of porous mixes in reducing wet-weather crashes is limited and often inconclusive. In this study, the safety effectiveness of PFC was evaluated using a fully Bayesian before-after safety analysis. First, two groups of road segments overlaid with PFC and non-PFC material were identified across Texas; the non-PFC or reference road segments selected were similar to their PFC counterparts in terms of site specific features. Second, a negative binomial data generating process was assumed to model the underlying distribution of crash counts of PFC and reference road segments to perform Bayesian inference on the safety effectiveness. A data-augmentation based computationally efficient algorithm was employed for a fully Bayesian estimation. The statistical analysis shows that PFC is not effective in reducing wet weather crashes. It should be noted that the findings of this study are in agreement with the existing literature, although these studies were not based on a fully Bayesian statistical analysis. Our study suggests that the safety effectiveness of PFC road surfaces, or any other safety infrastructure, largely relies on its interrelationship with the road user. The results suggest that the safety infrastructure must be properly used to reap the benefits of the substantial investments. Copyright © 2015 Elsevier Ltd. All rights reserved.
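    A drastically simplified sketch of a fully Bayesian before-after comparison follows. It uses a conjugate Gamma-Poisson model for each group's crash rate rather than the paper's negative binomial likelihood (the NB arises by mixing the Poisson rate over a gamma distribution), and the crash counts are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical wet-weather crash counts per segment-year.
pfc_counts = np.array([3, 1, 4, 2, 3, 2])   # segments overlaid with PFC
ref_counts = np.array([2, 3, 2, 1, 4, 2])   # comparable reference segments

# Poisson likelihood with a Gamma(a0, b0) prior on each group's crash
# rate gives a Gamma(a0 + sum(counts), b0 + n) posterior; we draw from
# both posteriors and compare them directly.
a0, b0 = 1.0, 1.0
n_draws = 50_000
post_pfc = rng.gamma(a0 + pfc_counts.sum(),
                     1.0 / (b0 + len(pfc_counts)), n_draws)
post_ref = rng.gamma(a0 + ref_counts.sum(),
                     1.0 / (b0 + len(ref_counts)), n_draws)

# Posterior probability that PFC segments have the lower crash rate.
p_effective = (post_pfc < post_ref).mean()
```

    A value of `p_effective` near 0.5 would mirror the paper's finding of no clear safety benefit; the full analysis additionally controls for site features and overdispersion.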

  18. Micro Computer Tomography for medical device and pharmaceutical packaging analysis.

    PubMed

    Hindelang, Florine; Zurbach, Raphael; Roggo, Yves

    2015-04-10

    Biomedical device and pharmaceutical product manufacturing are long processes facing global competition. As technology evolves, the level of quality, safety and reliability increases simultaneously. Micro Computer Tomography (Micro CT) is a tool allowing deep investigation of products and can contribute to quality improvement. This article presents the numerous applications of Micro CT for medical device and pharmaceutical packaging analysis. The samples investigated confirmed the suitability of CT for verification of integrity, measurements and defect detection in a non-destructive manner. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Computer use, language, and literacy in safety net clinic communication

    PubMed Central

    Barton, Jennifer L; Lyles, Courtney R; Wu, Michael; Yelin, Edward H; Martinez, Diana; Schillinger, Dean

    2017-01-01

    Objective: Patients with limited health literacy (LHL) and limited English proficiency (LEP) experience suboptimal communication and health outcomes. Electronic health record implementation in safety net clinics may affect communication with LHL and LEP patients. We investigated the associations between safety net clinician computer use and patient-provider communication for patients with LEP and LHL. Materials and Methods: We video-recorded encounters at 5 academically affiliated US public hospital clinics between English- and Spanish-speaking patients with chronic conditions and their primary and specialty care clinicians. We analyzed changes in communication behaviors (coded with the Roter Interaction Analysis System) with each additional point on a clinician computer use score, controlling for clinician type and visit length and stratified by English proficiency and health literacy status. Results: Greater clinician computer use was associated with more biomedical statements (+12.4, P = .03) and less positive affect (−0.6, P < .01) from LEP/LHL patients. In visits with patients with adequate English proficiency/health literacy, greater clinician computer use was associated with less positive patient affect (−0.9, P < .01), fewer clinician psychosocial statements (−3.5, P < .05), greater clinician verbal dominance (+0.09, P < .01), and lower ratings on quality of care and communication. Conclusion: Higher clinician computer use was associated with more biomedical focus with LEP/LHL patients, and clinician verbal dominance and lower ratings with patients with adequate English proficiency and health literacy. Discussion: Implementation research should explore interventions to enhance relationship-centered communication for diverse patient populations in the computer era. PMID:27274017

  20. Computer use, language, and literacy in safety net clinic communication.

    PubMed

    Ratanawongsa, Neda; Barton, Jennifer L; Lyles, Courtney R; Wu, Michael; Yelin, Edward H; Martinez, Diana; Schillinger, Dean

    2017-01-01

    Patients with limited health literacy (LHL) and limited English proficiency (LEP) experience suboptimal communication and health outcomes. Electronic health record implementation in safety net clinics may affect communication with LHL and LEP patients.We investigated the associations between safety net clinician computer use and patient-provider communication for patients with LEP and LHL. We video-recorded encounters at 5 academically affiliated US public hospital clinics between English- and Spanish-speaking patients with chronic conditions and their primary and specialty care clinicians. We analyzed changes in communication behaviors (coded with the Roter Interaction Analysis System) with each additional point on a clinician computer use score, controlling for clinician type and visit length and stratified by English proficiency and health literacy status. Greater clinician computer use was associated with more biomedical statements (+12.4, P = .03) and less positive affect (-0.6, P < .01) from LEP/LHL patients. In visits with patients with adequate English proficiency/health literacy, greater clinician computer use was associated with less positive patient affect (-0.9, P < .01), fewer clinician psychosocial statements (-3.5, P < .05), greater clinician verbal dominance (+0.09, P < .01), and lower ratings on quality of care and communication. Higher clinician computer use was associated with more biomedical focus with LEP/LHL patients, and clinician verbal dominance and lower ratings with patients with adequate English proficiency and health literacy. Implementation research should explore interventions to enhance relationship-centered communication for diverse patient populations in the computer era. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  1. 77 FR 50723 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., ``Verification, Validation, Reviews, and Audits for Digital Computer Software used in Safety Systems of Nuclear... NRC regulations promoting the development of, and compliance with, software verification and...

  2. Spartan Release Engagement Mechanism (REM) stress and fracture analysis

    NASA Technical Reports Server (NTRS)

    Marlowe, D. S.; West, E. J.

    1984-01-01

    The revised stress and fracture analysis of the Spartan REM hardware for current load conditions and mass properties is presented. The stress analysis was performed using a NASTRAN math model of the Spartan REM adapter, base, and payload. Appendix A contains the material properties, loads, and stress analysis of the hardware. The computer output and model description are in Appendix B. Factors of safety used in the stress analysis were 1.4 on tested items and 2.0 on all other items. Fracture analysis of the items considered fracture critical was accomplished using the MSFC Crack Growth Analysis code. Loads and stresses were obtained from the stress analysis. The fracture analysis notes are located in Appendix A and the computer output in Appendix B. All items analyzed met design and fracture criteria.

  3. Safety of High Speed Ground Transportation Systems : Analytical Methodology for Safety Validation of Computer Controlled Subsystems : Volume 2. Development of a Safety Validation Methodology

    DOT National Transportation Integrated Search

    1995-01-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  4. A Research Roadmap for Computation-Based Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  5. Reactivity effects of moderator voids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahlfeld, C.E.; Pryor, R.J.

    1975-01-01

    Reactivity worths for large moderator voids similar to those produced by steaming in postulated reactor transients were measured in the Process Development Pile (PDP) reactor. The experimental results were compared to the computed void worths obtained from techniques currently used in routine safety analyses. Neutron energy spectrum measurements were used to verify a modified lattice pattern that correctly computed the measured spectrum, and consequently, improved macroscopic cross sections. In addition, a special two-dimensional transport calculation was performed to obtain an axially defined diffusion coefficient for the void region. The combination of the modified lattice calculations and the axial diffusion coefficient yielded void reactivity worths which agreed very well with experiment. It was concluded that the computational modules available in the JOSHUA system (GLASS, GRIMHX) would yield accurate void reactivity worths in SLR--SRP safety analysis studies, provided the above mentioned modifications were made.
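    The void worths compared in this study are differences in reactivity between the nominal and voided configurations, each derived from an effective multiplication factor. A minimal sketch of that bookkeeping, using hypothetical eigenvalues rather than values from the PDP measurements:

    ```python
    def reactivity(k_eff):
        """Reactivity (dimensionless, delta-k/k) from an effective
        multiplication factor: rho = (k - 1) / k."""
        return (k_eff - 1.0) / k_eff

    def void_worth(k_nominal, k_voided, in_pcm=True):
        """Reactivity worth of a void: the change in reactivity when the
        moderator region is voided, optionally in pcm (1e-5 delta-k/k)."""
        worth = reactivity(k_voided) - reactivity(k_nominal)
        return worth * 1e5 if in_pcm else worth

    # Hypothetical eigenvalues for nominal and voided configurations.
    print(round(void_worth(1.00000, 0.99500), 1))  # -502.5 pcm: the void removes reactivity
    ```

    A negative worth indicates the void makes the system less reactive; a positive worth would signal a safety concern in a steaming transient.
    
    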

  6. 76 FR 40943 - Notice of Issuance of Regulatory Guide

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-12

    ..., Revision 3, ``Criteria for Use of Computers in Safety Systems of Nuclear Power Plants.'' FOR FURTHER..., ``Criteria for Use of Computers in Safety Systems of Nuclear Power Plants,'' was issued with a temporary... Fuel Reprocessing Plants,'' to 10 CFR part 50 with regard to the use of computers in safety systems of...

  7. CASKS (Computer Analysis of Storage casKS): A microcomputer based analysis system for storage cask design review. User's manual to Version 1b (including program reference)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, T.F.; Gerhard, M.A.; Trummer, D.J.

    CASKS (Computer Analysis of Storage casKS) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent-fuel storage casks. The bulk of the complete program and this user's manual are based upon the SCANS (Shipping Cask ANalysis System) program previously developed at LLNL. A number of enhancements and improvements were added to the original SCANS program to meet requirements unique to storage casks. CASKS is an easy-to-use system that calculates global response of storage casks to impact loads, pressure loads and thermal conditions. This provides reviewers with a tool for an independent check on analyses submitted by licensees. CASKS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.

  8. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We build upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.
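    The first approach, a verification component that randomly samples run-time traces and checks design-level properties, can be sketched as a sampling monitor. Everything below (the names, the pressure invariant, the sampling rate) is illustrative, not the paper's actual construction:

    ```python
    import random

    def make_monitor(invariant, sample_rate=0.1, on_violation=None):
        """Return a run-time observer that randomly samples events with the
        given probability and checks a design-level invariant; a violation
        triggers the supplied fail-safe/notification callback."""
        def observe(state):
            if random.random() < sample_rate and not invariant(state):
                if on_violation:
                    on_violation(state)
                return False  # deviation from the design specification
            return True
        return observe

    violations = []
    monitor = make_monitor(
        invariant=lambda s: s["pressure"] <= s["limit"],  # hypothetical safety property
        sample_rate=1.0,                                  # sample every event for the demo
        on_violation=violations.append,
    )
    trace = [{"pressure": p, "limit": 100} for p in (90, 95, 105)]
    results = [monitor(s) for s in trace]
    print(results)          # [True, True, False]
    print(len(violations))  # 1
    ```

    In practice the sample rate trades monitoring overhead against detection latency, which is the point of querying traces randomly rather than exhaustively.
    
    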

  9. Older People’s Perceptions of Pedestrian Friendliness and Traffic Safety: An Experiment Using Computer-Simulated Walking Environments

    PubMed Central

    Kahlert, Daniela; Schlicht, Wolfgang

    2015-01-01

    Traffic safety and pedestrian friendliness are considered to be important conditions for older people’s motivation to walk through their environment. This study uses an experimental study design with computer-simulated living environments to investigate the effect of micro-scale environmental factors (parking spaces and green verges with trees) on older people’s perceptions of both motivational antecedents (dependent variables). Seventy-four consecutively recruited older people were randomly assigned to watch one of two scenarios (independent variable) on a computer screen. The scenarios simulated a stroll on a sidewalk, as it is ‘typical’ for a German city. In version ‘A,’ the subjects take a fictive walk on a sidewalk where a number of cars are parked partially on it. In version ‘B’, cars are in parking spaces separated from the sidewalk by grass verges and trees. Subjects assessed their impressions of both dependent variables. A multivariate analysis of covariance showed that subjects’ ratings on perceived traffic safety and pedestrian friendliness were higher for version ‘B’ than for version ‘A’. Cohen’s d indicates medium (d = 0.73) and large (d = 1.23) effect sizes for traffic safety and pedestrian friendliness, respectively. The study suggests that elements of the built environment might affect motivational antecedents of older people’s walking behavior. PMID:26308026
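    Cohen's d for two independent groups, as reported here, is the mean difference divided by the pooled standard deviation. A minimal sketch with invented ratings (not the study's data):

    ```python
    import math

    def cohens_d(group_a, group_b):
        """Cohen's d for two independent groups, using the pooled
        (sample) standard deviation as the denominator."""
        n_a, n_b = len(group_a), len(group_b)
        mean_a = sum(group_a) / n_a
        mean_b = sum(group_b) / n_b
        var_a = sum((x - mean_a) ** 2 for x in group_a) / (n_a - 1)
        var_b = sum((x - mean_b) ** 2 for x in group_b) / (n_b - 1)
        pooled_sd = math.sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2))
        return (mean_b - mean_a) / pooled_sd

    # Hypothetical perception ratings for version 'A' and version 'B'.
    a = [3.0, 3.5, 2.5, 3.0]
    b = [4.0, 4.5, 3.5, 4.0]
    print(round(cohens_d(a, b), 2))  # 2.45
    ```

    By the usual convention, d around 0.5 is a medium effect and d above 0.8 a large one, which is how the study's 0.73 and 1.23 are classified.
    
    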

  10. Air traffic surveillance and control using hybrid estimation and protocol-based conflict resolution

    NASA Astrophysics Data System (ADS)

    Hwang, Inseok

    The continued growth of air travel and recent advances in new technologies for navigation, surveillance, and communication have led to proposals by the Federal Aviation Administration (FAA) to provide reliable and efficient tools to aid Air Traffic Control (ATC) in performing their tasks. In this dissertation, we address four problems frequently encountered in air traffic surveillance and control: multiple-target tracking and identity management, conflict detection, conflict resolution, and safety verification. We develop a set of algorithms and tools to aid ATC. These algorithms have the provable properties of safety, computational efficiency, and convergence. Firstly, we develop a multiple-maneuvering-target tracking and identity management algorithm which can keep track of maneuvering aircraft in noisy environments and of their identities. Secondly, we propose a hybrid probabilistic conflict detection algorithm between multiple aircraft which uses flight mode estimates as well as aircraft current state estimates. Our algorithm is based on hybrid models of aircraft, which incorporate both continuous dynamics and discrete mode switching. Thirdly, we develop an algorithm for multiple (greater than two) aircraft conflict avoidance that is based on a closed-form analytic solution and thus provides guarantees of safety. Finally, we consider the problem of safety verification of control laws for safety critical systems, with application to air traffic control systems. We approach safety verification through reachability analysis, which is a computationally expensive problem. We develop an over-approximate method for reachable set computation using polytopic approximation methods and dynamic optimization. These algorithms may be used either in a fully autonomous way, or as supporting tools to increase controllers' situational awareness and to reduce their workload.
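    Pairwise conflict detection is often grounded in the predicted closest point of approach under straight-line motion; the dissertation's hybrid probabilistic algorithm is considerably richer, but the basic geometric check can be sketched as follows (positions, velocities, and the 5-unit separation minimum are all illustrative):

    ```python
    def closest_approach(p1, v1, p2, v2):
        """Time and distance of closest approach for two aircraft under
        constant-velocity (straight-line) 2-D motion; time clamped to the
        future (t >= 0)."""
        dx, dy = p2[0] - p1[0], p2[1] - p1[1]
        dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
        dv2 = dvx * dvx + dvy * dvy
        t = 0.0 if dv2 == 0 else max(0.0, -(dx * dvx + dy * dvy) / dv2)
        cx, cy = dx + dvx * t, dy + dvy * t
        return t, (cx * cx + cy * cy) ** 0.5

    def conflict(p1, v1, p2, v2, minimum_separation=5.0):
        """Flag a conflict when predicted separation falls below the minimum."""
        _, d = closest_approach(p1, v1, p2, v2)
        return d < minimum_separation

    # Head-on encounter 10 units apart, closing at 2 units per time step.
    print(conflict((0, 0), (1, 0), (10, 0), (-1, 0)))   # True: separation shrinks to 0
    print(conflict((0, 0), (1, 0), (10, 20), (-1, 0)))  # False: lateral offset stays 20
    ```

    A probabilistic detector of the kind proposed replaces the single straight-line prediction with a distribution over trajectories driven by flight-mode estimates.
    
    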

  11. Posttest analysis of LOFT LOCE L2-3 using the ESA RELAP4 blowdown model. [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perryman, J.L.; Samuels, T.K.; Cooper, C.H.

    A posttest analysis of the blowdown portion of Loss-of-Coolant Experiment (LOCE) L2-3, which was conducted in the Loss-of-Fluid Test (LOFT) facility, was performed using the experiment safety analysis (ESA) RELAP4/MOD5 computer model. Measured experimental parameters were compared with the calculations in order to assess the conservatisms in the ESA RELAP4/MOD5 model.

  12. 2007 Beyond SBIR Phase II: Bringing Technology Edge to the Warfighter

    DTIC Science & Technology

    2007-08-23

    Systems Trade-Off Analysis and Optimization; Verification and Validation; On-Board Diagnostics and Self-Healing; Security and Anti-Tampering; Rapid...verification; Safety and reliability analysis of flight and mission critical systems; On-Board Diagnostics and Self-Healing; Model-based monitoring and... self-healing; On-board diagnostics and self-healing; Autonomic computing; Network intrusion detection and prevention; Anti-Tampering and Trust

  13. Integrated Hybrid System Architecture for Risk Analysis

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.; Fonseca, Daniel J.; Ray, Paul S.

    2010-01-01

    A conceptual design has been announced of an expert-system computer program, and the development of a prototype of the program, intended for use as a project-management tool. The program integrates schedule and risk data for the purpose of determining the schedule implications of safety risks and, somewhat conversely, the effects of schedule changes on safety. It is noted that the design has been delivered to a NASA client and that it is planned to disclose the design in a conference presentation.

  14. Response surface method in geotechnical/structural analysis, phase 1

    NASA Astrophysics Data System (ADS)

    Wong, F. S.

    1981-02-01

    In the response surface approach, an approximating function is fit to a long-running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of response surface development and the feasibility of the method are shown using a sample problem in slope stability which is based on data from centrifuge experiments of model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed based on as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
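    The core of the response surface method: run the expensive code at a few sample points, fit a cheap approximating function, and use that surrogate inside the repetitive statistical loop. A one-parameter sketch (the quadratic stand-in "code" and the sample points are invented for illustration):

    ```python
    def response_surface(samples):
        """Build a quadratic response surface from three runs of an
        'expensive' code, returned as a cheap surrogate function
        (Lagrange interpolation through the three points)."""
        (x0, y0), (x1, y1), (x2, y2) = samples
        def surrogate(x):
            l0 = (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
            l1 = (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
            l2 = (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1))
            return y0 * l0 + y1 * l1 + y2 * l2
        return surrogate

    def expensive_code(x):
        """Stand-in for the long-running analysis code."""
        return 2.0 + 0.5 * x + 0.1 * x * x

    # Three 'code calculations' are enough to build the surrogate, which
    # can then be evaluated thousands of times in a Monte Carlo loop.
    surface = response_surface([(x, expensive_code(x)) for x in (0.0, 5.0, 10.0)])
    print(round(surface(3.0), 3))  # 4.4, matching expensive_code(3.0)
    ```

    The real method fits a surface over several random parameters (five in the slope-stability problem) and checks the fit against held-out code runs; the one-dimensional version only shows the shape of the idea.
    
    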

  15. Analytical methodology for safety validation of computer controlled subsystems. Volume 1 : state-of-the-art and assessment of safety verification/validation methodologies

    DOT National Transportation Integrated Search

    1995-09-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  16. DYNALIST II : A Computer Program for Stability and Dynamic Response Analysis of Rail Vehicle Systems : Volume 4. Revised User's Manual.

    DOT National Transportation Integrated Search

    1976-07-01

    The Federal Railroad Administration (FRA) is sponsoring research, development, and demonstration programs to provide improved safety, performance, speed, reliability, and maintainability of rail transportation systems at reduced life-cycle costs. A m...

  17. An evaluation of computer-aided disproportionality analysis for post-marketing signal detection.

    PubMed

    Lehman, H P; Chen, J; Gould, A L; Kassekert, R; Beninger, P R; Carney, R; Goldberg, M; Goss, M A; Kidos, K; Sharrar, R G; Shields, K; Sweet, A; Wiholm, B E; Honig, P K

    2007-08-01

    To understand the value of computer-aided disproportionality analysis (DA) in relation to current pharmacovigilance signal detection methods, four products were retrospectively evaluated by applying an empirical Bayes method to Merck's post-marketing safety database. Findings were compared with the prior detection of labeled post-marketing adverse events. Disproportionality ratios (empirical Bayes geometric mean lower 95% bounds for the posterior distribution (EBGM05)) were generated for product-event pairs. Overall (1993-2004 data, EBGM05 ≥ 2, individual terms) results of signal detection using DA compared to standard methods were sensitivity, 31.1%; specificity, 95.3%; and positive predictive value, 19.9%. Using groupings of synonymous labeled terms, sensitivity improved (40.9%). More of the adverse events detected by both methods were detected earlier using DA and grouped (versus individual) terms. With 1939-2004 data, diagnostic properties were similar to those from 1993 to 2004. DA methods using Merck's safety database demonstrate sufficient sensitivity and specificity to be considered for use as an adjunct to conventional signal detection methods.
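    Sensitivity, specificity, and positive predictive value follow directly from a 2x2 table of signal-detection outcomes. A sketch with hypothetical counts chosen only to land near the reported percentages (the study's actual contingency table is not given here):

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Sensitivity, specificity, and positive predictive value (PPV)
        from a 2x2 table: true/false positives, false/true negatives."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)
        return sensitivity, specificity, ppv

    # Invented counts of the right order of magnitude to reproduce
    # roughly 31.1% / 95.3% / 19.9% (not the study's real data).
    sens, spec, ppv = diagnostic_metrics(tp=28, fp=113, fn=62, tn=2290)
    print(round(sens, 3), round(spec, 3), round(ppv, 3))  # 0.311 0.953 0.199
    ```

    The pattern (high specificity, low PPV) is typical when true signals are rare: even a small false-positive rate over many product-event pairs swamps the true positives.
    
    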

  18. Parallel computation of multigroup reactivity coefficient using iterative method

    NASA Astrophysics Data System (ADS)

    Susmikanti, Mike; Dewayatna, Winter

    2013-09-01

    One of the research activities supporting the commercial radioisotope production program is safety research on the irradiation of FPM (Fission Product Molybdenum) targets. An FPM target forms a tube made of stainless steel on which high-enriched uranium is superimposed, and the tube is irradiated to obtain fission products; the fission material is widely used in kit form in nuclear medicine. Irradiating FPM tubes in the reactor core would disturb its performance, and one such disturbance comes from changes in flux, i.e., reactivity. It is therefore necessary to study a method for calculating the safety of ongoing configuration changes during the life of the reactor, and making the code faster becomes an absolute necessity. With the perturbation method, the neutron safety margin for the research reactor can be re-evaluated without modifying the reactivity calculation, which is an advantage of that approach. The criticality and flux in a multigroup diffusion model were calculated at various irradiation positions for several uranium contents. This model entails complex computation, and several parallel algorithms with iterative methods have been developed for solving the resulting large, sparse matrix systems. The Red-Black Gauss-Seidel iteration and a parallel power iteration method can be used to solve the multigroup diffusion equation system and to calculate the criticality and reactivity coefficient. This research developed a code for reactivity calculation, one element of safety analysis, with parallel processing; the calculation can be done more quickly and efficiently by exploiting the parallelism of a multicore computer. The code was applied to the safety-limit calculation for irradiated FPM targets with increasing uranium content.
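    The power iteration named above finds the dominant eigenvalue of an operator, which is how the criticality (k-effective) eigenproblem of the multigroup diffusion equations is typically solved. A serial, dense-matrix sketch of the scheme (the paper's parallel sparse version distributes these same matrix-vector products across cores):

    ```python
    def power_iteration(matrix, iterations=200):
        """Power iteration for the dominant eigenvalue of a small dense
        matrix: repeated matrix-vector products with infinity-norm
        renormalization, as in the criticality eigenproblem."""
        n = len(matrix)
        x = [1.0] * n           # initial flux guess
        eig = 0.0
        for _ in range(iterations):
            y = [sum(matrix[i][j] * x[j] for j in range(n)) for i in range(n)]
            eig = max(abs(v) for v in y)   # eigenvalue estimate
            x = [v / eig for v in y]       # renormalized flux shape
        return eig

    # Toy symmetric operator; its dominant eigenvalue is 3.
    A = [[2.0, 1.0],
         [1.0, 2.0]]
    print(round(power_iteration(A), 6))  # 3.0
    ```

    The Red-Black Gauss-Seidel iteration enters as the inner solver for each within-group diffusion solve; its two-color ordering is what makes the sweep parallelizable.
    
    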

  19. Computational toxicity in 21st century safety sciences (China ...

    EPA Pesticide Factsheets

    Presentation at the Joint Meeting of Analytical Toxicology and Computational Toxicology Committee (Chinese Society of Toxicology) International Workshop on Advanced Chemical Safety Assessment Technologies, 11 May 2016, Fuzhou University, Fuzhou, China.

  20. 14 CFR 415.123 - Computing systems and software.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Computing systems and software. 415.123... Launch Vehicle From a Non-Federal Launch Site § 415.123 Computing systems and software. (a) An applicant's safety review document must describe all computing systems and software that perform a safety...

  1. 14 CFR 415.123 - Computing systems and software.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Computing systems and software. 415.123... Launch Vehicle From a Non-Federal Launch Site § 415.123 Computing systems and software. (a) An applicant's safety review document must describe all computing systems and software that perform a safety...

  2. 14 CFR 415.123 - Computing systems and software.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Computing systems and software. 415.123... Launch Vehicle From a Non-Federal Launch Site § 415.123 Computing systems and software. (a) An applicant's safety review document must describe all computing systems and software that perform a safety...

  3. 14 CFR 415.123 - Computing systems and software.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Computing systems and software. 415.123... Launch Vehicle From a Non-Federal Launch Site § 415.123 Computing systems and software. (a) An applicant's safety review document must describe all computing systems and software that perform a safety...

  4. 14 CFR 415.123 - Computing systems and software.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Computing systems and software. 415.123... Launch Vehicle From a Non-Federal Launch Site § 415.123 Computing systems and software. (a) An applicant's safety review document must describe all computing systems and software that perform a safety...

  5. Loosely Coupled GPS-Aided Inertial Navigation System for Range Safety

    NASA Technical Reports Server (NTRS)

    Heatwole, Scott; Lanzi, Raymond J.

    2010-01-01

    The Autonomous Flight Safety System (AFSS) aims to replace the human element of range safety operations, as well as reduce reliance on expensive, downrange assets for launches of expendable launch vehicles (ELVs). The system consists of multiple navigation sensors and flight computers that provide a highly reliable platform. It is designed to ensure that single-event failures in a flight computer or sensor will not bring down the whole system. The flight computer uses a rules-based structure derived from range safety requirements to make decisions whether or not to destroy the rocket.
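    A rules-based destruct decision of the kind described can be sketched as a set of predicates over vehicle state, where any violated rule triggers termination. The rule names and thresholds below are hypothetical illustrations, not AFSS's actual mission rules:

    ```python
    def afss_decision(state, rules):
        """Evaluate a rules-based flight-termination decision: each rule is
        a named predicate on vehicle state; any violated rule is recorded
        as a reason for the TERMINATE action."""
        violated = [name for name, ok in rules.items() if not ok(state)]
        return ("TERMINATE", violated) if violated else ("CONTINUE", [])

    # Hypothetical rule set loosely modeled on range-safety boundary checks.
    rules = {
        "inside_corridor": lambda s: abs(s["crossrange_km"]) <= 10.0,
        "below_iip_limit": lambda s: s["iip_range_km"] <= 500.0,
    }
    print(afss_decision({"crossrange_km": 3.0, "iip_range_km": 120.0}, rules))
    print(afss_decision({"crossrange_km": 14.2, "iip_range_km": 120.0}, rules))
    ```

    Returning the violated rule names, not just a verdict, mirrors the need for an auditable decision record; redundant flight computers would each run this evaluation and vote so that a single-event failure cannot force a destruct.
    
    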

  6. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Plutonium Metals, Oxides, and Solutions on the High Performance Computing Platform Moonlight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Bryan Scott; Gough, Sean T.

    This report documents a validation of the MCNP6 Version 1.0 computer code on the high performance computing platform Moonlight, for operations at Los Alamos National Laboratory (LANL) that involve plutonium metals, oxides, and solutions. The validation is conducted using the ENDF/B-VII.1 continuous energy group cross section library at room temperature. The results are for use by nuclear criticality safety personnel in performing analysis and evaluation of various facility activities involving plutonium materials.

  7. Pilots of the future - Human or computer?

    NASA Technical Reports Server (NTRS)

    Chambers, A. B.; Nagel, D. C.

    1985-01-01

    In connection with the occurrence of aircraft accidents and the evolution of the air-travel system, questions arise regarding the computer's potential for making fundamental contributions to improving the safety and reliability of air travel. An important result of an analysis of the causes of aircraft accidents is the conclusion that humans - 'pilots and other personnel' - are implicated in well over half of the accidents which occur. Over 70 percent of the incident reports contain evidence of human error. In addition, almost 75 percent show evidence of an 'information-transfer' problem. Thus, the question arises whether improvements in air safety could be achieved by removing humans from control situations. In an attempt to answer this question, it is important to take into account also certain advantages which humans have in comparison to computers. Attention is given to human error and the effects of technology, the motivation to automate, aircraft automation at the crossroads, the evolution of cockpit automation, and pilot factors.

  8. Preliminary Results Obtained in Integrated Safety Analysis of NASA Aviation Safety Program Technologies

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This is a listing of recent unclassified RTO technical publications processed by the NASA Center for AeroSpace Information from January 1, 2001 through March 31, 2001 available on the NASA Aeronautics and Space Database. Contents include 1) Cognitive Task Analysis; 2) RTO Educational Notes; 3) The Capability of Virtual Reality to Meet Military Requirements; 4) Aging Engines, Avionics, Subsystems and Helicopters; 5) RTO Meeting Proceedings; 6) RTO Technical Reports; 7) Low Grazing Angle Clutter...; 8) Verification and Validation Data for Computational Unsteady Aerodynamics; 9) Space Observation Technology; 10) The Human Factor in System Reliability...; 11) Flight Control Design...; 12) Commercial Off-the-Shelf Products in Defense Applications.

  9. Non-standard analysis and embedded software

    NASA Technical Reports Server (NTRS)

    Platek, Richard

    1995-01-01

    One model for computing in the future is ubiquitous, embedded computational devices analogous to embedded electrical motors. Many of these computers will control physical objects and processes. Such hidden computerized environments introduce new safety and correctness concerns whose treatment goes beyond present Formal Methods. In particular, one has to begin to speak about Real Space software in analogy with Real Time software. By this we mean computerized systems which have to meet requirements expressed in the real geometry of space. How to translate such requirements into ordinary software specifications and how to carry out proofs is a major challenge. In this talk we propose a research program based on the use of non-standard analysis. Much detail remains to be carried out. The purpose of the talk is to inform the Formal Methods community that Non-Standard Analysis provides a possible avenue of attack which we believe will be fruitful.

  10. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  11. SCALE Code System 6.2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  12. Fault trees for decision making in systems analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Howard E.

    1975-10-09

    The application of fault tree analysis (FTA) to system safety and reliability is presented within the framework of system safety analysis. The concepts and techniques involved in manual and automated fault tree construction are described and their differences noted. The theory of mathematical reliability pertinent to FTA is presented with emphasis on engineering applications. An outline of the quantitative reliability techniques of the Reactor Safety Study is given. Concepts of probabilistic importance are presented within the fault tree framework and applied to the areas of system design, diagnosis and simulation. The computer code IMPORTANCE ranks basic events and cut sets according to a sensitivity analysis. A useful feature of the IMPORTANCE code is that it can accept relative failure data as input. The output of the IMPORTANCE code can assist an analyst in finding weaknesses in system design and operation, suggest the most optimal course of system upgrade, and determine the optimal location of sensors within a system. A general simulation model of system failure in terms of fault tree logic is described. The model is intended for efficient diagnosis of the causes of system failure in the event of a system breakdown. It can also be used to assist an operator in making decisions under a time constraint regarding the future course of operations. The model is well suited for computer implementation. New results incorporated in the simulation model include an algorithm to generate repair checklists on the basis of fault tree logic and a one-step-ahead optimization procedure that minimizes the expected time to diagnose system failure.
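    Quantitatively, the rare-event approximation gives the top-event probability of a fault tree as the sum, over minimal cut sets, of the product of the basic-event probabilities in each set. A sketch with an invented two-cut-set tree (the event names and probabilities are illustrative):

    ```python
    def top_event_probability(cut_sets, p):
        """Rare-event approximation for a fault tree's top-event
        probability: sum over minimal cut sets of the product of the
        basic-event probabilities in each cut set."""
        total = 0.0
        for cut in cut_sets:
            prod = 1.0
            for event in cut:
                prod *= p[event]   # AND within a cut set
            total += prod          # OR across cut sets (rare-event approx.)
        return total

    # Hypothetical tree: the top event occurs if A fails alone,
    # or if B and C fail together.
    p = {"A": 1e-3, "B": 1e-2, "C": 5e-2}
    cut_sets = [("A",), ("B", "C")]
    print(round(top_event_probability(cut_sets, p), 6))  # 0.0015
    ```

    Sensitivity rankings of the kind the IMPORTANCE code produces can be read off the same structure: a basic event matters more when it appears in cut sets that dominate this sum.
    
    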

  13. Improving freight fire safety : experiment testing and computer modeling to further development of mist-controlling additives for fire mitigation.

    DOT National Transportation Integrated Search

    2012-08-01

    With the purpose of minimizing or preventing crash-induced fires in road and rail transportation, current interest in bio-derived and blended transportation fuels is increasing. Based on two years of preliminary testing and analysis, it appears to...

  14. Trends in HFE Methods and Tools and Their Applicability to Safety Reviews

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Hara, J.M.; Plott, C.; Milanski, J.

    2009-09-30

    The U.S. Nuclear Regulatory Commission (NRC) conducts human factors engineering (HFE) safety reviews of applicant submittals for new plants and for changes to existing plants. The reviews include the evaluation of the methods and tools (M&T) used by applicants as part of their HFE program. The technology used to perform HFE activities has been rapidly evolving, resulting in a whole new generation of HFE M&Ts. The objectives of this research were to identify the current trends in HFE methods and tools, determine their applicability to NRC safety reviews, and identify topics for which the NRC may need additional guidance to support the NRC's safety reviews. We conducted a survey that identified over 100 new HFE M&Ts. The M&Ts were assessed to identify general trends. Seven trends were identified: Computer Applications for Performing Traditional Analyses, Computer-Aided Design, Integration of HFE Methods and Tools, Rapid Development Engineering, Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. We assessed each trend to determine its applicability to the NRC's review by considering (1) whether the nuclear industry is making use of M&Ts for each trend, and (2) whether M&Ts reflecting the trend can be reviewed using the current design review guidance. We concluded that M&T trends that are applicable to the commercial nuclear industry and are expected to impact safety reviews may be considered for review guidance development. Three trends fell into this category: Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. The other trends do not need to be addressed at this time.

  15. Multiobjective optimisation of bogie suspension to boost speed on curves

    NASA Astrophysics Data System (ADS)

    Milad Mousavi-Bideleh, Seyed; Berbyuk, Viktor

    2016-01-01

    To improve safety and maximum admissible speed on different operational scenarios, multiobjective optimisation of bogie suspension components of a one-car railway vehicle model is considered. The vehicle model has 50 degrees of freedom and is developed in the multibody dynamics software SIMPACK. Track shift force, running stability, and risk of derailment are selected as safety objective functions. The improved maximum admissible speeds of the vehicle on curves are determined based on track plane accelerations up to 1.5 m/s2. To reduce the number of design parameters for optimisation and improve the computational efficiency, a global sensitivity analysis is accomplished using the multiplicative dimensional reduction method (M-DRM). A multistep optimisation routine based on a genetic algorithm (GA) and MATLAB/SIMPACK co-simulation is executed at three levels. The bogie conventional secondary and primary suspension components are chosen as the design parameters in the first two steps, respectively. In the last step, semi-active suspension is in focus. The input electrical current to magnetorheological yaw dampers is optimised to guarantee an appropriate safety level. Semi-active controllers are also applied and the respective effects on bogie dynamics are explored. The safety Pareto optimised results are compared with those associated with in-service values. The global sensitivity analysis and multistep approach significantly reduced the number of design parameters and improved the computational efficiency of the optimisation. Furthermore, using the optimised values of the design parameters makes it possible to run the vehicle up to 13% faster on curves while a satisfactory safety level is guaranteed. The results obtained can be used in Pareto optimisation and active bogie suspension design problems.

  16. Using Computational Toxicology to Enable Risk-Based ...

    EPA Pesticide Factsheets

    Slide presentation at Drug Safety Gordon Research Conference 2016 on research efforts in NCCT to enable Computational Toxicology to support risk assessment.

  17. Computer vision in the poultry industry

    USDA-ARS?s Scientific Manuscript database

    Computer vision is becoming increasingly important in the poultry industry due to increasing use and speed of automation in processing operations. Growing awareness of food safety concerns has helped add food safety inspection to the list of tasks that automated computer vision can assist. Researc...

  18. Promoting the safe and strategic use of technology for victims of intimate partner violence: evaluation of the technology safety project.

    PubMed

    Finn, Jerry; Atkinson, Teresa

    2009-11-01

    The Technology Safety Project of the Washington State Coalition Against Domestic Violence was designed to increase awareness and knowledge of technology safety issues for domestic violence victims, survivors, and advocacy staff. The project used a "train-the-trainer" model and provided computer and Internet resources to domestic violence service providers to (a) increase safe computer and Internet access for domestic violence survivors in Washington, (b) reduce the risk posed by abusers by educating survivors about technology safety and privacy, and (c) increase the ability of survivors to help themselves and their children through information technology. Evaluation of the project suggests that the program is needed, useful, and effective. Consumer satisfaction was high, and there was perceived improvement in computer confidence and knowledge of computer safety. Areas for future program development and further research are discussed.

  19. TRIGRS - A Fortran Program for Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability Analysis, Version 2.0

    USGS Publications Warehouse

    Baum, Rex L.; Savage, William Z.; Godt, Jonathan W.

    2008-01-01

    The Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability Model (TRIGRS) is a Fortran program designed for modeling the timing and distribution of shallow, rainfall-induced landslides. The program computes transient pore-pressure changes, and attendant changes in the factor of safety, due to rainfall infiltration. The program models rainfall infiltration, resulting from storms that have durations ranging from hours to a few days, using analytical solutions for partial differential equations that represent one-dimensional, vertical flow in isotropic, homogeneous materials for either saturated or unsaturated conditions. Use of step-function series allows the program to represent variable rainfall input, and a simple runoff routing model allows the user to divert excess water from impervious areas onto more permeable downslope areas. The TRIGRS program uses a simple infinite-slope model to compute factor of safety on a cell-by-cell basis. An approximate formula for effective stress in unsaturated materials aids computation of the factor of safety in unsaturated soils. Horizontal heterogeneity is accounted for by allowing material properties, rainfall, and other input values to vary from cell to cell. This command-line program is used in conjunction with geographic information system (GIS) software to prepare input grids and visualize model results.
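    The cell-by-cell calculation TRIGRS performs can be illustrated with a simplified infinite-slope factor-of-safety sketch. This is a hedged approximation of the published formulation, and all parameter values below are assumed for illustration rather than taken from the program:

```python
import math

# Simplified infinite-slope factor of safety for one grid cell, of the
# general form TRIGRS uses:
#   FS = tan(phi)/tan(beta) + [c - psi*gamma_w*tan(phi)] / [gamma_s*z*sin(beta)*cos(beta)]
# where beta is slope angle, phi friction angle, c cohesion, z depth,
# and psi the transient pressure head. All values below are assumed.
def factor_of_safety(slope_deg, depth_m, cohesion_kpa, phi_deg,
                     pressure_head_m, gamma_soil=20.0, gamma_water=9.81):
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    frictional = math.tan(phi) / math.tan(beta)
    cohesive = (cohesion_kpa - pressure_head_m * gamma_water * math.tan(phi)) \
               / (gamma_soil * depth_m * math.sin(beta) * math.cos(beta))
    return frictional + cohesive

# Dry cell (zero pressure head) vs. the same cell after infiltration:
print(round(factor_of_safety(35, 2.0, 4.0, 30.0, 0.0), 2))
print(round(factor_of_safety(35, 2.0, 4.0, 30.0, 1.5), 2))
```

    Infiltration raises the pressure head, which erodes the cohesive term and can drive the factor of safety below 1, the instability threshold the program maps cell by cell.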

  20. Development of a Reduced-Order Three-Dimensional Flow Model for Thermal Mixing and Stratification Simulation during Reactor Transients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui

    2017-09-03

    Mixing, thermal-stratification, and mass transport phenomena in large pools or enclosures play major roles for the safety of reactor systems. Depending on the fidelity requirement and computational resources, various modeling methods, from the 0-D perfect mixing model to 3-D Computational Fluid Dynamics (CFD) models, are available. Each is associated with its own advantages and shortcomings. It is very desirable to develop an advanced and efficient thermal mixing and stratification modeling capability embedded in a modern system analysis code to improve the accuracy of reactor safety analyses and to reduce modeling uncertainties. An advanced system analysis tool, SAM, is being developed at Argonne National Laboratory for advanced non-LWR reactor safety analysis. While SAM is being developed as a system-level modeling and simulation tool, a reduced-order three-dimensional module is under development to model the multi-dimensional flow and thermal mixing and stratification in large enclosures of reactor systems. This paper provides an overview of the three-dimensional finite element flow model in SAM, including the governing equations, stabilization scheme, and solution methods. Additionally, several verification and validation tests are presented, including lid-driven cavity flow, natural convection inside a cavity, and laminar flow in a channel of parallel plates. Based on the comparisons with the analytical solutions and experimental results, it is demonstrated that the developed 3-D fluid model can perform very well for a wide range of flow problems.

  1. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

    Safety standards contain technical and process-oriented safety requirements. Technical requirements are those such as "must work" and "must not work" functions in the system; process-oriented requirements are software engineering and safety management process requirements. Some standards address the system perspective, and some cover just the software in the system. NASA-STD-8719.13B, Software Safety Standard, is the current standard of interest; NASA programs/projects will have their own set of safety requirements derived from the standard. Safety cases: (a) a documented demonstration that a system complies with the specified safety requirements; (b) evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]; (c) problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.

  2. 78 FR 47014 - Configuration Management Plans for Digital Computer Software Used in Safety Systems of Nuclear...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION... Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses, with clarifications... Electrical and Electronic Engineers (IEEE) Standard 828-2005, ``IEEE Standard for Software Configuration...

  3. An evaluation of The Great Escape: can an interactive computer game improve young children's fire safety knowledge and behaviors?

    PubMed

    Morrongiello, Barbara A; Schwebel, David C; Bell, Melissa; Stewart, Julia; Davis, Aaron L

    2012-07-01

    Fire is a leading cause of unintentional injury and, although young children are at particularly increased risk, there are very few evidence-based resources available to teach them fire safety knowledge and behaviors. Using a pre-post randomized design, the current study evaluated the effectiveness of a computer game (The Great Escape) for teaching fire safety information to young children (3.5-6 years). Using behavioral enactment procedures, children's knowledge and behaviors related to fire safety were compared to a control group of children before and after receiving the intervention. The results indicated significant improvements in knowledge and fire safety behaviors in the intervention group but not the control. Using computer games can be an effective way to promote young children's understanding of safety and how to react in different hazardous situations.

  4. The research of computer network security and protection strategy

    NASA Astrophysics Data System (ADS)

    He, Jian

    2017-05-01

    With the widespread popularity of computer network applications, network security has also received a high degree of attention. The factors affecting network safety are complex, so maintaining network security is systematic work that presents a high challenge. Addressing the safety and reliability problems of computer network systems, this paper draws on practical work experience to offer suggestions and measures covering network security threats, security technology, and system design principles, so that users of computer networks can enhance their safety awareness and master certain network security techniques.

  5. Safety Analysis of FMS/CTAS Interactions During Aircraft Arrivals

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.

    1998-01-01

    This grant funded research on human-computer interaction design and analysis techniques, using future ATC environments as a testbed. The basic approach was to model the nominal behavior of both the automated and human procedures and then to apply safety analysis techniques to these models. Our previous modeling language, RSML, had been used to specify the system requirements for TCAS II for the FAA. Using the lessons learned from this experience, we designed a new modeling language that (among other things) incorporates features to assist in designing less error-prone human-computer interactions and interfaces and in detecting potential HCI problems, such as mode confusion. The new language, SpecTRM-RL, uses "intent" abstractions, based on Rasmussen's abstraction hierarchy, and includes both informal (English and graphical) specifications and formal, executable models for specifying various aspects of the system. One of the goals for our language was to highlight the system modes and mode changes to assist in identifying the potential for mode confusion. Three published papers resulted from this research. The first builds on the work of Degani on mode confusion to identify aspects of the system design that could lead to potential hazards. We defined and modeled modes differently than Degani and also defined design criteria for SpecTRM-RL models. Our design criteria include the Degani criteria but extend them to include more potential problems. In a second paper, Leveson and Palmer showed how the criteria for indirect mode transitions could be applied to a mode confusion problem found in several ASRS reports for the MD-88. In addition, we defined a visual task modeling language that can be used by system designers to model human-computer interaction. The visual models can be translated into SpecTRM-RL models, and then the SpecTRM-RL suite of analysis tools can be used to perform formal and informal safety analyses on the task model in isolation or integrated with the rest of the modeled system. We had hoped to be able to apply these modeling languages and analysis tools to a TAP air/ground trajectory negotiation scenario, but the development of the tools took more time than we anticipated.

  6. 78 FR 47805 - Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ... Documents Access and Management System (ADAMS): You may access publicly available documents online in the... Management Plans for Digital Computer Software used in Safety Systems of Nuclear Power Plants,'' issued for... Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION: Revision...

  7. The development of regulatory expectations for computer-based safety systems for the UK nuclear programme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, P. J.; Westwood, R.N; Mark, R. T.

    2006-07-01

    The Nuclear Installations Inspectorate (NII) of the UK's Health and Safety Executive (HSE) has recently completed a review of its Safety Assessment Principles (SAPs) for Nuclear Installations. During the period of the SAPs review in 2004-2005, the designers of future UK naval reactor plant were optioneering the control and protection systems that might be implemented. Because there was insufficient regulatory guidance available in the naval sector to support this activity, the Defence Nuclear Safety Regulator (DNSR) invited the NII to collaborate on the production of a guidance document that provides clarity of regulatory expectations for the production of safety cases for computer-based safety systems. A key part of producing regulatory expectations was identifying the relevant extant standards and sector guidance that reflect good practice. The three principal sources of such good practice were: IAEA Safety Guide NS-G-1.1 (Software for Computer Based Systems Important to Safety in Nuclear Power Plants), the European Commission consensus document (Common Position of European Nuclear Regulators for the Licensing of Safety Critical Software for Nuclear Reactors), and IEC nuclear sector standards such as IEC60880. A common understanding has been achieved between the NII and DNSR, and regulatory guidance has been developed that will be used by both NII and DNSR in the assessment of computer-based safety systems and in the further development of more detailed joint technical assessment guidance for both regulatory organisations. (authors)

  8. Exploring the sociotechnical intersection of patient safety and electronic health record implementation

    PubMed Central

    Meeks, Derek W; Takian, Amirhossein; Sittig, Dean F; Singh, Hardeep; Barber, Nick

    2014-01-01

    Objective: The intersection of electronic health records (EHR) and patient safety is complex. We examined the applicability of two previously developed conceptual models to comprehensively understand the safety implications of EHR implementation in the English National Health Service (NHS). Methods: We conducted a secondary analysis of interview data from a 30-month longitudinal, prospective, case study-based evaluation of EHR implementation in 12 NHS hospitals. We used a framework analysis approach to apply conceptual models developed by Sittig and Singh to better understand EHR implementation and use: an eight-dimension sociotechnical model and a three-phase patient safety model (safe technology, safe use of technology, and use of technology to improve safety). Results: The intersection of patient safety and EHR implementation and use was characterized by risks involving technology (hardware and software, clinical content, and human–computer interfaces), the interaction of technology with non-technological factors, and improper or unsafe use of technology. Our data support that patient safety improvement activities, as well as patient safety hazards, change as an organization evolves from concerns about safe EHR functionality, to ensuring safe and appropriate EHR use, to using the EHR itself to provide ongoing surveillance and monitoring of patient safety. Discussion: We demonstrate the face validity of two models for understanding the sociotechnical aspects of safe EHR implementation and the complex interactions of technology within a healthcare system evolving from paper to integrated EHR. Conclusions: Using sociotechnical models, including those presented in this paper, may help stakeholders understand, synthesize, and anticipate risks at the intersection of patient safety and health information technology. PMID:24052536

  9. Exploring the sociotechnical intersection of patient safety and electronic health record implementation.

    PubMed

    Meeks, Derek W; Takian, Amirhossein; Sittig, Dean F; Singh, Hardeep; Barber, Nick

    2014-02-01

    The intersection of electronic health records (EHR) and patient safety is complex. We examined the applicability of two previously developed conceptual models to comprehensively understand the safety implications of EHR implementation in the English National Health Service (NHS). We conducted a secondary analysis of interview data from a 30-month longitudinal, prospective, case study-based evaluation of EHR implementation in 12 NHS hospitals. We used a framework analysis approach to apply conceptual models developed by Sittig and Singh to better understand EHR implementation and use: an eight-dimension sociotechnical model and a three-phase patient safety model (safe technology, safe use of technology, and use of technology to improve safety). The intersection of patient safety and EHR implementation and use was characterized by risks involving technology (hardware and software, clinical content, and human-computer interfaces), the interaction of technology with non-technological factors, and improper or unsafe use of technology. Our data support that patient safety improvement activities, as well as patient safety hazards, change as an organization evolves from concerns about safe EHR functionality, to ensuring safe and appropriate EHR use, to using the EHR itself to provide ongoing surveillance and monitoring of patient safety. We demonstrate the face validity of two models for understanding the sociotechnical aspects of safe EHR implementation and the complex interactions of technology within a healthcare system evolving from paper to integrated EHR. Using sociotechnical models, including those presented in this paper, may help stakeholders understand, synthesize, and anticipate risks at the intersection of patient safety and health information technology.

  10. EXPERIENCES FROM THE SOURCE-TERM ANALYSIS OF A LOW AND INTERMEDIATE LEVEL RADWASTE DISPOSAL FACILITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jin Beak; Park, Joo-Wan; Lee, Eun-Young

    2003-02-27

    Enhancement of the computer code SAGE for evaluation of the Korean concept for a LILW waste disposal facility is discussed. Several features of source term analysis are embedded into SAGE to analyze: (1) effects of the degradation mode of an engineered barrier, (2) effects of dispersion phenomena in the unsaturated zone, and (3) effects of a time-dependent sorption coefficient in the unsaturated zone. IAEA's Vault Safety Case (VSC) approach is used to demonstrate the ability of this assessment code. Results of MASCOT are used for comparison purposes. These enhancements of the safety assessment code, SAGE, can contribute to realistic evaluation of the Korean concept of the LILW disposal project in the near future.

  11. CFD - Mature Technology?

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2005-01-01

    Over the past 30 years, numerical methods and simulation tools for fluid dynamic problems have advanced as a new discipline, namely, computational fluid dynamics (CFD). Although a wide spectrum of flow regimes are encountered in many areas of science and engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to a large demand for predicting the aerodynamic performance characteristics of flight vehicles, such as commercial, military, and space vehicles. As flow analysis is required to be more accurate and computationally efficient for both commercial and mission-oriented applications (such as those encountered in meteorology, aerospace vehicle development, general fluid engineering and biofluid analysis) CFD tools for engineering become increasingly important for predicting safety, performance and cost. This paper presents the author's perspective on the maturity of CFD, especially from an aerospace engineering point of view.

  12. Computer codes for checking, plotting and processing of neutron cross-section covariance data and their application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sartori, E.; Roussin, R.W.

    This paper presents a brief review of computer codes concerned with checking, plotting, processing and using covariances of neutron cross-section data. It concentrates on those available from the computer code information centers of the United States and the OECD/Nuclear Energy Agency. Emphasis is also placed on codes using covariances for specific applications such as uncertainty analysis, data adjustment and data consistency analysis. Recent evaluations contain neutron cross section covariance information for all isotopes of major importance for technological applications of nuclear energy. It is therefore important that the available software tools needed for taking advantage of this information are widely known, as they permit the determination of better safety margins and allow the optimization of more economical designs of nuclear energy systems.

  13. Multimedia for occupational safety and health training: a pilot study examining a multimedia learning theory.

    PubMed

    Wallen, Erik S; Mulloy, Karen B

    2006-10-01

    Occupational diseases are a significant problem affecting public health. Safety training is an important method of preventing occupational illness. Training is increasingly being delivered by computer, although theories of learning from computer-based multimedia have been tested almost entirely on college students. This study was designed to determine whether these theories might also be applied to safety training applications for working adults. Participants viewed either computer-based multimedia respirator-use training with concurrent narration, the same training with narration prior to the animation, or unrelated safety training. Participants then took a five-item transfer test which measured their ability to use their knowledge in new and creative ways. Participants who viewed either of the computer-based multimedia trainings did significantly better than the control group on the transfer test. The results of this pilot study suggest that design guidelines developed for younger learners may be effective for training workers in occupational safety and health, although more investigation is needed.

  14. [Medication error management climate and perception for system use according to construction of medication error prevention system].

    PubMed

    Kim, Myoung Soo

    2012-08-01

    The purpose of this cross-sectional study was to examine the current status of IT-based medication error prevention system construction and the relationships among system construction, medication error management climate, and perception of system use. The participants were 124 patient safety chief managers working for 124 hospitals with over 300 beds in Korea. The characteristics of the participants, construction status and perception of systems (electric pharmacopoeia, electric drug dosage calculation system, computer-based patient safety reporting and bar-code system), and medication error management climate were measured in this study. The data were collected between June and August 2011. Descriptive statistics, partial Pearson correlation and MANCOVA were used for data analysis. Electric pharmacopoeia were constructed in 67.7% of participating hospitals, computer-based patient safety reporting systems were constructed in 50.8%, and electric drug dosage calculation systems were in use in 32.3%. Bar-code systems showed the lowest construction rate, at 16.1% of Korean hospitals. Higher rates of construction of IT-based medication error prevention systems resulted in greater safety, and a more positive error management climate prevailed. Supportive strategies for improving the perception of system use would encourage further system construction, and a positive error management climate would be more easily promoted.

  15. Integrated optimisation technique based on computer-aided capacity and safety evaluation for managing downstream lane-drop merging area of signalised junctions

    NASA Astrophysics Data System (ADS)

    Chen, CHAI; Yiik Diew, WONG

    2017-02-01

    This study provides an integrated strategy, encompassing microscopic simulation, safety assessment, and multi-attribute decision-making, to optimize traffic performance at downstream merging area of signalized intersections. A Fuzzy Cellular Automata (FCA) model is developed to replicate microscopic movement and merging behavior. Based on simulation experiment, the proposed FCA approach is able to provide capacity and safety evaluation of different traffic scenarios. The results are then evaluated through data envelopment analysis (DEA) and analytic hierarchy process (AHP). Optimized geometric layout and control strategies are then suggested for various traffic conditions. An optimal lane-drop distance that is dependent on traffic volume and speed limit can thus be established at the downstream merging area.

  16. Covariance Applications in Criticality Safety, Light Water Reactor Analysis, and Spent Fuel Characterization

    DOE PAGES

    Williams, M. L.; Wiarda, D.; Ilas, G.; ...

    2014-06-15

    Recently, we processed a new covariance data library based on ENDF/B-VII.1 for the SCALE nuclear analysis code system. The multigroup covariance data are discussed here, along with testing and application results for critical benchmark experiments. Moreover, the cross section covariance library, along with covariances for fission product yields and decay data, is used to compute uncertainties in the decay heat produced by a burned reactor fuel assembly.
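    The uncertainty computation mentioned in the abstract is conventionally a first-order "sandwich rule", var(R) = S^T C S, where S holds relative sensitivity coefficients of the response R to each nuclear-data group and C is the relative covariance matrix. A toy three-group sketch (all numbers assumed for illustration, not taken from the library):

```python
# First-order "sandwich rule" for propagating multigroup covariances to a
# response uncertainty: rel_var(R) = sum_ij S_i * C_ij * S_j.
# The 3-group sensitivities and covariances below are assumed values.
S = [0.2, 0.5, 0.3]                 # relative sensitivity coefficients
C = [[0.0004, 0.0001, 0.0000],      # relative covariance matrix
     [0.0001, 0.0009, 0.0002],
     [0.0000, 0.0002, 0.0016]]

def sandwich_variance(S, C):
    """Relative variance of the response: S^T C S."""
    n = len(S)
    return sum(S[i] * C[i][j] * S[j] for i in range(n) for j in range(n))

rel_var = sandwich_variance(S, C)
print(f"relative standard deviation of R: {rel_var ** 0.5:.4f}")
```

    The same sandwich form applies whether C covers cross sections, fission product yields, or decay data, which is why a consistent covariance library is the key input.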

  17. 77 FR 50722 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1208, ``Software Unit Testing for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1208 is proposed...

  18. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses American National Standards...

  19. Mathematics and statistics research progress report, period ending June 30, 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beauchamp, J. J.; Denson, M. V.; Heath, M. T.

    1983-08-01

    This report is the twenty-sixth in the series of progress reports of Mathematics and Statistics Research of the Computer Sciences organization, Union Carbide Corporation Nuclear Division. Part A records research progress in analysis of large data sets, applied analysis, biometrics research, computational statistics, materials science applications, numerical linear algebra, and risk analysis. Collaboration and consulting with others throughout the Oak Ridge Department of Energy complex are recorded in Part B. Included are sections on biological sciences, energy, engineering, environmental sciences, health and safety, and safeguards. Part C summarizes the various educational activities in which the staff was engaged. Part D lists the presentations of research results, and Part E records the staff's other professional activities during the report period.

  20. An updated numerical simulation of the ground-water flow system for the Castle Lake debris dam, Mount St. Helens, Washington, and implications for dam stability against heave

    USGS Publications Warehouse

    Roeloffs, Evelyn A.

    1994-01-01

A numerical simulation of the ground-water flow system in the Castle Lake debris dam, calibrated to data from the 1991 and 1992 water years, was used to estimate factors of safety against heave and internal erosion. The Castle Lake debris dam, 5 miles northwest of the summit of Mount St. Helens, impounds 19,000 acre-ft of water that could pose a flood hazard in the event of a lake breakout. A new topographic map of the Castle Lake area prior to the 1980 eruption of Mount St. Helens was prepared and used to calculate the thickness of the debris avalanche deposits that compose the dam. Water levels in 22 piezometers and discharges from seeps on the dam face, measured several times per year beginning in 1990, supplemented measurements in 11 piezometers and less frequent seep discharge measurements made since 1983. Observations in one group of piezometers reveal heads above the land surface and head gradients favoring upward flow that correspond to factors of safety only slightly greater than 2. The steady-state ground-water flow system in the debris dam was simulated using a three-dimensional finite-difference computer program. A uniform, isotropic model having the same shape as the dam and a hydraulic conductivity of 1.55 ft/day simulates the correct water level at half the observation points, but is in error by 10 ft or more at other points. Spatial variations of hydraulic conductivity were required to calibrate the model. The model analysis suggests that ground water flows in both directions between the debris dam and Castle Lake. Factors of safety against heave and internal erosion were calculated where the model simulated upward flow of ground water. A critical gradient analysis yields factors of safety as low as 2 near the piezometers where water level observations indicate low factors of safety. Low safety factors are also computed near Castle Creek, where slumping was caused by a storm in January 1990. If hydraulic property contrasts are present in areas of the debris dam unsampled by piezometers, then low safety factors may exist that are not evident in the numerical model analysis. Numerical model simulations showed that lowering Castle Lake by 40 feet increases many factors of safety by 0.1, but increases greater than 1 are limited to the area of 1990 slumping.
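The critical-gradient check described in this abstract can be sketched numerically: the factor of safety against heave is the critical hydraulic gradient divided by the upward exit gradient. The unit weights and exit gradient below are illustrative values, not figures from the report.

```python
def critical_gradient(gamma_sat, gamma_w=62.4):
    """Critical hydraulic gradient for heave (unit weights in lb/ft^3)."""
    return (gamma_sat - gamma_w) / gamma_w

def factor_of_safety_heave(exit_gradient, gamma_sat, gamma_w=62.4):
    """FS = critical gradient / upward exit gradient at the point of interest."""
    return critical_gradient(gamma_sat, gamma_w) / exit_gradient

# Illustrative values (not from the report): saturated unit weight of
# 125 lb/ft^3 and an upward exit gradient of 0.5.
fs = factor_of_safety_heave(0.5, 125.0)
print(round(fs, 2))  # -> 2.01, i.e. "only slightly greater than 2"
```

A factor of safety near 1 would indicate incipient heave; the report's values near 2 are low but not critical.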

  1. Markov Jump-Linear Performance Models for Recoverable Flight Control Computers

    NASA Technical Reports Server (NTRS)

    Zhang, Hong; Gray, W. Steven; Gonzalez, Oscar R.

    2004-01-01

Single event upsets in digital flight control hardware induced by atmospheric neutrons can reduce system performance and possibly introduce a safety hazard. One method currently under investigation to help mitigate the effects of these upsets is NASA Langley's Recoverable Computer System. In this paper, a Markov jump-linear model is developed for a recoverable flight control system, which will be validated using data from future experiments with simulated and real neutron environments. The method of tracking error analysis and the plan for the experiments are also described.
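A Markov jump-linear model of the kind described can be sketched in miniature: a Markov chain switches the closed-loop dynamics between a nominal mode and a recovery mode triggered by an upset. The matrices and transition probabilities below are hypothetical illustrations, not the paper's model.

```python
import random

# Hypothetical two-mode jump-linear system x[k+1] = A[mode] * x[k]:
# mode 0 = nominal control, mode 1 = recovery after an upset.
A = {0: [[0.9, 0.1], [0.0, 0.8]],    # nominal closed-loop dynamics
     1: [[1.0, 0.0], [0.0, 0.95]]}   # degraded dynamics during recovery
P = {0: {0: 0.999, 1: 0.001},        # Pr(next mode | current mode)
     1: {0: 0.9,   1: 0.1}}

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def simulate(x0, steps, seed=0):
    """Propagate the tracking-error state through random mode switches."""
    rng = random.Random(seed)
    x, mode = list(x0), 0
    for _ in range(steps):
        x = matvec(A[mode], x)
        mode = 1 if rng.random() >= P[mode][0] else 0
    return x

print(simulate([1.0, 1.0], 50))
```

Averaging such runs over many seeds estimates the mean tracking error under random upset arrivals, which is the kind of performance metric the jump-linear framework makes analytically tractable.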

  2. Probabilistic assessment of dynamic system performance. Part 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belhadj, Mohamed

    1993-01-01

Accurate prediction of dynamic system failure behavior can be important for the reliability and risk analyses of nuclear power plants, as well as for their backfitting to satisfy given constraints on overall system reliability, or optimization of system performance. Global analysis of dynamic systems through investigating the variations in the structure of the attractors of the system and the domains of attraction of these attractors as a function of the system parameters is also important for nuclear technology in order to understand the fault-tolerance as well as the safety margins of the system under consideration and to ensure safe operation of nuclear reactors. Such a global analysis would be particularly relevant to future reactors with inherent or passive safety features that are expected to rely on natural phenomena rather than active components to achieve and maintain safe shutdown. Conventionally, failure and global analysis of dynamic systems necessitate the utilization of different methodologies which have computational limitations on the system size that can be handled. Using a Chapman-Kolmogorov interpretation of system dynamics, a theoretical basis is developed that unifies these methodologies as special cases and which can be used for a comprehensive safety and reliability analysis of dynamic systems.

  3. A Comparison of Computer-based and Instructor-led Training for Long-term Care Staff.

    ERIC Educational Resources Information Center

    Harrington, Susan S.; Walker, Bonnie L.

    2002-01-01

    Fire safety training was provided to long-term care staff by computer (n=47) or a print-based, instructor-led program (n=47). Compared to 47 controls, both treatment groups significantly increased knowledge. The computer-trained staff were enthusiastic about the learning method and expressed greater interest in additional safety topics. (SK)

  4. RICIS Symposium 1992: Mission and Safety Critical Systems Research and Applications

    NASA Technical Reports Server (NTRS)

    1992-01-01

This conference deals with mission and safety critical systems: computer systems controlling processes whose failure to operate correctly could produce loss of life and/or property. Topics covered are: the work of standards groups, computer systems design and architecture, software reliability, process control systems, knowledge based expert systems, and computer and telecommunication protocols.

  5. Engineering Design Handbook: Analysis and Design of Automotive Brake Systems.

    DTIC Science & Technology

    1976-12-01

Highway Safety Research Institute, University of Michigan, September 15, 1972. J. E. Bernard, et al., "A Computer Based..." E. L. Cornwell, "Automatic Load-Sensitive Air..." ...systems involve the reduction in brake line pressure for a given pedal force, the pedal force/de...

  6. Architecture-led Requirements and Safety Analysis of an Aircraft Survivability Situational Awareness System

    DTIC Science & Technology

    2015-05-01

quality attributes. Prioritization of the utility tree leaves, driven by mission goals, helps the user ensure that critical requirements are well-specified...Methods: State of the Art and Future Directions", ACM Computing Surveys, 1996. Laitenberger, Oliver, "A Survey of Software Inspection Technologies," Handbook on Software Engineering and Knowledge Engineering, 2002.

  7. Peer Review of “LDT Weight Reduction Study with Crash Model, Feasibility and Detailed Cost Analyses – Chevrolet Silverado 1500 Pickup”

    EPA Science Inventory

    The contractor will conduct an independent peer review of FEV’s light-duty truck (LDT) mass safety study, “Light-Duty Vehicle Weight Reduction Study with Crash Model, Feasibility and Detailed Cost Analysis – Silverado 1500”, and its corresponding computer-aided engineering (CAE) ...

  8. Computational toxicity in 21st century safety sciences (China talk - Fuzhou China)

    EPA Science Inventory

    presentation at the Joint Meeting of Analytical Toxicology and Computational Toxicology Committee (Chinese Society of Toxicology) International Workshop on Advanced Chemical Safety Assessment Technologies on 11 May 2016, Fuzhou University, Fuzhou China

  9. Faster Aerodynamic Simulation With Cart3D

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A NASA-developed aerodynamic simulation tool is ensuring the safety of future space operations while providing designers and engineers with an automated, highly accurate computer simulation suite. Cart3D, co-winner of NASA's 2002 Software of the Year award, is the result of over 10 years of research and software development conducted by Michael Aftosmis and Dr. John Melton of Ames Research Center and Professor Marsha Berger of the Courant Institute at New York University. Cart3D offers a revolutionary approach to computational fluid dynamics (CFD), the computer simulation of how fluids and gases flow around an object of a particular design. By fusing technological advancements in diverse fields such as mineralogy, computer graphics, computational geometry, and fluid dynamics, the software provides a new industrial geometry processing and fluid analysis capability with unsurpassed automation and efficiency.

  10. Bibliography for computer security, integrity, and safety

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    A bibliography of computer security, integrity, and safety issues is given. The bibliography is divided into the following sections: recent national publications; books; journal, magazine articles, and miscellaneous reports; conferences, proceedings, and tutorials; and government documents and contractor reports.

  11. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 1, Part 2: Control modules S1--H1; Revision 5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  12. Automated Detection of Events of Scientific Interest

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

A report presents a slightly different perspective of the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the instant report, the events to be detected could also include natural phenomena that could be of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.

  13. Time series modeling in traffic safety research.

    PubMed

    Lavrenz, Steven M; Vlahogianni, Eleni I; Gkritza, Konstantina; Ke, Yue

    2018-08-01

    The use of statistical models for analyzing traffic safety (crash) data has been well-established. However, time series techniques have traditionally been underrepresented in the corresponding literature, due to challenges in data collection, along with a limited knowledge of proper methodology. In recent years, new types of high-resolution traffic safety data, especially in measuring driver behavior, have made time series modeling techniques an increasingly salient topic of study. Yet there remains a dearth of information to guide analysts in their use. This paper provides an overview of the state of the art in using time series models in traffic safety research, and discusses some of the fundamental techniques and considerations in classic time series modeling. It also presents ongoing and future opportunities for expanding the use of time series models, and explores newer modeling techniques, including computational intelligence models, which hold promise in effectively handling ever-larger data sets. The information contained herein is meant to guide safety researchers in understanding this broad area of transportation data analysis, and provide a framework for understanding safety trends that can influence policy-making. Copyright © 2017 Elsevier Ltd. All rights reserved.
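As a concrete instance of the classic techniques the paper surveys, a first-order autoregressive model, AR(1), can be fit to a safety time series by least squares on lagged values. The monthly "crash count" series below is synthetic, used only to show the mechanics.

```python
def fit_ar1(y):
    """Least-squares fit of y[t] = a + b * y[t-1] + e[t]."""
    x, z = y[:-1], y[1:]          # lagged predictors and responses
    n = len(x)
    mx, mz = sum(x) / n, sum(z) / n
    b = sum((xi - mx) * (zi - mz) for xi, zi in zip(x, z)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = mz - b * mx
    return a, b

# Synthetic monthly crash counts (illustrative data only):
series = [30, 32, 31, 35, 33, 36, 34, 38, 37, 40, 39, 41]
a, b = fit_ar1(series)
forecast = a + b * series[-1]     # one-step-ahead forecast
print(round(b, 2), round(forecast, 1))
```

Real analyses would check stationarity and residual autocorrelation before trusting such a fit; dedicated time series libraries automate those diagnostics.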

  14. New Challenges in Computational Thermal Hydraulics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yadigaroglu, George; Lakehal, Djamel

New needs and opportunities drive the development of novel computational methods for the design and safety analysis of light water reactors (LWRs). Some new methods are likely to be three dimensional. Coupling is expected between system codes, computational fluid dynamics (CFD) modules, and cascades of computations at scales ranging from the macro- or system scale to the micro- or turbulence scales, with the various levels continuously exchanging information back and forth. The ISP-42/PANDA and the international SETH project provide opportunities for testing applications of single-phase CFD methods to LWR safety problems. Although industrial single-phase CFD applications are commonplace, computational multifluid dynamics is still under development. However, first applications are appearing; the state of the art and its potential uses are discussed. The case study of condensation of steam/air mixtures injected from a downward-facing vent into a pool of water is a perfect illustration of a simulation cascade: At the top of the hierarchy of scales, system behavior can be modeled with a system code; at the central level, the volume-of-fluid method can be applied to predict large-scale bubbling behavior; at the bottom of the cascade, direct-contact condensation can be treated with direct numerical simulation, in which turbulent flow (in both the gas and the liquid), interfacial dynamics, and heat/mass transfer are directly simulated without resorting to models.

  15. Are Technology Interruptions Impacting Your Bottom Line? An Innovative Proposal for Change.

    PubMed

    Ledbetter, Tamera; Shultz, Sarah; Beckham, Roxanne

    2017-10-01

Nursing interruptions are a costly and dangerous variable in acute care hospitals. Malfunctioning technology equipment interrupts nursing care and prevents full utilization of computer safety systems to prevent patient care errors. This paper identifies an innovative approach to nursing interruptions related to computer and computer cart malfunctions. The impact on human resources was defined and outcome measures were proposed. A multifaceted proposal, based on a literature review, aimed at reducing nursing interruptions is presented. This proposal is expected to increase patient safety, as well as patient and nurse satisfaction. Setting: acute care hospitals utilizing electronic medical records and bar-coded medication administration technology. Participants: nurses, information technology staff, nursing informatics staff, and all leadership teams affected by technology problems and their proposed solutions. Literature from multiple fields was reviewed to evaluate research related to computer/computer cart failures and the approaches used to resolve these issues. Outcome measures addressed strategic goals related to patient safety and to nurse and patient satisfaction. Specific help desk metrics will demonstrate the effect of interventions. This paper addresses a gap in the literature and proposes practical and innovative solutions. A comprehensive computer and computer cart repair program is essential for patient safety, financial stewardship, and utilization of resources. © 2015 Wiley Periodicals, Inc.

  16. Improved Safety Margin Characterization of Risk from Loss of Offsite Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Paul

Original intent: The original intent of this task was "support of the Risk-Informed Safety Margin Characterization (RISMC) methodology in order to address … efficiency of computation so that more accurate and cost-effective techniques can be used to address safety margin characterizations" (S. M. Hess et al., "Risk-Informed Safety Margin Characterization," Procs. ICONE17, Brussels, July 2009, CD format). It was intended that "in Task 1 itself this improvement will be directed toward the very important issue of Loss of Offsite Power (LOOP) events," more specifically toward the challenge of efficient computation of the multidimensional nonrecovery integral that has been discussed by many previous contributors to the theory of nuclear safety. It was further envisioned that "three different computational approaches will be explored," corresponding to the three subtasks listed below; deliverables were tied to the individual subtasks.

  17. Influences of geological parameters to probabilistic assessment of slope stability of embankment

    NASA Astrophysics Data System (ADS)

    Nguyen, Qui T.; Le, Tuan D.; Konečný, Petr

    2018-04-01

    This article considers influences of geological parameters to slope stability of the embankment in probabilistic analysis using SLOPE/W computational system. Stability of a simple slope is evaluated with and without pore–water pressure on the basis of variation of soil properties. Normal distributions of unit weight, cohesion and internal friction angle are assumed. Monte Carlo simulation technique is employed to perform analysis of critical slip surface. Sensitivity analysis is performed to observe the variation of the geological parameters and their effects on safety factors of the slope stability.
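The Monte Carlo procedure this abstract describes can be illustrated with a simplified closed-form factor of safety in place of SLOPE/W's slip-surface search. The sketch below uses the classical infinite-slope formula with no pore pressure; all soil statistics and geometry are illustrative assumptions, not the paper's data.

```python
import math
import random

def fs_infinite_slope(c, gamma, phi_deg, beta_deg=30.0, z=5.0):
    """Factor of safety for an infinite slope, no pore pressure.
    c: cohesion (kPa), gamma: unit weight (kN/m^3), phi: friction angle,
    beta: slope angle, z: failure-plane depth (m)."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    driving = gamma * z * math.sin(beta) * math.cos(beta)        # shear stress
    resisting = c + gamma * z * math.cos(beta) ** 2 * math.tan(phi)
    return resisting / driving

def prob_failure(n=20000, seed=1):
    """Monte Carlo: sample normally distributed soil parameters and
    count realizations with FS < 1 (illustrative means/sds)."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        c = rng.gauss(10.0, 2.0)       # cohesion
        gamma = rng.gauss(18.0, 1.0)   # unit weight
        phi = rng.gauss(25.0, 3.0)     # internal friction angle
        if fs_infinite_slope(c, gamma, phi) < 1.0:
            fails += 1
    return fails / n

print(prob_failure())
```

Sensitivity analysis, as in the paper, would repeat this while perturbing one parameter's distribution at a time and observing the change in failure probability.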

  18. EMC analysis of MOS-1

    NASA Astrophysics Data System (ADS)

    Ishizawa, Y.; Abe, K.; Shirako, G.; Takai, T.; Kato, H.

The electromagnetic compatibility (EMC) control method, system EMC analysis method, and system test method which have been applied to test the components of the MOS-1 satellite are described. The merits and demerits of the problem solving, specification, and system approaches to EMC control are summarized, and the data requirements of the SEMCAP (specification and electromagnetic compatibility analysis program) computer program for verifying the EMI safety margin of the components are outlined. Examples of EMC design are mentioned, and the EMC design process and selection method for EMC critical points are shown along with sample EMC test results.

  19. General aviation crash safety program at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.

    1976-01-01

    The purpose of the crash safety program is to support development of the technology to define and demonstrate new structural concepts for improved crash safety and occupant survivability in general aviation aircraft. The program involves three basic areas of research: full-scale crash simulation testing, nonlinear structural analyses necessary to predict failure modes and collapse mechanisms of the vehicle, and evaluation of energy absorption concepts for specific component design. Both analytical and experimental methods are being used to develop expertise in these areas. Analyses include both simplified procedures for estimating energy absorption capabilities and more complex computer programs for analysis of general airframe response. Full-scale tests of typical structures as well as tests on structural components are being used to verify the analyses and to demonstrate improved design concepts.

  20. An analysis of computer-related patient safety incidents to inform the development of a classification.

    PubMed

    Magrabi, Farah; Ong, Mei-Sing; Runciman, William; Coiera, Enrico

    2010-01-01

To analyze patient safety incidents associated with computer use to develop the basis for a classification of problems reported by health professionals. Incidents submitted to a voluntary incident reporting database across one Australian state were retrieved and a subset (25%) was analyzed to identify 'natural categories' for classification. Two coders independently classified the remaining incidents into one or more categories. Free text descriptions were analyzed to identify contributing factors. Where available, medical specialty, time of day, and consequences were examined. Measures: descriptive statistics; inter-rater reliability. A search of 42,616 incidents from 2003 to 2005 yielded 123 computer-related incidents. After removing duplicate and unrelated incidents, 99 incidents describing 117 problems remained. A classification with 32 types of computer use problems was developed. Problems were grouped into information input (31%), transfer (20%), output (20%) and general technical (24%). Overall, 55% of problems were machine related and 45% were attributed to human-computer interaction. Delays in initiating and completing clinical tasks were a major consequence of machine related problems (70%) whereas rework was a major consequence of human-computer interaction problems (78%). While 38% (n=26) of the incidents were reported to have a noticeable consequence but no harm, 34% (n=23) had no noticeable consequence. Only 0.2% of all incidents reported were computer related. Further work is required to expand our classification using incident reports and other sources of information about healthcare IT problems. Evidence based user interface design must focus on the safe entry and retrieval of clinical information and support users in detecting and correcting errors and malfunctions.
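The two-coder design here implies an inter-rater reliability statistic such as Cohen's kappa, which corrects raw agreement for chance agreement. A minimal sketch follows; the category names echo the paper's groups, but the assignments are invented data.

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two raters assigning one category per item."""
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    ca, cb = Counter(codes_a), Counter(codes_b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)  # chance agreement
    return (observed - expected) / (1 - expected)

# Hypothetical category assignments for 10 incidents by two coders:
a = ["input", "input", "output", "transfer", "technical",
     "input", "output", "transfer", "technical", "input"]
b = ["input", "input", "output", "transfer", "technical",
     "output", "output", "transfer", "input", "input"]
print(round(cohens_kappa(a, b), 2))  # -> 0.72
```

Values above roughly 0.6 are conventionally read as substantial agreement, though the threshold depends on the application.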

  1. SCANS (Shipping Cask ANalysis System) a microcomputer-based analysis system for shipping cask design review: User's manual to Version 3a. Volume 1, Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mok, G.C.; Thomas, G.R.; Gerhard, M.A.

SCANS (Shipping Cask ANalysis System) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent fuel shipping casks. SCANS is an easy-to-use system that calculates the global response to impact loads, pressure loads and thermal conditions, providing reviewers with an independent check on analyses submitted by licensees. SCANS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests. Analysis options are based on regulatory cases described in the Code of Federal Regulations 10 CFR 71 and Regulatory Guides published by the US Nuclear Regulatory Commission in 1977 and 1978.

  2. AN OVERVIEW OF REDUCED ORDER MODELING TECHNIQUES FOR SAFETY APPLICATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, D.; Alfonsi, A.; Talbot, P.

    2016-10-01

The RISMC project is developing new advanced simulation-based tools to perform Computational Risk Analysis (CRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermal-hydraulic behavior of the reactors' primary and secondary systems, but also external event temporal evolution and component/system ageing. Thus, this is not only a multi-physics problem being addressed, but also a multi-scale problem (both spatial, µm-mm-m, and temporal, seconds-hours-years). As part of the RISMC CRA approach, a large amount of computationally-expensive simulation runs may be required. An important aspect is that even though computational power is growing, the overall computational cost of a RISMC analysis using brute-force methods may not be viable for certain cases. A solution that is being evaluated to assist the computational issue is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs; for this analysis improvement we used surrogate models instead of the actual simulation codes. This article focuses on the use of reduced order modeling techniques that can be applied to RISMC analyses in order to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (microseconds instead of hours/days).
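The surrogate-model idea can be shown in miniature: run an "expensive" simulation at a handful of sample points once, then answer later queries by cheap interpolation instead of re-running the code. The stand-in function and sampling below are illustrative, not RISMC code.

```python
import bisect
import math

def expensive_sim(x):
    """Stand-in for a costly physics run (illustrative function)."""
    return math.exp(-x) * math.sin(3 * x)

# One-time "training" runs on [0, 2]; the surrogate answers later
# queries by piecewise-linear interpolation between these samples.
xs = [i / 10 for i in range(0, 21)]
ys = [expensive_sim(x) for x in xs]

def surrogate(x):
    i = bisect.bisect_left(xs, x)
    if i == 0:
        return ys[0]
    if i >= len(xs):
        return ys[-1]
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Check accuracy at the midpoints between training samples:
err = max(abs(surrogate(x) - expensive_sim(x))
          for x in [0.05 + 0.1 * k for k in range(20)])
print(f"max abs error at midpoints: {err:.4f}")
```

Production surrogates (polynomial chaos, Gaussian processes, neural networks) follow the same train-once, query-cheaply pattern in many dimensions.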

  3. Safety Metrics for Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  4. Six sigma tools for a patient safety-oriented, quality-checklist driven radiation medicine department.

    PubMed

    Kapur, Ajay; Potters, Louis

    2012-01-01

    The purpose of this work was to develop and implement six sigma practices toward the enhancement of patient safety in an electronic, quality checklist-driven, multicenter, paperless radiation medicine department. A quality checklist process map (QPM), stratified into consultation through treatment-completion stages was incorporated into an oncology information systems platform. A cross-functional quality management team conducted quality-function-deployment and define-measure-analyze-improve-control (DMAIC) six sigma exercises with a focus on patient safety. QPM procedures were Pareto-sorted in order of decreasing patient safety risk with failure mode and effects analysis (FMEA). Quantitative metrics for a grouped set of highest risk procedures were established. These included procedural delays, associated standard deviations and six sigma Z scores. Baseline performance of the QPM was established over the previous year of usage. Data-driven analysis led to simplification, standardization, and refinement of the QPM with standard deviation, slip-day reduction, and Z-score enhancement goals. A no-fly policy (NFP) for patient safety was introduced at the improve-control DMAIC phase, with a process map interlock imposed on treatment initiation in the event of FMEA-identified high-risk tasks being delayed or not completed. The NFP was introduced in a pilot phase with specific stopping rules and the same metrics used for performance assessments. A custom root-cause analysis database was deployed to monitor patient safety events. Relative to the baseline period, average slip days and standard deviations for the risk-enhanced QPM procedures improved by over threefold factors in the NFP period. The Z scores improved by approximately 20%. A trend for proactive delays instead of reactive hard stops was observed with no adverse effects of the NFP. The number of computed potential no-fly delays per month dropped from 60 to 20 over a total of 520 cases. 
The fraction of computed potential no-fly cases that were delayed in NFP compliance rose from 28% to 45%. Proactive delays rose to 80% of all delayed cases. For potential no-fly cases, event reporting rose from 18% to 50%, while for actually delayed cases, event reporting rose from 65% to 100%. With complex technologies, resource-compromised staff, and pressures to hasten treatment initiation, the use of the six sigma driven process interlocks may mitigate potential patient safety risks as demonstrated in this study. Copyright © 2012 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
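The short-term Z score used as a metric in this study is the distance from the process mean to the upper specification limit, in standard deviations. A minimal sketch on invented slip-day samples (the USL of 3 days and both data sets are illustrative, not the study's data):

```python
from statistics import mean, stdev

def sigma_z(delays, usl):
    """Short-term six sigma Z score: how many standard deviations the
    upper specification limit (USL) sits above the mean delay."""
    return (usl - mean(delays)) / stdev(delays)

# Hypothetical slip-day samples for one checklist procedure, before
# and after process refinement:
before = [1.0, 2.5, 0.5, 3.0, 2.0, 1.5, 2.8, 0.8]
after = [0.5, 1.0, 0.4, 1.2, 0.8, 0.6, 1.1, 0.7]
print(round(sigma_z(before, 3.0), 2), round(sigma_z(after, 3.0), 2))
```

A larger Z means fewer expected specification violations, which is why both mean reduction and standard-deviation reduction were explicit goals of the QPM refinement.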

  5. A procedure for analysis of guyline tension.

    Treesearch

    Ward W. Carson; Jens E. Jorgensen; Stephen E. Reutebuch; William J. Bramwell

    1982-01-01

    Most cable logging operations use a spar held in place near the landing by a system of guylines and anchors. Safety and economic considerations require that overloads be avoided and that the spar remain stable. This paper presents a procedure and a computer program to estimate the guyline and anchor loads on a particular system configuration by a specific set of...

  6. The role of the research simulator in the systems development of rotorcraft

    NASA Technical Reports Server (NTRS)

    Statler, I. C.; Deel, A.

    1981-01-01

    The potential application of the research simulator to future rotorcraft systems design, development, product improvement evaluations, and safety analysis is examined. Current simulation capabilities for fixed-wing aircraft are reviewed and the requirements of a rotorcraft simulator are defined. The visual system components, vertical motion simulator, cab, and computation system for a research simulator under development are described.

  7. Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-09-20

These are slides from a seminar given to the University of Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety validation. It uses the sensitivity profile data for an application as computed by MCNP6 along with covariance files for the nuclear data to determine a baseline upper-subcritical-limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6, and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation – making full use of today's computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.

  8. General aviation air traffic pattern safety analysis

    NASA Technical Reports Server (NTRS)

    Parker, L. C.

    1973-01-01

    A concept is described for evaluating the general aviation mid-air collision hazard in uncontrolled terminal airspace. Three-dimensional traffic pattern measurements were conducted at uncontrolled and controlled airports. Computer programs for data reduction, storage retrieval and statistical analysis have been developed. Initial general aviation air traffic pattern characteristics are presented. These preliminary results indicate that patterns are highly divergent from the expected standard pattern, and that pattern procedures observed can affect the ability of pilots to see and avoid each other.

  9. Validation of a common data model for active safety surveillance research

    PubMed Central

    Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E

    2011-01-01

    Objective Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893

  10. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
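    The basic quantity such a program computes, probability of failure, can be illustrated with a plain Monte Carlo estimate for a simple limit state g = resistance - load, failing when g < 0. The normal distributions below are hypothetical and serve only to illustrate the idea; NESSUS itself offers far more efficient methods such as the advanced mean value and adaptive importance sampling:

```python
import math
import random

random.seed(42)

N = 200_000
failures = 0
for _ in range(N):
    resistance = random.gauss(250.0, 25.0)  # e.g., component strength, MPa
    load = random.gauss(150.0, 30.0)        # e.g., applied stress, MPa
    if resistance - load < 0.0:
        failures += 1
pf_mc = failures / N

# Closed-form check: g is normal with mean 100 and sd sqrt(25^2 + 30^2),
# so pf = Phi(-beta) with reliability index beta = mean(g) / sd(g).
beta = (250.0 - 150.0) / math.sqrt(25.0**2 + 30.0**2)
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2.0))
print(f"Monte Carlo pf = {pf_mc:.5f}, exact pf = {pf_exact:.5f}")
```

    For small failure probabilities, plain sampling like this becomes expensive, which is precisely why variance-reduction and mean-value methods matter in production tools.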

  11. Is Model-Based Development a Favorable Approach for Complex and Safety-Critical Computer Systems on Commercial Aircraft?

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.

  12. SCALE Code System 6.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.

  13. Structural analysis of a rehabilitative training system based on a ceiling rail for safety of hemiplegia patients.

    PubMed

    Kim, Kyong; Song, Won Kyung; Chong, Woo Suk; Yu, Chang Ho

    2018-04-17

    The body-weight support (BWS) function, which reduces the load stresses on a user, is an effective tool for gait and balance rehabilitation training for elderly people with weakened lower-extremity muscular strength, hemiplegic patients, and others. This study conducts a structural analysis to verify user safety as part of developing a rail-type gait and balance rehabilitation training system (RRTS). The RRTS comprises a rail, a trolley, and a brain-machine interface. The rail (platform) is connected to the ceiling structure, bearing the loads of the RRTS and of the user and allowing locomobility. The trolley consists of a smart drive unit (SDU) that assists the user with forward and backward mobility and a BWS unit that helps the user to control his/her body-weight load, depending on the severity of his/her hemiplegia. The brain-machine interface estimates and measures, in real time, the user's body-weight load and intended direction of movement. Considering the weight of the system and the user, the mechanical safety of the system frame under an applied 250-kg static load is verified through structural analysis using ABAQUS (6.14-3) software. The maximum stresses applied to the rail and trolley under the given 250-kg gravity load are 18.52 MPa and 48.44 MPa, respectively. The corresponding safety factors are computed to be 7.83 and 5.26, confirming the RRTS's mechanical safety. An RRTS with verified structural safety could be utilized for gait and balance rehabilitation and training for patients with hemiplegia.
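    A safety factor of this kind is simply the allowable material strength divided by the peak computed stress. A small sketch reproducing the reported factors, assuming allowable strengths of roughly 145 MPa (rail) and 255 MPa (trolley); these strength values are inferred for illustration, not stated in the abstract:

```python
def safety_factor(allowable_stress_mpa: float, max_stress_mpa: float) -> float:
    """Safety factor = allowable material stress / maximum computed stress."""
    return allowable_stress_mpa / max_stress_mpa

# Peak stresses from the structural analysis above; allowable strengths are
# assumed values chosen to be consistent with the reported factors.
rail_sf = safety_factor(145.0, 18.52)
trolley_sf = safety_factor(255.0, 48.44)
print(f"rail: {rail_sf:.2f}, trolley: {trolley_sf:.2f}")
```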

  14. Computers and Health--Individual and Institutional Protective Measures.

    ERIC Educational Resources Information Center

    Updegrove, Daniel A.; Updegrove, Kimberly H.

    1991-01-01

    Two issues related to computers and health are discussed: ergonomics/work habits and radiation hazards. Several approaches that colleges and universities might use to promote workplace safety are suggested, including education, training, and more informed purchasing. San Francisco's new worker safety ordinance is presented, and carpal tunnel…

  15. The role of the PIRT process in identifying code improvements and executing code development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, G.E.; Boyack, B.E.

    1997-07-01

    In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a "low" probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost-effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications.

  16. Using supercomputers for the time history analysis of old gravity dams

    NASA Astrophysics Data System (ADS)

    Rouve, G.; Peters, A.

    Some of the old masonry dams built in Germany at the beginning of this century are a matter of concern today. In the course of time, deterioration caused or amplified by aging has appeared and has raised questions about the safety of these old dams. The Finite Element Method, which has found widespread application over the past two decades, offers a suitable tool to re-evaluate the safety of these old gravity dams. The reliability of the results, however, strongly depends on knowledge of the material parameters. Using historical records and observations, a numerical back-analysis model has been developed to simulate the behaviour of these old masonry structures and to estimate their material properties by calibration. Only an implementation on a fourth-generation vector computer made the application of this large model practical.
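    The back-analysis idea, adjusting a material parameter until the model reproduces observed behavior, can be sketched with a one-parameter toy problem. The model here is a hypothetical axially loaded bar (d = FL/EA), not the dam model, and all values are illustrative:

```python
# Toy back-analysis: find the elastic modulus E for which a simple model
# reproduces an observed displacement. All values are illustrative only.

def displacement(E_pa: float, F_n: float = 1.0e6,
                 L_m: float = 10.0, A_m2: float = 0.5) -> float:
    """Axial bar displacement d = F * L / (E * A)."""
    return F_n * L_m / (E_pa * A_m2)

observed_d = 0.001  # 1 mm, hypothetical field measurement

# Bisection on E: displacement decreases monotonically as E grows.
lo, hi = 1.0e9, 1.0e12
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if displacement(mid) > observed_d:
        lo = mid  # model too soft -> stiffen
    else:
        hi = mid
E_est = 0.5 * (lo + hi)
print(f"calibrated E ≈ {E_est:.3e} Pa")
```

    A real calibration replaces the closed-form bar with a finite-element run per trial parameter set, which is what made vector hardware necessary for the dam study.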

  17. Improving food safety within the dairy chain: an application of conjoint analysis.

    PubMed

    Valeeva, N I; Meuwissen, M P M; Lansink, A G J M Oude; Huirne, R B M

    2005-04-01

    This study determined the relative importance of attributes of food safety improvement in the production chain of fluid pasteurized milk. The chain was divided into 4 blocks: "feed" (compound feed production and its transport), "farm" (dairy farm), "dairy processing" (transport and processing of raw milk, delivery of pasteurized milk), and "consumer" (retailer/catering establishment and pasteurized milk consumption). The concept of food safety improvement focused on 2 main groups of hazards: chemical (antibiotics and dioxin) and microbiological (Salmonella, Escherichia coli, Mycobacterium paratuberculosis, and Staphylococcus aureus). Adaptive conjoint analysis was used to investigate food safety experts' perceptions of the attributes' importance. Preference data from individual experts (n = 24) on 101 attributes along the chain were collected in a computer-interactive mode. Experts perceived the attributes from the "feed" and "farm" blocks as being more vital for controlling the chemical hazards; whereas the attributes from the "farm" and "dairy processing" were considered more vital for controlling the microbiological hazards. For the chemical hazards, "identification of treated cows" and "quality assurance system of compound feed manufacturers" were considered the most important attributes. For the microbiological hazards, these were "manure supply source" and "action in salmonellosis and M. paratuberculosis cases". The rather high importance of attributes relating to quality assurance and traceability systems of the chain participants indicates that participants look for food safety assurance from the preceding participants. This information has substantial decision-making implications for private businesses along the chain and for the government regarding the food safety improvement of fluid pasteurized milk.

  18. On-board computer progress in development of A 310 flight testing program

    NASA Technical Reports Server (NTRS)

    Reau, P.

    1981-01-01

    Onboard computer progress in the development of an Airbus A 310 flight testing program is described. Minicomputers were installed onboard three A 310 airplanes in 1979 in order to: (1) assure flight safety by exercising a limit check of a given set of parameters; (2) improve the efficiency of flight tests and allow cost reduction; and (3) perform test analysis on an external basis by utilizing onboard flight tapes. The following program considerations are discussed: (1) conclusions based on simulation of an onboard computer system; (2) brief descriptions of A 310 airborne computer equipment, specifically the onboard universal calculator (CUB) consisting of a ROLM 1666 system and a visualization system using an AFIGRAF CRT; (3) the ground system and flight information inputs; and (4) specifications and execution priorities for temporary and permanent programs.

  19. Safety of inhaled glycopyrronium in patients with COPD: a comprehensive analysis of clinical studies and post-marketing data.

    PubMed

    D'Urzo, Anthony D; Kerwin, Edward M; Chapman, Kenneth R; Decramer, Marc; DiGiovanni, Robert; D'Andrea, Peter; Hu, Huilin; Goyal, Pankaj; Altman, Pablo

    2015-01-01

    Chronic use of inhaled anticholinergics by patients with chronic obstructive pulmonary disease (COPD) has raised long-term safety concerns, particularly cardiovascular. Glycopyrronium is a once-daily anticholinergic with greater receptor selectivity than previously available agents. We assessed the safety of inhaled glycopyrronium using data pooled from two analysis sets, involving six clinical studies and over 4,000 patients with COPD who received one of the following treatments: glycopyrronium 50 μg, placebo (both delivered via the Breezhaler device), or tiotropium 18 μg (delivered via the HandiHaler device). Data were pooled from studies that varied in their duration and in the severity of COPD of the patients (ie, ≤12 weeks duration with patients having moderate or severe COPD; and >1 year duration with patients having severe and very severe COPD). Safety comparisons were made for glycopyrronium vs tiotropium or placebo. Poisson regression was used to assess the relative risk for either active drug or placebo (and between drugs where placebo was not available) for the incidence of safety events. During post-marketing surveillance (PMS), safety was assessed by obtaining reports from various sources, and disproportionality scores were computed using EMPIRICA. In particular, the cardiac safety of glycopyrronium during the post-marketing phase was evaluated. The overall incidence of adverse events and deaths was similar across groups, while the incidence of serious adverse events was numerically higher with placebo. Furthermore, glycopyrronium did not result in an increased risk of cerebro-cardiovascular events vs placebo. There were no new safety reports during the PMS phase that suggested an increased risk compared to results from the clinical studies. Moreover, the cardiac safety of glycopyrronium during the PMS phase was also consistent with the clinical data. The overall safety profile of glycopyrronium was similar to that of its comparators, indicating no increase in the overall risk for any of the investigated safety end points.
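    With a single treatment indicator, Poisson regression of event counts with a person-time offset reduces to an incidence rate ratio; a minimal sketch of that computation with a Wald confidence interval on the log scale (the counts below are hypothetical, not taken from the pooled analysis):

```python
import math

def incidence_rate_ratio(events_a, time_a, events_b, time_b):
    """Rate ratio of group A vs group B with a 95% Wald CI on the log scale."""
    irr = (events_a / time_a) / (events_b / time_b)
    se_log = math.sqrt(1.0 / events_a + 1.0 / events_b)
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, lo, hi

# Hypothetical: 30 events over 1,200 patient-years on drug,
# vs 25 events over 1,000 patient-years on placebo.
irr, lo, hi = incidence_rate_ratio(30, 1200.0, 25, 1000.0)
print(f"IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    A confidence interval that spans 1.0, as here, is consistent with no detectable difference in event rates between the groups.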

  20. 14 CFR 417.123 - Computing systems and software.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Computing systems and software. 417.123... systems and software. (a) A launch operator must document a system safety process that identifies the... systems and software. (b) A launch operator must identify all safety-critical functions associated with...

  1. 14 CFR 417.123 - Computing systems and software.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Computing systems and software. 417.123... systems and software. (a) A launch operator must document a system safety process that identifies the... systems and software. (b) A launch operator must identify all safety-critical functions associated with...

  2. 14 CFR 417.123 - Computing systems and software.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Computing systems and software. 417.123... systems and software. (a) A launch operator must document a system safety process that identifies the... systems and software. (b) A launch operator must identify all safety-critical functions associated with...

  3. 14 CFR 417.123 - Computing systems and software.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Computing systems and software. 417.123... systems and software. (a) A launch operator must document a system safety process that identifies the... systems and software. (b) A launch operator must identify all safety-critical functions associated with...

  4. 14 CFR 417.123 - Computing systems and software.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Computing systems and software. 417.123... systems and software. (a) A launch operator must document a system safety process that identifies the... systems and software. (b) A launch operator must identify all safety-critical functions associated with...

  5. 78 FR 47015 - Software Requirement Specifications for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Requirement Specifications for Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission... issuing a revised regulatory guide (RG), revision 1 of RG 1.172, ``Software Requirement Specifications for...

  6. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital... Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants... clarifications, the enhanced consensus practices for developing software life-cycle processes for digital...

  7. System cost/performance analysis (study 2.3). Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Kazangey, T.

    1973-01-01

    The relationships between performance, safety, cost, and schedule parameters were identified and quantified in support of an overall effort to generate program models and methodology that provide insight into a total space vehicle program. A specific space vehicle system, the attitude control system (ACS), was used, and a modeling methodology was selected that develops a consistent set of quantitative relationships among performance, safety, cost, and schedule, based on the characteristics of the components utilized in candidate mechanisms. These descriptive equations were developed for a three-axis, earth-pointing, mass expulsion ACS. A data base describing typical candidate ACS components was implemented, along with a computer program to perform sample calculations. This approach, implemented on a computer, is capable of determining the effect of a change in functional requirements to the ACS mechanization and the resulting cost and schedule. By a simple extension of this modeling methodology to the other systems in a space vehicle, a complete space vehicle model can be developed. Study results and recommendations are presented.

  8. Unsteady Probabilistic Analysis of a Gas Turbine System

    NASA Technical Reports Server (NTRS)

    Brown, Marilyn

    2003-01-01

    In this work, we have considered an annular cascade configuration subjected to unsteady inflow conditions. The unsteady response calculation has been implemented into the time marching CFD code, MSUTURBO. The computed steady state results for the pressure distribution demonstrated good agreement with experimental data. We have computed results for the amplitudes of the unsteady pressure over the blade surfaces. With the increase in gas turbine engine structural complexity and performance over the past 50 years, structural engineers have created an array of safety nets to ensure against component failures in turbine engines. In order to reduce what is now considered to be excessive conservatism and yet maintain the same adequate margins of safety, there is a pressing need to explore methods of incorporating probabilistic design procedures into engine development. Probabilistic methods combine and prioritize the statistical distributions of each design variable, generate an interactive distribution and offer the designer a quantified relationship between robustness, endurance and performance. The designer can therefore iterate between weight reduction, life increase, engine size reduction, speed increase etc.

  9. A stochastic multicriteria model for evidence-based decision making in drug benefit-risk analysis.

    PubMed

    Tervonen, Tommi; van Valkenhoef, Gert; Buskens, Erik; Hillege, Hans L; Postmus, Douwe

    2011-05-30

    Drug benefit-risk (BR) analysis is based on firm clinical evidence regarding various safety and efficacy outcomes. In this paper, we propose a new and more formal approach for constructing a supporting multi-criteria model that fully takes into account the evidence on efficacy and adverse drug reactions. Our approach is based on the stochastic multi-criteria acceptability analysis methodology, which allows us to compute the typical value judgments that support a decision, to quantify decision uncertainty, and to compute a comprehensive BR profile. We construct a multi-criteria model for the therapeutic group of second-generation antidepressants. We assess fluoxetine and venlafaxine together with placebo according to incidence of treatment response and three common adverse drug reactions by using data from a published study. Our model shows that there are clear trade-offs among the treatment alternatives. Copyright © 2011 John Wiley & Sons, Ltd.
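    The core of stochastic multicriteria acceptability analysis is to sample weight vectors (and, in the general method, criteria measurements as well) and record how often each alternative ranks first. A minimal sketch with fixed, hypothetical criteria scores standing in for treatment response and three adverse-drug-reaction criteria:

```python
import random

random.seed(7)

# Hypothetical per-criterion scores (higher is better) for three alternatives,
# already rescaled to [0, 1]. Not data from the cited study.
scores = {
    "placebo":     [0.20, 0.90, 0.95, 0.90],
    "fluoxetine":  [0.60, 0.70, 0.60, 0.75],
    "venlafaxine": [0.70, 0.55, 0.50, 0.60],
}

def random_weights(n):
    """Uniform sample from the n-dimensional weight simplex."""
    cuts = sorted(random.random() for _ in range(n - 1))
    bounds = [0.0] + cuts + [1.0]
    return [bounds[i + 1] - bounds[i] for i in range(n)]

wins = {name: 0 for name in scores}
N = 20_000
for _ in range(N):
    w = random_weights(4)
    best = max(scores, key=lambda a: sum(wi * si for wi, si in zip(w, scores[a])))
    wins[best] += 1

# Rank-1 acceptability index: the share of sampled weightings under which
# each alternative comes out on top.
acceptability = {name: wins[name] / N for name in scores}
print(acceptability)
```

    Alternatives with a near-zero rank-1 acceptability are hard to justify under any reasonable value judgment, which is how the method quantifies decision uncertainty.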

  10. The change of radial power factor distribution due to RCCA insertion at the first cycle core of AP1000

    NASA Astrophysics Data System (ADS)

    Susilo, J.; Suparlina, L.; Deswandri; Sunaryo, G. R.

    2018-02-01

    The use of computer programs for PWR-type core neutronic design parameter analysis has been carried out in several previous studies, which included validating the codes against measured data and benchmark calculations. In this study, the AP1000 first-cycle core radial power peaking factors were validated and analyzed using the CITATION module of the SRAC2006 computer code. The code had also been validated, with good results, against the criticality values of the VERA benchmark core. The AP1000 core power distribution was calculated in two-dimensional X-Y geometry through quarter-core modeling. The purpose of this research is to determine the accuracy of the SRAC2006 code and the safety performance of the AP1000 core during its first operating cycle. The core calculations were carried out for several conditions: without Rod Cluster Control Assembly (RCCA) insertion, with a single RCCA inserted (AO, M1, M2, MA, MB, MC, MD), and with multiple RCCAs inserted (MA + MB, MA + MB + MC, MA + MB + MC + MD, and MA + MB + MC + MD + M1). The maximum fuel rod power factor in a fuel assembly was assumed to be approximately 1.406. The analysis showed that the 2-dimensional CITATION module of SRAC2006 is accurate for AP1000 power distribution calculations without RCCA and with MA + MB RCCA insertion. The power peaking factors for the first operating cycle of the AP1000 core without RCCA, as well as with single and multiple RCCA insertion, remain below the safety limit (less than about 1.798). In terms of the thermal power generated by the fuel assemblies, the AP1000 core in its first operating cycle can therefore be considered safe.
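    The peaking-factor check itself is a simple ratio: the hottest rod's power divided by the core-average rod power, compared against the design limit. A sketch with hypothetical rod powers chosen so the hottest rod reproduces a factor of about 1.406 as quoted above:

```python
# Hypothetical fuel rod powers (kW); not data from the cited calculation.
rod_powers = [88.8, 60.0, 55.0, 62.13, 58.0, 55.0]

avg_power = sum(rod_powers) / len(rod_powers)
peaking_factor = max(rod_powers) / avg_power

SAFETY_LIMIT = 1.798  # limit quoted in the abstract
print(f"peaking factor {peaking_factor:.3f}, limit {SAFETY_LIMIT}")
assert peaking_factor < SAFETY_LIMIT  # hottest rod within the design limit
```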

  11. Comparative Evaluation of a Four-Implant-Supported Polyetherketoneketone Framework Prosthesis: A Three-Dimensional Finite Element Analysis Based on Cone Beam Computed Tomography and Computer-Aided Design.

    PubMed

    Lee, Ki-Sun; Shin, Sang-Wan; Lee, Sang-Pyo; Kim, Jong-Eun; Kim, Jee-Hwan; Lee, Jeong-Yol

    The purpose of this pilot study was to evaluate and compare polyetherketoneketone (PEKK) with different framework materials for implant-supported prostheses by means of a three-dimensional finite element analysis (3D-FEA) based on cone beam computed tomography (CBCT) and computer-aided design (CAD) data. A geometric model that consisted of four maxillary implants supporting a prosthesis framework was constructed from CBCT and CAD data of a treated patient. Three different materials (zirconia, titanium, and PEKK) were selected, and their material properties were simulated using FEA software in the generated geometric model. In the PEKK framework (ie, low elastic modulus) group, the stress transferred to the implant and simulated adjacent tissue was reduced when compressive stress was dominant, but increased when tensile stress was dominant. This study suggests that the shock-absorbing effects of a resilient implant-supported framework are limited in some areas and that rigid framework material shows a favorable stress distribution and safety of overall components of the prosthesis.

  12. People’s Republic of China Scientific Abstracts, Number 195.

    DTIC Science & Technology

    1978-08-01

    [Fragment of a report documentation page.] Descriptors: China; agricultural science and technology; bio-medical sciences; chemistry; cybernetics, computers, and automation technology; earth sciences; engineering and equipment. Abstract fragment: "... safety in astronautics, especially in reentry; 3. Producible with current technology; 4. Provides an increased range of astronautical acti..."

  13. Image Analysis Based on Soft Computing and Applied on Space Shuttle During the Liftoff Process

    NASA Technical Reports Server (NTRS)

    Dominquez, Jesus A.; Klinko, Steve J.

    2007-01-01

    Imaging techniques based on Soft Computing (SC) and developed at Kennedy Space Center (KSC) have been implemented in a variety of prototype applications related to the safe operation of the Space Shuttle during the liftoff process. These SC-based prototype applications include detection and tracking of moving Foreign Object Debris (FOD) during the Space Shuttle liftoff, visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad, and visual detection of distant birds approaching the Space Shuttle launch pad. This SC-based image analysis capability developed at KSC was also used to analyze images acquired during the accident of the Space Shuttle Columbia and estimate the trajectory and velocity of the foam that caused the accident.

  14. Relative effectiveness of worker safety and health training methods.

    PubMed

    Burke, Michael J; Sarpy, Sue Ann; Smith-Crowe, Kristin; Chan-Serafin, Suzanne; Salvador, Rommel O; Islam, Gazi

    2006-02-01

    We sought to determine the relative effectiveness of different methods of worker safety and health training aimed at improving safety knowledge and performance and reducing negative outcomes (accidents, illnesses, and injuries). Ninety-five quasi-experimental studies (n=20991) were included in the analysis. Three types of intervention methods were distinguished on the basis of learners' participation in the training process: least engaging (lecture, pamphlets, videos), moderately engaging (programmed instruction, feedback interventions), and most engaging (training in behavioral modeling, hands-on training). As training methods became more engaging (i.e., requiring trainees' active participation), workers demonstrated greater knowledge acquisition, and reductions were seen in accidents, illnesses, and injuries. All methods of training produced meaningful behavioral performance improvements. Training involving behavioral modeling, a substantial amount of practice, and dialogue is generally more effective than other methods of safety and health training. The present findings challenge the current emphasis on more passive computer-based and distance training methods within the public health workforce.

  15. Relative Effectiveness of Worker Safety and Health Training Methods

    PubMed Central

    Burke, Michael J.; Sarpy, Sue Ann; Smith-Crowe, Kristin; Chan-Serafin, Suzanne; Salvador, Rommel O.; Islam, Gazi

    2006-01-01

    Objectives. We sought to determine the relative effectiveness of different methods of worker safety and health training aimed at improving safety knowledge and performance and reducing negative outcomes (accidents, illnesses, and injuries). Methods. Ninety-five quasi-experimental studies (n=20991) were included in the analysis. Three types of intervention methods were distinguished on the basis of learners’ participation in the training process: least engaging (lecture, pamphlets, videos), moderately engaging (programmed instruction, feedback interventions), and most engaging (training in behavioral modeling, hands-on training). Results. As training methods became more engaging (i.e., requiring trainees’ active participation), workers demonstrated greater knowledge acquisition, and reductions were seen in accidents, illnesses, and injuries. All methods of training produced meaningful behavioral performance improvements. Conclusions. Training involving behavioral modeling, a substantial amount of practice, and dialogue is generally more effective than other methods of safety and health training. The present findings challenge the current emphasis on more passive computer-based and distance training methods within the public health workforce. PMID:16380566

  16. Database for Safety-Oriented Tracking of Chemicals

    NASA Technical Reports Server (NTRS)

    Stump, Jacob; Carr, Sandra; Plumlee, Debrah; Slater, Andy; Samson, Thomas M.; Holowaty, Toby L.; Skeete, Darren; Haenz, Mary Alice; Hershman, Scot; Raviprakash, Pushpa

    2010-01-01

    SafetyChem is a computer program that maintains a relational database for tracking chemicals and associated hazards at Johnson Space Center (JSC) by use of a Web-based graphical user interface. The SafetyChem database is accessible to authorized users via a JSC intranet. All new chemicals pass through a safety office, where information on hazards, required personal protective equipment (PPE), fire-protection warnings, and target organ effects (TOEs) is extracted from material safety data sheets (MSDSs) and recorded in the database. The database facilitates real-time management of inventory with attention to such issues as stability, shelf life, reduction of waste through transfer of unused chemicals to laboratories that need them, quantification of chemical wastes, and identification of chemicals for which disposal is required. Upon searching the database for a chemical, the user receives information on physical properties of the chemical, hazard warnings, required PPE, a link to the MSDS, and references to the applicable International Standards Organization (ISO) 9000 standard work instructions and the applicable job hazard analysis. Also, to reduce the labor hours needed to comply with reporting requirements of the Occupational Safety and Health Administration, the data can be directly exported into the JSC hazardous-materials database.
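As a rough illustration of the kind of relational schema such a tracking database implies (table and column names below are hypothetical, not SafetyChem's actual design):

```python
import sqlite3

# Hypothetical, simplified two-table schema inspired by the description
# above: chemicals plus their associated hazards and required PPE.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE chemical (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    msds_url TEXT,
    shelf_life_days INTEGER
);
CREATE TABLE hazard (
    chemical_id INTEGER REFERENCES chemical(id),
    warning TEXT,
    required_ppe TEXT,
    target_organ TEXT
);
""")
con.execute("INSERT INTO chemical VALUES (1, 'Acetone', 'http://example/msds/1', 365)")
con.execute("INSERT INTO hazard VALUES (1, 'Flammable', 'Goggles, gloves', 'CNS')")

# A lookup joins the chemical with its hazard and PPE information.
row = con.execute("""
    SELECT c.name, h.warning, h.required_ppe
    FROM chemical c JOIN hazard h ON h.chemical_id = c.id
    WHERE c.name = 'Acetone'
""").fetchone()
print(row)  # -> ('Acetone', 'Flammable', 'Goggles, gloves')
```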

  17. An analysis of computer-related patient safety incidents to inform the development of a classification

    PubMed Central

    Ong, Mei-Sing; Runciman, William; Coiera, Enrico

    2010-01-01

    Objective To analyze patient safety incidents associated with computer use to develop the basis for a classification of problems reported by health professionals. Design Incidents submitted to a voluntary incident reporting database across one Australian state were retrieved and a subset (25%) was analyzed to identify ‘natural categories’ for classification. Two coders independently classified the remaining incidents into one or more categories. Free text descriptions were analyzed to identify contributing factors. Where available medical specialty, time of day and consequences were examined. Measurements Descriptive statistics; inter-rater reliability. Results A search of 42 616 incidents from 2003 to 2005 yielded 123 computer related incidents. After removing duplicate and unrelated incidents, 99 incidents describing 117 problems remained. A classification with 32 types of computer use problems was developed. Problems were grouped into information input (31%), transfer (20%), output (20%) and general technical (24%). Overall, 55% of problems were machine related and 45% were attributed to human–computer interaction. Delays in initiating and completing clinical tasks were a major consequence of machine related problems (70%) whereas rework was a major consequence of human–computer interaction problems (78%). While 38% (n=26) of the incidents were reported to have a noticeable consequence but no harm, 34% (n=23) had no noticeable consequence. Conclusion Only 0.2% of all incidents reported were computer related. Further work is required to expand our classification using incident reports and other sources of information about healthcare IT problems. Evidence based user interface design must focus on the safe entry and retrieval of clinical information and support users in detecting and correcting errors and malfunctions. PMID:20962128
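The inter-rater reliability listed under Measurements is commonly quantified with Cohen's kappa for two independent coders; a minimal sketch, with made-up category assignments, follows:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' category assignments."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    # Chance agreement from each rater's marginal category frequencies.
    expected = sum(c1[k] * c2[k] for k in c1) / n**2
    return (observed - expected) / (1 - expected)

# Two coders assigning six incidents to the four problem groups.
coder_a = ["input", "input", "output", "transfer", "technical", "input"]
coder_b = ["input", "output", "output", "transfer", "technical", "input"]
print(round(cohens_kappa(coder_a, coder_b), 3))  # -> 0.769
```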

  18. 2007 international meeting on Reduced Enrichment for Research and Test Reactors (RERTR). Abstracts and available papers presented at the meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2008-07-15

    The Meeting papers discuss research and test reactor fuel performance, manufacturing and testing. Some of the main topics are: conversion from HEU to LEU in different reactors and corresponding problems and activities; flux performance and core lifetime analysis with HEU and LEU fuels; physics and safety characteristics; measurement of gamma field parameters in cores with LEU fuel; nondestructive analysis of RERTR fuel; thermal-hydraulic analysis; fuel interactions; transient analyses and thermal hydraulics for HEU and LEU cores; microstructure of research reactor fuels; post-irradiation analysis and performance; and computer codes and other related problems.

  19. Thermal and fluid-dynamics behavior of circulating systems in the case of pressure relief

    NASA Astrophysics Data System (ADS)

    Moeller, L.

    Safety aspects of large-scale installations operating under high pressure must be considered from the design stage onward, taking into account all conceivable disturbances. In any analysis of such disturbances, studies of pressure relief processes occupy a central position. For such studies it is convenient to combine experiments on small-scale models of the actual installation with suitable computer programs. The experiments can be carried out at lower pressures and temperatures if the actual fluid is replaced by another medium, such as a refrigerant; this approach has been used in the present investigation. The experimental data obtained are employed to verify the results of the computational model 'Frelap-UK', which was developed expressly for the analysis of system behavior during pressure relief. The computed fluid-dynamics characteristics are found to agree with the experimental results.

  20. Ontology-supported research on vaccine efficacy, safety and integrative biological networks.

    PubMed

    He, Yongqun

    2014-07-01

    While vaccine efficacy and safety research has dramatically progressed with the methods of in silico prediction and data mining, many challenges still exist. A formal ontology is a human- and computer-interpretable set of terms and relations that represent entities in a specific domain and how these terms relate to each other. Several community-based ontologies (including Vaccine Ontology, Ontology of Adverse Events and Ontology of Vaccine Adverse Events) have been developed to support vaccine and adverse event representation, classification, data integration, literature mining of host-vaccine interaction networks, and analysis of vaccine adverse events. The author further proposes minimal vaccine information standards and their ontology representations, ontology-based linked open vaccine data and meta-analysis, an integrative One Network ('OneNet') Theory of Life, and ontology-based approaches to study and apply the OneNet theory. In the Big Data era, these proposed strategies provide a novel framework for advanced data integration and analysis of fundamental biological networks including vaccine immune mechanisms.

  1. Ontology-supported Research on Vaccine Efficacy, Safety, and Integrative Biological Networks

    PubMed Central

    He, Yongqun

    2016-01-01

    Summary While vaccine efficacy and safety research has dramatically progressed with the methods of in silico prediction and data mining, many challenges still exist. A formal ontology is a human- and computer-interpretable set of terms and relations that represent entities in a specific domain and how these terms relate to each other. Several community-based ontologies (including the Vaccine Ontology, Ontology of Adverse Events, and Ontology of Vaccine Adverse Events) have been developed to support vaccine and adverse event representation, classification, data integration, literature mining of host-vaccine interaction networks, and analysis of vaccine adverse events. The author further proposes minimal vaccine information standards and their ontology representations, ontology-based linked open vaccine data and meta-analysis, an integrative One Network (“OneNet”) Theory of Life, and ontology-based approaches to study and apply the OneNet theory. In the Big Data era, these proposed strategies provide a novel framework for advanced data integration and analysis of fundamental biological networks including vaccine immune mechanisms. PMID:24909153

  2. Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Kula, K.R.; Paik, I.K.; Chung, D.Y.

    1996-12-31

    Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.

  3. Interface design of VSOP'94 computer code for safety analysis

    NASA Astrophysics Data System (ADS)

    Natsir, Khairina; Yazid, Putranto Ilham; Andiwijayakusuma, D.; Wahanani, Nursinta Adi

    2014-09-01

    Today, most software applications, including those in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system that simulates the life history of a nuclear reactor and is devoted to education and research. One advantage of the VSOP program is its ability to calculate neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integrals, estimates of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and simulation of reactor safety. However, the existing VSOP is a conventional program, developed in Fortran 65, and has several usability problems: it runs only on DEC Alpha mainframe platforms, provides text-based output, and is difficult to use, especially in data preparation and interpretation of results. We developed GUI-VSOP, an interface program that facilitates data preparation, runs the VSOP code, and reads the results in a more user-friendly way, and that is usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing and postprocessing. The GUI-based preprocessing interface aims to make data preparation convenient. The processing interface provides convenience in configuring input files and libraries and in compiling the VSOP code. The postprocessing interface is designed to visualize the VSOP output in table and graphic form. GUI-VSOP is expected to simplify and speed up the process and the analysis of safety aspects.
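The wrap-a-legacy-batch-code pattern that a GUI front end like GUI-VSOP implements (prepare an input deck, invoke the solver, capture its text output) can be sketched generically; the invocation below uses a stand-in command, not VSOP's real interface:

```python
import os
import subprocess
import sys
import tempfile

def run_solver(executable, input_file, workdir):
    """Run a legacy batch code on a prepared input deck and capture its
    text output. The command-line convention here is illustrative only.
    """
    result = subprocess.run(
        executable + [input_file],
        cwd=workdir, capture_output=True, text=True, check=True,
    )
    return result.stdout

# Demo with a stand-in "solver": a Python one-liner that echoes the deck.
with tempfile.TemporaryDirectory() as d:
    deck = os.path.join(d, "reactor.inp")
    with open(deck, "w") as f:
        f.write("CORE 1\n")
    stand_in = [sys.executable, "-c",
                "import sys; print(open(sys.argv[1]).read().strip())"]
    out = run_solver(stand_in, deck, d)
print(out.strip())  # -> CORE 1
```

A postprocessing layer would then parse `out` into tables or plots instead of printing it.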

  4. Exploiting heterogeneous publicly available data sources for drug safety surveillance: computational framework and case studies.

    PubMed

    Koutkias, Vassilis G; Lillo-Le Louët, Agnès; Jaulent, Marie-Christine

    2017-02-01

    Driven by the need of pharmacovigilance centres and companies to routinely collect and review all available data about adverse drug reactions (ADRs) and adverse events of interest, we introduce and validate a computational framework exploiting dominant as well as emerging publicly available data sources for drug safety surveillance. Our approach relies on appropriate query formulation for data acquisition and subsequent filtering, transformation and joint visualization of the obtained data. We acquired data from the FDA Adverse Event Reporting System (FAERS), PubMed and Twitter. In order to assess the validity and the robustness of the approach, we elaborated on two important case studies, namely, clozapine-induced cardiomyopathy/myocarditis versus haloperidol-induced cardiomyopathy/myocarditis, and apixaban-induced cerebral hemorrhage. The analysis of the obtained data provided interesting insights (identification of potential patient and health-care professional experiences regarding ADRs in Twitter, information/arguments against an ADR existence across all sources), while illustrating the benefits (complementing data from multiple sources to strengthen/confirm evidence) and the underlying challenges (selecting search terms, data presentation) of exploiting heterogeneous information sources, thereby advocating the need for the proposed framework. This work contributes in establishing a continuous learning system for drug safety surveillance by exploiting heterogeneous publicly available data sources via appropriate support tools.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gharibyan, N.

    In order to fully characterize the NIF neutron spectrum, SAND-II-SNL software was requested and received from the Radiation Safety Information Computational Center. The software is designed to determine the neutron energy spectrum through analysis of experimental activation data. However, given that the source code was developed on a Sparcstation 10, it is not compatible with current versions of FORTRAN. Accounts have been established through Lawrence Livermore National Laboratory's High Performance Computing in order to access different compilers for FORTRAN (e.g., pgf77, pgf90). Additionally, several of the subroutines included in the SAND-II-SNL package have required debugging efforts to allow for proper compiling of the code.

  6. Advanced Computational Methods for Thermal Radiative Heat Transfer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.

    2016-10-01

    Participating media radiation (PMR) calculations in weapon safety analyses for abnormal thermal environments are too costly to perform routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.
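The abstract does not detail the ROM technique used; one standard approach, proper orthogonal decomposition (POD) via an SVD of snapshot data, can be sketched as follows. This is a generic illustration of reduced order modeling, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Snapshot matrix: each column is a full-order field (e.g. a temperature
# profile) from one training run; here, random combinations of 2 modes.
x = np.linspace(0, 1, 200)
snapshots = np.column_stack([
    a * np.sin(np.pi * x) + b * np.cos(np.pi * x)
    for a, b in rng.uniform(-1, 1, size=(30, 2))
])

# POD: a truncated SVD of the snapshots gives a low-dimensional basis.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :2]                       # keep the 2 dominant modes

# Project a new full-order state and reconstruct it from 2 coefficients.
new_state = 0.3 * np.sin(np.pi * x) - 0.7 * np.cos(np.pi * x)
coeffs = basis.T @ new_state           # reduced representation (size 2)
reconstruction = basis @ coeffs
err = np.linalg.norm(new_state - reconstruction) / np.linalg.norm(new_state)
print(err < 1e-10)                     # -> True: state lies in the basis
```

The cost saving comes from evolving only the small coefficient vector instead of the full-order field.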

  7. Enhancing point of care vigilance using computers.

    PubMed

    St Jacques, Paul; Rothman, Brian

    2011-09-01

    Information technology has the potential to provide a tremendous step forward in perioperative patient safety. Through automated delivery of information through fixed and portable computer resources, clinicians may achieve improved situational awareness of the overall operation of the operating room suite and the state of individual patients in various stages of surgical care. Coupling the raw, but integrated, information with decision support and alerting algorithms enables clinicians to achieve high reliability in documentation compliance and response to care protocols. Future studies and outcomes analysis are needed to quantify the degree of benefit of these new components of perioperative information systems. Copyright © 2011 Elsevier Inc. All rights reserved.

  8. Development and analysis of the Software Implemented Fault-Tolerance (SIFT) computer

    NASA Technical Reports Server (NTRS)

    Goldberg, J.; Kautz, W. H.; Melliar-Smith, P. M.; Green, M. W.; Levitt, K. N.; Schwartz, R. L.; Weinstock, C. B.

    1984-01-01

    SIFT (Software Implemented Fault Tolerance) is an experimental, fault-tolerant computer system designed to meet the extreme reliability requirements for safety-critical functions in advanced aircraft. Errors are masked by performing a majority voting operation over the results of identical computations, and faulty processors are removed from service by reassigning computations to the nonfaulty processors. This scheme has been implemented in a special architecture using a set of standard Bendix BDX930 processors, augmented by a special asynchronous-broadcast communication interface that provides direct, processor-to-processor communication among all processors. Fault isolation is accomplished in hardware; all other fault-tolerance functions, together with scheduling and synchronization, are implemented exclusively by executive system software. The system reliability is predicted by a Markov model. Mathematical consistency of the system software with respect to the reliability model has been partially verified, using recently developed tools for machine-aided proof of program correctness.
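SIFT-style error masking by majority voting over replicated results can be sketched in a few lines; this is a simplified illustration of the voting idea, not the actual executive software:

```python
from collections import Counter

def majority_vote(results):
    """Mask errors by voting over replicated computation results.

    Returns the majority value; with 3 replicas, any single faulty
    processor is outvoted. Raises if no strict majority exists.
    """
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority -- too many faulty replicas")
    return value

healthy = majority_vote([42, 42, 42])   # all processors agree
masked = majority_vote([42, 7, 42])     # one faulty processor outvoted
print(healthy, masked)  # -> 42 42
```

In SIFT the minority (faulty) processor would additionally be flagged and its work reassigned to the remaining nonfaulty processors.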

  9. Proceedings of the 1992 topical meeting on advances in reactor physics. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-04-01

    This document, Volume 2, presents proceedings of the 1992 Topical Meeting on Advances in Reactor Physics on March 8--11, 1992 at Charleston, SC. Session topics were as follows: Transport Theory; Fast Reactors; Plant Analyzers; Integral Experiments/Measurements & Analysis; Core Computational Systems; Reactor Physics; Monte Carlo; Safety Aspects of Heavy Water Reactors; and Space-Time Core Kinetics. The individual reports have been cataloged separately. (FI)

  10. A trend analysis of ergonomic research themes in Taiwan.

    PubMed

    Lin, Chih-Long

    2015-01-01

    This paper examines the development of ergonomics in Taiwan by analysing 1404 scientific articles published by 113 permanent members of the Ergonomics Society of Taiwan (EST). Each article was classified by key words and abstract content. Each article was also coded by period of publication (1971-1992, first period; 1993-1997, second; 1998-2002, third; 2003-2007, fourth; 2008-2012, fifth) and against 13 topic categories. The results show that the rate of publication has increased by approximately 100 articles every five years since 1993. The most popular topic was ergonomics assessment and analysis techniques in the first period, force exertion-related research in the second period, product design and evaluation in the third period, occupational safety and health in the fourth period, and the human-computer interface in the fifth period. Each of these is highly relevant to current contemporary issues around the world. Finally, potential areas for future ergonomics research in Taiwan are discussed. This study investigates the trends in academic papers published by members of the EST. Over time, topics have shifted from ergonomics evaluation methods to occupational safety and health, and human-computer interaction. The findings should be considered as important references for planning the future of ergonomics in Taiwan.

  11. Using tablet technology in operational radiation safety applications.

    PubMed

    Phillips, Andrew; Linsley, Mark; Houser, Mike

    2013-11-01

    Tablet computers have become a mainstream product in today's personal, educational, and business worlds. These tablets offer computing power, storage, and a wide range of available products to meet nearly every user need. To take advantage of this new computing technology, a system was developed for the Apple iPad (Apple Inc. 1 Infinite Loop Cupertino, CA 95014) to perform health and safety inspections in the field using editable PDFs and saving them to a database while keeping the process easy and paperless.

  12. MELCOR Analysis of OSU Multi-Application Small Light Water Reactor (MASLWR) Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Dhongik S.; Jo, HangJin; Fu, Wen

    A multi-application small light water reactor (MASLWR) conceptual design was developed by Oregon State University (OSU) with emphasis on passive safety systems. The passive containment safety system employs condensation and natural circulation to achieve the necessary heat removal from the containment in case of postulated accidents. Containment condensation experiments at the MASLWR test facility at OSU are modeled and analyzed with MELCOR, a system-level reactor accident analysis computer code. The analysis assesses its ability to predict condensation heat transfer in the presence of noncondensable gas for accidents where high-energy steam is released into the containment. This work demonstrates MELCOR's ability to predict the pressure-temperature response of the scaled containment. Our analysis indicates that the heat removal rates are underestimated in the experiment due to the limited locations of the thermocouples and applies corrections to these measurements by conducting integral energy analyses along with CFD simulation for confirmation. Furthermore, the corrected heat removal rate measurements and the MELCOR predictions of the heat removal rate from the containment show good agreement with the experimental data.

  13. MELCOR Analysis of OSU Multi-Application Small Light Water Reactor (MASLWR) Experiment

    DOE PAGES

    Yoon, Dhongik S.; Jo, HangJin; Fu, Wen; ...

    2017-05-23

    A multi-application small light water reactor (MASLWR) conceptual design was developed by Oregon State University (OSU) with emphasis on passive safety systems. The passive containment safety system employs condensation and natural circulation to achieve the necessary heat removal from the containment in case of postulated accidents. Containment condensation experiments at the MASLWR test facility at OSU are modeled and analyzed with MELCOR, a system-level reactor accident analysis computer code. The analysis assesses its ability to predict condensation heat transfer in the presence of noncondensable gas for accidents where high-energy steam is released into the containment. This work demonstrates MELCOR's ability to predict the pressure-temperature response of the scaled containment. Our analysis indicates that the heat removal rates are underestimated in the experiment due to the limited locations of the thermocouples and applies corrections to these measurements by conducting integral energy analyses along with CFD simulation for confirmation. Furthermore, the corrected heat removal rate measurements and the MELCOR predictions of the heat removal rate from the containment show good agreement with the experimental data.
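The integral energy analysis used to correct the heat removal measurements can be illustrated with a toy lumped energy balance, estimating the heat removal rate from a gas temperature history as Q = -m·cv·dT/dt. All numbers below are synthetic placeholders, not MASLWR test data:

```python
import numpy as np

# Assumed lumped properties of the containment gas (illustrative only).
m_gas = 2.0          # kg of steam/gas mixture
cv = 1500.0          # J/(kg K), effective specific heat

# Synthetic cooldown: temperature decaying toward 400 K.
t = np.linspace(0.0, 100.0, 101)            # s
T = 400.0 + 50.0 * np.exp(-t / 40.0)        # K

# Energy balance: heat removed by condensation on the containment wall.
dTdt = np.gradient(T, t)
Q = -m_gas * cv * dTdt                      # W
print(Q[0] > Q[-1])  # -> True: removal rate falls as the gas cools
```

A real integral analysis would also track the steam mass balance and wall heat conduction rather than a single lumped gas node.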

  14. 14 CFR 1274.936 - Breach of safety or security.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .... (a) Security is the condition of safeguarding against espionage, sabotage, crime (including computer... 14 Aeronautics and Space 5 2013-01-01 2013-01-01 false Breach of safety or security. 1274.936... security. Breach of Safety or Security July 2002 Safety is the freedom from those conditions that can cause...

  15. 14 CFR 1274.936 - Breach of safety or security.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... (a) Security is the condition of safeguarding against espionage, sabotage, crime (including computer... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Breach of safety or security. 1274.936... security. Breach of Safety or Security July 2002 Safety is the freedom from those conditions that can cause...

  16. 14 CFR 1274.936 - Breach of safety or security.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... (a) Security is the condition of safeguarding against espionage, sabotage, crime (including computer... 14 Aeronautics and Space 5 2011-01-01 2010-01-01 true Breach of safety or security. 1274.936... security. Breach of Safety or Security July 2002 Safety is the freedom from those conditions that can cause...

  17. A Software Safety Risk Taxonomy for Use in Retrospective Safety Cases

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.

    2007-01-01

    Safety standards contain technical and process-oriented safety requirements. The best time to include these requirements is early in the development lifecycle of the system. When software safety requirements are levied on a legacy system after the fact, a retrospective safety case will need to be constructed for the software in the system. This can be a difficult task because there may be few to no artifacts available to show compliance with the software safety requirements. The risks associated with not meeting safety requirements in a legacy safety-critical computer system must be addressed to give confidence for reuse. This paper introduces a proposal for a software safety risk taxonomy for legacy safety-critical computer systems, produced by specializing the Software Engineering Institute's 'Software Development Risk Taxonomy' with safety elements and attributes.

  18. Demonstration of emulator-based Bayesian calibration of safety analysis codes: Theory and formulation

    DOE PAGES

    Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert

    2015-05-28

    System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here, with Markov Chain Monte Carlo (MCMC) sampling, feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This "function factorization" Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP-based process.
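The emulator-based calibration loop (train a cheap surrogate of the expensive code, then run MCMC against the surrogate instead of the code) can be sketched end to end. This is a toy standard-GP illustration of the general idea, not the FFGP method; the "code", kernel, and priors are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for an expensive safety code with one uncertain parameter.
def code(theta):
    return np.sin(theta) + 0.5 * theta

# Train a simple GP emulator on a handful of code runs.
X = np.linspace(0.0, 3.0, 8)          # training designs
y = code(X)

def kernel(a, b, ell=1.0, sig=1.0):
    return sig**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

K = kernel(X, X) + 1e-8 * np.eye(len(X))
alpha = np.linalg.solve(K, y)

def emulate(theta):
    """GP posterior mean at theta -- the fast surrogate for code()."""
    return kernel(np.atleast_1d(theta), X) @ alpha

# Calibration target: a noisy observation of the true-parameter output.
theta_true = 1.2
y_obs = code(theta_true) + 0.01
sigma_obs = 0.05

def log_post(theta):
    if not 0.0 <= theta <= 3.0:       # uniform prior on [0, 3]
        return -np.inf
    r = y_obs - emulate(theta)[0]     # cheap emulator in the likelihood
    return -0.5 * (r / sigma_obs)**2

# Metropolis MCMC over theta, feasible because each step is cheap.
samples, theta = [], 1.5
lp = log_post(theta)
for _ in range(4000):
    prop = theta + 0.2 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

est = np.mean(samples[1000:])         # posterior mean, near theta_true
print(abs(est - theta_true) < 0.15)   # -> True
```

The FFGP model described in the record replaces this plain GP with a factorized one to handle higher-dimensional outputs; the surrounding MCMC loop is conceptually the same.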

  19. Mathematics and Statistics Research Department progress report, period ending June 30, 1982

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denson, M.V.; Funderlic, R.E.; Gosslee, D.G.

    1982-08-01

    This report is the twenty-fifth in the series of progress reports of the Mathematics and Statistics Research Department of the Computer Sciences Division, Union Carbide Corporation Nuclear Division (UCC-ND). Part A records research progress in analysis of large data sets, biometrics research, computational statistics, materials science applications, moving boundary problems, numerical linear algebra, and risk analysis. Collaboration and consulting with others throughout the UCC-ND complex are recorded in Part B. Included are sections on biology, chemistry, energy, engineering, environmental sciences, health and safety, materials science, safeguards, surveys, and the waste storage program. Part C summarizes the various educational activities in which the staff was engaged. Part D lists the presentations of research results, and Part E records the staff's other professional activities during the report period.

  20. Randomized controlled trial to determine the effectiveness of an interactive multimedia food safety education program for clients of the special supplemental nutrition program for women, infants, and children.

    PubMed

    Trepka, Mary Jo; Newman, Frederick L; Davila, Evelyn P; Matthew, Karen J; Dixon, Zisca; Huffman, Fatma G

    2008-06-01

    Pregnant women and the very young are among those most susceptible to foodborne infections and at high risk of a severe outcome from foodborne infections. To determine if interactive multimedia is a more effective method than pamphlets for delivering food safety education to Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) clients, a randomized controlled trial of WIC clients was conducted. Self-reported food safety practices were compared between pre- and postintervention questionnaires completed ≥2 months after the intervention. Pregnant WIC clients or female caregivers (usually mothers) of WIC clients who were 18 years of age or older and able to speak and read English were recruited from an inner-city WIC clinic. Participants were randomized to receive food safety pamphlets or complete an interactive multimedia food safety education program on a computer kiosk. The outcome was change from pre- to postintervention food safety scores. A mean food safety score was determined for each participant for the pre- and postintervention questionnaires. The scores were used in a two-group repeated measures analysis of variance. Of the 394 participants, 255 (64.7%) completed the postintervention questionnaire. Satisfaction with the program was high, especially among those with no education beyond high school. When considering a repeated measures analysis of variance model with the two fixed between-subject effects of group and age, a larger improvement in score in the interactive multimedia group than in the pamphlet group (P=0.005) was found, but the size of the group effect was small (partial η²=0.033). Women aged 35 years or older in the interactive multimedia group had the largest increase in score. The interactive multimedia was well-accepted and resulted in improved self-reported food safety practices, suggesting that interactive multimedia is an effective option for food safety education in WIC clinics.
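The reported effect size, partial eta squared, can be computed directly from an F statistic and its degrees of freedom. The F value below is illustrative, not taken from the study:

```python
# Partial eta squared from an F statistic and its degrees of freedom:
# eta_p^2 = F * df_effect / (F * df_effect + df_error).
def partial_eta_squared(F, df_effect, df_error):
    return (F * df_effect) / (F * df_effect + df_error)

# Hypothetical example: F(1, 253) = 8.6 corresponds to
print(round(partial_eta_squared(8.6, 1, 253), 3))  # -> 0.033
```

Values near 0.03 are conventionally read as a small effect, consistent with the study's interpretation.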

  1. Safety culture in a pharmacy setting using a pharmacy survey on patient safety culture: a cross-sectional study in China.

    PubMed

    Jia, P L; Zhang, L H; Zhang, M M; Zhang, L L; Zhang, C; Qin, S F; Li, X L; Liu, K X

    2014-06-30

    To explore the attitudes and perceptions of patient safety culture among pharmacy workers in China using a Pharmacy Survey on Patient Safety Culture (PSOPSC), and to assess the psychometric properties of the translated Chinese-language version of the PSOPSC. Cross-sectional study. Data were obtained from 20 hospital pharmacies in the southwest part of China. We performed χ² tests to explore differences in perceptions of patient safety culture among pharmacy staff across hospital levels, qualification levels and countries. We also computed descriptive statistics, internal consistency coefficients and intersubscale correlations, and then conducted an exploratory factor analysis. A test-retest was performed to assess reproducibility of the items. A total of 630 questionnaires were distributed, of which 527 valid responses were returned (response rate 84%). The positive response rate for each item ranged from 37% to 90%. The positive response rate on three dimensions ('Teamwork', 'Staff Training and Skills' and 'Staffing, Work Pressure and Pace') was higher than that of Agency for Healthcare Research and Quality (AHRQ) data (p<0.05). There was a statistical difference in the perception of patient safety culture at different hospital and qualification levels. The internal consistency of the total survey was satisfactory (Cronbach's α=0.89). The results demonstrated that, among the pharmacy staff surveyed in China, there was a positive attitude towards patient safety culture in their organisations. Identifying perspectives on patient safety culture from pharmacists at different hospital and qualification levels is important, since this can help support decisions about action to improve safety culture in pharmacy settings. The Chinese translation of the PSOPSC questionnaire (V.2012) applied in our study is acceptable.
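
    The internal-consistency measure reported above (Cronbach's α = 0.89) can be sketched as follows; the response matrix here is illustrative, not the survey's data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Illustrative 5-point Likert responses (4 respondents, 3 items).
demo = [[4, 5, 4], [3, 3, 3], [5, 5, 4], [2, 3, 2]]
print(round(cronbach_alpha(demo), 2))  # → 0.96
```

    Values above roughly 0.7-0.8 are usually read as acceptable internal consistency, which is the sense in which the survey's α = 0.89 is described as satisfactory.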

  2. Intelligent Hardware-Enabled Sensor and Software Safety and Health Management for Autonomous UAS

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Schumann, Johann; Ippolito, Corey

    2015-01-01

    Unmanned Aerial Systems (UAS) can only be deployed if they can effectively complete their mission and respond to failures and uncertain environmental conditions while maintaining safety with respect to other aircraft as well as humans and property on the ground. We propose to design a real-time, onboard system health management (SHM) capability to continuously monitor essential system components such as sensors, software, and hardware systems for detection and diagnosis of failures and violations of safety or performance rules during the flight of a UAS. Our approach to SHM is three-pronged, providing: (1) real-time monitoring of sensor and software signals; (2) signal analysis, preprocessing, and advanced on-the-fly temporal and Bayesian probabilistic fault diagnosis; (3) an unobtrusive, lightweight, read-only, low-power hardware realization using Field Programmable Gate Arrays (FPGAs) in order to avoid overburdening limited computing resources or costly re-certification of flight software due to instrumentation. No currently available SHM capabilities (or combinations of currently existing SHM capabilities) come anywhere close to satisfying these three criteria, yet NASA will require such intelligent, hardware-enabled sensor and software safety and health management for introducing autonomous UAS into the National Airspace System (NAS). We propose a novel approach of creating modular building blocks for combining responsive runtime monitoring of temporal logic system safety requirements with model-based diagnosis and Bayesian network-based probabilistic analysis. Our proposed research program includes both developing this novel approach and demonstrating its capabilities using the NASA Swift UAS as a demonstration platform.
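
    The responsive runtime monitoring of temporal safety rules described above can be illustrated with a minimal software monitor for a bounded-response property ("every trigger must be answered within N steps"). This is a hedged sketch of the general idea only, not the paper's FPGA realization or its actual safety requirements:

```python
class ResponseMonitor:
    """Online monitor for a bounded-response safety rule (hypothetical):
    after a trigger event, a response must be observed within `bound` steps.
    Processes one observation per time step, as a runtime monitor would."""

    def __init__(self, bound):
        self.bound = bound
        self.pending = []   # time stamps of triggers still awaiting a response
        self.t = 0
        self.violated = False

    def step(self, trigger, response):
        """Feed one time step; returns True while the rule still holds."""
        if response:
            # A response discharges all pending triggers (response at the
            # deadline still counts, since it is checked first).
            self.pending.clear()
        if self.pending and self.t - self.pending[0] >= self.bound:
            self.violated = True  # oldest trigger exceeded its deadline
        if trigger:
            self.pending.append(self.t)
        self.t += 1
        return not self.violated

# A trigger at t=0 answered at t=2 satisfies a bound of 3 steps.
m = ResponseMonitor(bound=3)
print([m.step(True, False), m.step(False, False), m.step(False, True)])
```

    Real systems like the one described would monitor many such temporal-logic properties in parallel, in hardware, over sensor and software signals.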

  3. The Impact of Visibility on Teamwork, Collaborative Communication, and Security in Emergency Departments: An Exploratory Study.

    PubMed

    Gharaveis, Arsalan; Hamilton, D Kirk; Pati, Debajyoti; Shepley, Mardelle

    2017-01-01

    The aim of this study was to examine the influence of visibility on teamwork, collaborative communication, and security issues in emergency departments (EDs). This research explored whether high visibility in EDs can improve teamwork and collaborative communication while reducing security issues. Visibility has been regarded as a critical design consideration and can be directly and considerably affected by an ED's physical design. Teamwork is one of the major operational outcomes related to visibility and involves nurses, support staff, and physicians. Collaborative communication in an ED is another important factor in the process of care delivery and affects efficiency and safety. Furthermore, security is a behavioral factor in ED design that encompasses staff safety, patient safety, and the safety of visitors and family members. This qualitative study investigated the impact of visibility on teamwork, collaborative communication, and security issues in the ED. One-on-one interviews and on-site observation sessions were conducted in a community hospital. Data analysis combined computer-assisted plan analysis, observation and interview content analysis, and theme analysis. The findings of this exploratory study provide a framework identifying visibility as an influential factor in ED design. High levels of visibility improve the productivity and efficiency of teamwork and communication and increase the chance of reducing security issues. The findings of this study also contribute to the general body of knowledge about the effect of physical design on teamwork, collaborative communication, and security.

  4. Computer Starters!

    ERIC Educational Resources Information Center

    Instructor, 1983

    1983-01-01

    Instructor's Computer-Using Teachers Board members give practical tips on how to get a classroom ready for a new computer, introduce students to the machine, and help them learn about programming and computer literacy. Safety, scheduling, and supervision requirements are noted. (PP)

  5. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included human reliability analysis (HRA), but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with (1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and (2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  6. Achieving benefit for patients in primary care informatics: the report of an international consensus workshop at Medinfo 2007.

    PubMed

    de Lusignan, Simon; Teasdale, Sheila

    2007-01-01

    Landmark reports suggest that sharing health data between clinical computer systems should improve patient safety and the quality of care. Enhancing the use of informatics in primary care is usually a key part of these strategies. The aim was to synthesise the learning from the international use of informatics in primary care. The workshop was attended by 21 delegates drawn from all continents. There were presentations from the USA, the UK and the Netherlands, and informal updates from Australia, Argentina, and Sweden and the Nordic countries. These presentations were discussed in a workshop setting to identify common issues. Key principles were synthesised through a post-workshop analysis and then sorted into themes. Themes emerged about the deployment of informatics that can be applied at the health service, practice and individual clinical consultation levels: (1) at the health service or provider level, success appeared proportional to the extent of collaboration between a broad range of stakeholders and the identification of leaders; (2) within the practice, much is currently being achieved with legacy computer systems and apparently outdated coding systems, including prescribing safety alerts, clinical audit and the promotion of computer data recording and quality; (3) in the consultation, the computer is a 'big player' and may make traditional models of the consultation redundant. We should make more efforts to share learning; develop clear, internationally acceptable definitions; highlight gaps between pockets of excellence and real-world practice; and, most importantly, suggest how they might be bridged. Knowledge synthesis from different health systems may provide a greater understanding of how the third actor (the computer) is best used in primary care.

  7. Engineering and Computing Portal to Solve Environmental Problems

    NASA Astrophysics Data System (ADS)

    Gudov, A. M.; Zavozkin, S. Y.; Sotnikov, I. Y.

    2018-01-01

    This paper describes the architecture and services of the Engineering and Computing Portal, a complex solution that provides access to high-performance computing resources and makes it possible to carry out computational experiments, teach parallel programming technologies, and solve computing tasks, including technogenic safety problems.

  8. Principles for the wise use of computers by children.

    PubMed

    Straker, L; Pollock, C; Maslen, B

    2009-11-01

    Computer use by children at home and school is now common in many countries. Child computer exposure varies with the type of computer technology available and the child's age, gender and social group. This paper reviews the current exposure data and the evidence for positive and negative effects of computer use by children. Potential positive effects of computer use by children include enhanced cognitive development and school achievement, reduced barriers to social interaction, enhanced fine motor skills and visual processing and effective rehabilitation. Potential negative effects include threats to child safety, inappropriate content, exposure to violence, bullying, Internet 'addiction', displacement of moderate/vigorous physical activity, exposure to junk food advertising, sleep displacement, vision problems and musculoskeletal problems. The case for child specific evidence-based guidelines for wise use of computers is presented based on children using computers differently to adults, being physically, cognitively and socially different to adults, being in a state of change and development and the potential to impact on later adult risk. Progress towards child-specific guidelines is reported. Finally, a set of guideline principles is presented as the basis for more detailed guidelines on the physical, cognitive and social impact of computer use by children. The principles cover computer literacy, technology safety, child safety and privacy and appropriate social, cognitive and physical development. The majority of children in affluent communities now have substantial exposure to computers. This is likely to have significant effects on child physical, cognitive and social development. Ergonomics can provide and promote guidelines for wise use of computers by children and by doing so promote the positive effects and reduce the negative effects of computer-child, and subsequent computer-adult, interaction.

  9. SPLASH program for three dimensional fluid dynamics with free surface boundaries

    NASA Astrophysics Data System (ADS)

    Yamaguchi, A.

    1996-05-01

    This paper describes a three-dimensional computer program, SPLASH, that solves the Navier-Stokes equations based on the Arbitrary Lagrangian Eulerian (ALE) finite element method. SPLASH has been developed for application to fluid dynamics problems involving moving boundaries in a liquid-metal-cooled Fast Breeder Reactor (FBR). To apply the SPLASH code to free surface behavior analysis, a capillary model using a cubic spline function has been developed. Several sample problems, e.g., free surface oscillation, vortex shedding development, and capillary tube phenomena, are solved to verify the computer program. In these analyses, the numerical results are in good agreement with theoretical values or experimental observations. The SPLASH code has also been applied to an analysis of a free surface sloshing experiment coupled with forced circulation flow in a rectangular tank, a simplified version of the flow field in an FBR reactor vessel. The computational simulation predicts the general behavior of the internal fluid flow and of the free surface well. The analytical capability of the SPLASH code has been verified in this study, and its application to more practical problems such as FBR design and safety analysis is under way.

  10. ASME V&V challenge problem: Surrogate-based V&V

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beghini, Lauren L.; Hough, Patricia D.

    2015-12-18

    The process of verification and validation can be resource intensive. From the computational model perspective, the resource demand typically arises from long simulation run times on multiple cores coupled with the need to characterize and propagate uncertainties. In addition, predictive computations performed for safety and reliability analyses have similar resource requirements. For this reason, there is a tradeoff between the time required to complete the requisite studies and the fidelity or accuracy of the results that can be obtained. At a high level, our approach is cast within a validation hierarchy that provides a framework in which we perform sensitivity analysis, model calibration, model validation, and prediction. The evidence gathered as part of these activities is mapped into the Predictive Capability Maturity Model to assess credibility of the model used for the reliability predictions. With regard to specific technical aspects of our analysis, we employ surrogate-based methods, primarily based on polynomial chaos expansions and Gaussian processes, for model calibration, sensitivity analysis, and uncertainty quantification in order to reduce the number of simulations that must be done. The goal is to tip the tradeoff balance to improving accuracy without increasing the computational demands.

  11. Decision support methods for the detection of adverse events in post-marketing data.

    PubMed

    Hauben, M; Bate, A

    2009-04-01

    Spontaneous reporting is a crucial component of post-marketing drug safety surveillance despite its significant limitations. The size and complexity of some spontaneous reporting system databases represent a challenge for drug safety professionals who traditionally have relied heavily on the scientific and clinical acumen of the prepared mind. Computer algorithms that calculate statistical measures of reporting frequency for huge numbers of drug-event combinations are increasingly used to support pharmacovigilance analysts screening large spontaneous reporting system databases. After an overview of pharmacovigilance and spontaneous reporting systems, we discuss the theory and application of contemporary computer algorithms in regular use, those under development, and the practical considerations involved in the implementation of computer algorithms within a comprehensive and holistic drug safety signal detection program.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.; Gross, R.; Goble, W

    The safety integrity level (SIL) of equipment used in safety instrumented functions is determined by the average probability of failure on demand (PFDavg) computed at the time of periodic inspection and maintenance, i.e., the time of proof testing. The computation of PFDavg is generally based solely on predictions or estimates of the assumed constant failure rate of the equipment. However, PFDavg is also affected by maintenance actions (or lack thereof) taken by the end user. This paper shows how maintenance actions can affect the PFDavg of spring-operated pressure relief valves (SOPRV) and how these maintenance actions may be accounted for in the computation of the PFDavg metric. The method provides a means for quantifying the effects of changes in maintenance practices and shows how these changes impact plant safety.
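
    For a single device, PFDavg is commonly approximated as λ_DU·T/2 (dangerous undetected failure rate times half the proof test interval), with a residual term when proof testing is imperfect. A simplified sketch of this standard 1oo1 approximation follows; it is not the paper's model of maintenance effects:

```python
def pfd_avg(lambda_du, proof_interval_h, coverage=1.0, lifetime_h=None):
    """Simplified average probability of failure on demand for a single
    (1oo1) device. `lambda_du` is the dangerous undetected failure rate
    per hour; `coverage` is the fraction of such failures a proof test
    can reveal. Imperfect testing leaves a residual term accrued over
    the full device lifetime."""
    pfd = coverage * lambda_du * proof_interval_h / 2
    if coverage < 1.0:
        if lifetime_h is None:
            raise ValueError("lifetime_h required when coverage < 1")
        pfd += (1 - coverage) * lambda_du * lifetime_h / 2
    return pfd

# Illustrative numbers: lambda_DU = 1e-6/h, annual proof test (8760 h).
print(pfd_avg(1e-6, 8760))  # → 0.00438, within the SIL 2 band (1e-3 to 1e-2)
```

    The residual term shows why maintenance quality matters: lowering proof test coverage (or skipping tests) raises PFDavg even when the assumed failure rate is unchanged, which is the qualitative point the paper quantifies for SOPRVs.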

  13. A job safety program for construction workers designed to reduce the potential for occupational injury using tool box training sessions and computer-assisted biofeedback stress management techniques.

    PubMed

    Johnson, Kenneth A; Ruppe, Joan

    2002-01-01

    This project was conducted with a multicultural construction company in Hawaii, USA. The job duties performed included drywall and carpentry work. The following objectives were selected for this project: (a) fire prevention training and inspection of first aid equipment; (b) blood-borne pathogen training and risk evaluation; (c) ergonomic and risk evaluation intervention program; (d) electrical safety training and inspection program; (e) slips, trips, and falls safety training; (f) stress assessment and Personal Profile System; (g) safety and health program survey; (h) improving employee relations and morale by emphasizing spirituality; and (i) computer-assisted biofeedback stress management training. Results of the project indicated that observed safety hazards, reported injuries, and levels of perceived stress were reduced for the majority of the population.

  14. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  15. Regulator Loss Functions and Hierarchical Modeling for Safety Decision Making.

    PubMed

    Hatfield, Laura A; Baugh, Christine M; Azzone, Vanessa; Normand, Sharon-Lise T

    2017-07-01

    Regulators must act to protect the public when evidence indicates safety problems with medical devices. This requires complex tradeoffs among risks and benefits, which conventional safety surveillance methods do not incorporate. To combine explicit regulator loss functions with statistical evidence on medical device safety signals to improve decision making. In the Hospital Cost and Utilization Project National Inpatient Sample, we select pediatric inpatient admissions and identify adverse medical device events (AMDEs). We fit hierarchical Bayesian models to the annual hospital-level AMDE rates, accounting for patient and hospital characteristics. These models produce expected AMDE rates (a safety target), against which we compare the observed rates in a test year to compute a safety signal. We specify a set of loss functions that quantify the costs and benefits of each action as a function of the safety signal. We integrate the loss functions over the posterior distribution of the safety signal to obtain the posterior (Bayes) risk; the preferred action has the smallest Bayes risk. Using simulation and an analysis of AMDE data, we compare our minimum-risk decisions to a conventional Z score approach for classifying safety signals. The 2 rules produced different actions for nearly half of hospitals (45%). In the simulation, decisions that minimize Bayes risk outperform Z score-based decisions, even when the loss functions or hierarchical models are misspecified. Our method is sensitive to the choice of loss functions; eliciting quantitative inputs to the loss functions from regulators is challenging. A decision-theoretic approach to acting on safety signals is potentially promising but requires careful specification of loss functions in consultation with subject matter experts.
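
    The decision rule described above (integrate each action's loss function over the posterior distribution of the safety signal and take the action with the smallest Bayes risk) can be sketched as follows. The posterior draws and loss functions here are hypothetical placeholders, not the paper's hierarchical model or its elicited losses:

```python
import numpy as np

rng = np.random.default_rng(0)
# Posterior draws of the safety signal (observed minus expected AMDE rate);
# in practice these come from the hierarchical Bayesian model.
signal = rng.normal(loc=0.8, scale=0.5, size=10_000)

# Hypothetical loss functions, each mapping a signal value to a cost.
losses = {
    "no_action": lambda s: np.where(s > 0, 10.0 * s, 0.0),   # cost of ignoring a real signal
    "investigate": lambda s: 2.0 + np.where(s > 0, s, 0.0),  # fixed cost plus residual harm
}

# Posterior (Bayes) risk = expected loss over posterior draws; pick the minimum.
bayes_risk = {action: loss(signal).mean() for action, loss in losses.items()}
best_action = min(bayes_risk, key=bayes_risk.get)
print(best_action)  # → investigate
```

    Unlike a Z score threshold, this rule reacts to the whole posterior: the same point estimate can lead to different actions depending on its uncertainty and on how costly each mistake is.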

  16. Impact of model-based risk analysis for liver surgery planning.

    PubMed

    Hansen, C; Zidowitz, S; Preim, B; Stavrou, G; Oldhafer, K J; Hahn, H K

    2014-05-01

    A model-based risk analysis for oncologic liver surgery was described in previous work (Preim et al. in Proceedings of International Symposium on Computer Assisted Radiology and Surgery (CARS), Elsevier, Amsterdam, pp. 353–358, 2002; Hansen et al. Int J Comput Assist Radiol Surg 4(5):469–474, 2009). In this paper, we present an evaluation of this method. To assess whether and how the risk analysis facilitates the process of liver surgery planning, an explorative user study with 10 liver experts was conducted. The purpose was to compare and analyze their decision-making. The results of the study show that model-based risk analysis enhances the awareness of surgical risk in the planning stage. Participants preferred smaller resection volumes and agreed more on the width of safety margins when the risk analysis was available. In addition, neither the time to complete the planning task nor the confidence of participants increased when using the risk analysis. This work shows that the applied model-based risk analysis may influence important planning decisions in liver surgery. It lays a basis for further clinical evaluations and points out important fields for future research.

  17. Towards Real-time, On-board, Hardware-Supported Sensor and Software Health Management for Unmanned Aerial Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Rozier, Kristin Y.; Reinbacher, Thomas; Mengshoel, Ole J.; Mbaya, Timmy; Ippolito, Corey

    2013-01-01

    Unmanned aerial systems (UASs) can only be deployed if they can effectively complete their missions and respond to failures and uncertain environmental conditions while maintaining safety with respect to other aircraft as well as humans and property on the ground. In this paper, we design a real-time, on-board system health management (SHM) capability to continuously monitor sensors, software, and hardware components for detection and diagnosis of failures and violations of safety or performance rules during the flight of a UAS. Our approach to SHM is three-pronged, providing: (1) real-time monitoring of sensor and/or software signals; (2) signal analysis, preprocessing, and advanced on-the-fly temporal and Bayesian probabilistic fault diagnosis; (3) an unobtrusive, lightweight, read-only, low-power realization using Field Programmable Gate Arrays (FPGAs) that avoids overburdening limited computing resources or costly re-certification of flight software due to instrumentation. Our implementation provides a novel approach of combining modular building blocks, integrating responsive runtime monitoring of temporal logic system safety requirements with model-based diagnosis and Bayesian network-based probabilistic analysis. We demonstrate this approach using actual data from the NASA Swift UAS, an experimental all-electric aircraft.

  18. Computer Series, 13: Bits and Pieces, 11.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1982-01-01

    Describes computer programs (with ordering information) on various topics including, among others, modeling of thermodynamics and economics of solar energy, radioactive decay simulation, stoichiometry drill/tutorial (in Spanish), computer-generated safety quiz, medical chemistry computer game, medical biochemistry question bank, generation of…

  19. Integration of Active and Passive Safety Technologies--A Method to Study and Estimate Field Capability.

    PubMed

    Hu, Jingwen; Flannagan, Carol A; Bao, Shan; McCoy, Robert W; Siasoco, Kevin M; Barbat, Saeed

    2015-11-01

    The objective of this study is to develop a method that uses a combination of field data analysis, naturalistic driving data analysis, and computational simulations to explore the potential injury reduction capabilities of integrating passive and active safety systems in frontal impact conditions. For the purposes of this study, the active safety system is actually a driver assist (DA) feature that has the potential to reduce delta-V prior to a crash, in frontal or other crash scenarios. A field data analysis was first conducted to estimate the delta-V distribution change based on an assumption of 20% crash avoidance resulting from a pre-crash braking DA feature. Analysis of changes in driver head location during 470 hard braking events in a naturalistic driving study found that drivers' head positions were mostly in the center position before the braking onset, while the percentage of time drivers leaning forward or backward increased significantly after the braking onset. Parametric studies with a total of 4800 MADYMO simulations showed that both delta-V and occupant pre-crash posture had pronounced effects on occupant injury risks and on the optimal restraint designs. By combining the results for the delta-V and head position distribution changes, a weighted average of injury risk reduction of 17% and 48% was predicted by the 50th percentile Anthropomorphic Test Device (ATD) model and human body model, respectively, with the assumption that the restraint system can adapt to the specific delta-V and pre-crash posture. This study demonstrated the potential for further reducing occupant injury risk in frontal crashes by the integration of a passive safety system with a DA feature. Future analyses considering more vehicle models, various crash conditions, and variations of occupant characteristics, such as age, gender, weight, and height, are necessary to further investigate the potential capability of integrating passive and DA or active safety systems.

  20. Classifying health information technology patient safety related incidents - an approach used in Wales.

    PubMed

    Warm, D; Edwards, P

    2012-01-01

    Interest in the field of patient safety incident reporting and analysis with respect to Health Information Technology (HIT) has been growing over recent years as the development, implementation and reliance on HIT systems becomes ever more prevalent. One of the rationales for capturing patient safety incidents is to learn from failures in the delivery of care, and this must form part of a feedback loop which also includes analysis, investigation and monitoring. With the advent of new technologies and organizational programs of delivery, the emphasis is increasingly upon analyzing HIT incidents. This thematic review had two objectives: to test the applicability of a framework specifically designed to categorize HIT incidents, and to review the Welsh incidents as communicated via the national incident reporting system in order to understand their implications for healthcare. The incidents were those reported as IT/telecommunications failure/overload. Incidents were searched for within a national reporting system using a standardized search strategy for incidents occurring between 1st January 2009 and 31st May 2011. 149 incident reports were identified and classified. The majority (77%) were machine-related (technical problems), such as access problems; computer system down or too slow; display issues; and software malfunctions. A further 10% (n=15) of incidents were due to human-computer interaction issues, and 13% (n=19) of incidents, mainly telephone related, could not be classified using the framework being tested. On the basis of this review of incidents, it is recommended that the framework be expanded to include hardware malfunctions and the wrong record retrieved/missing data associated with a machine output error (as opposed to human error). In terms of the implications for clinical practice, the incidents reviewed highlighted critical issues, particularly access problems relating to the use of mobile technologies.

  1. [Validation of a method for notifying and monitoring medication errors in pediatrics].

    PubMed

    Guerrero-Aznar, M D; Jiménez-Mesa, E; Cotrina-Luque, J; Villalba-Moreno, A; Cumplido-Corbacho, R; Fernández-Fernández, L

    2014-12-01

    To analyze the impact of a multidisciplinary and decentralized safety committee in the pediatric management unit, and the joint implementation of a computing network application for reporting medication errors, monitoring the follow-up of the errors, and analyzing the improvements introduced. An observational, descriptive, cross-sectional, pre-post intervention study was performed. An analysis was made of medication errors reported to the central safety committee in the twelve months prior to introduction, and those reported to the decentralized safety committee in the management unit in the nine months after implementation, using the computer application, and the strategies generated by the analysis of reported errors. Outcome measures were the number of reported errors per 10,000 days of stay, the number of reported errors with harm per 10,000 days of stay, types of error, categories based on severity, stage of the process, and groups involved in the notification of medication errors. Reported medication errors increased 4.6-fold, from 7.6 notifications of medication errors per 10,000 days of stay in the pre-intervention period to 36 in the post-intervention period, rate ratio 0.21 (95% CI 0.11-0.39) (P<.001). The rate of medication errors with harm or requiring monitoring reported per 10,000 days of stay was virtually unchanged from one period to the other, rate ratio 0.77 (95% CI 0.31-1.91) (P>.05). The notification of potential errors or errors without harm per 10,000 days of stay increased 17.4-fold (rate ratio 0.005, 95% CI 0.001-0.026, P<.001). The increase in medication errors notified in the post-intervention period reflects an increase in the motivation of health professionals to report errors through this new method.
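
    The rate ratios with confidence intervals reported above follow the standard log-scale Wald construction for comparing two Poisson rates; a sketch with illustrative counts (not the study's data):

```python
import math

def rate_ratio(events1, persontime1, events2, persontime2, z=1.96):
    """Rate ratio of two event rates with a Wald-type CI on the log scale:
    exp(ln(RR) ± z * sqrt(1/events1 + 1/events2))."""
    rr = (events1 / persontime1) / (events2 / persontime2)
    se_log = math.sqrt(1 / events1 + 1 / events2)
    lo = rr * math.exp(-z * se_log)
    hi = rr * math.exp(z * se_log)
    return rr, lo, hi

# Illustrative counts only: 10 events in 100 days vs 20 events in 100 days.
rr, lo, hi = rate_ratio(10, 100, 20, 100)
print(f"RR={rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # → RR=0.50 (95% CI 0.23-1.07)
```

    A CI that excludes 1, as with the study's pre/post ratio of 0.21 (0.11-0.39), indicates a statistically significant change in the reporting rate.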

  2. An Open Source Rapid Computer Aided Control System Design Toolchain Using Scilab, Scicos and RTAI Linux

    NASA Astrophysics Data System (ADS)

    Bouchpan-Lerust-Juéry, L.

    2007-08-01

    Current and next-generation on-board computer systems tend to implement real-time embedded control applications (e.g. Attitude and Orbit Control Subsystem (AOCS), Packet Utilization Standard (PUS), spacecraft autonomy . . . ) which must meet high standards of reliability and predictability as well as safety. Meeting these requirements demands a considerable amount of effort and cost from the space software industry. The first part of this paper presents a free, Open Source, integrated solution for developing RTAI applications, from analysis, design and simulation through to direct implementation using code generation based on Open Source tools; the second part summarises the suggested approach, its results, and the conclusions for further work.

  3. Items Supporting the Hanford Internal Dosimetry Program Implementation of the IMBA Computer Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, Eugene H.; Bihl, Donald E.

    2008-01-07

    The Hanford Internal Dosimetry Program has adopted the computer code IMBA (Integrated Modules for Bioassay Analysis) as its primary code for bioassay data evaluation and dose assessment using the methodologies of ICRP Publications 60, 66, 67, 68, and 78. The adoption of this code was part of the implementation plan for the June 8, 2007 amendments to 10 CFR 835. This information release includes action items unique to IMBA that were required by PNNL quality assurance standards for implementation of safety software. Copies of the IMBA software verification test plan and the outline of the briefing given to new users are also included.

  4. Role of High-End Computing in Meeting NASA's Science and Engineering Challenges

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak

    2006-01-01

    High-End Computing (HEC) has always played a major role in meeting the modeling and simulation needs of various NASA missions. With NASA's newest 62 teraflops Columbia supercomputer, HEC is having an even greater impact within the Agency and beyond. Significant cutting-edge science and engineering simulations in the areas of space exploration, Shuttle operations, Earth sciences, and aeronautics research are already occurring on Columbia, demonstrating its ability to accelerate NASA's exploration vision. The talk will describe how the integrated supercomputing production environment is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions.

  5. A PC-based simulation of the National Transonic Facility's safety microprocessor

    NASA Technical Reports Server (NTRS)

    Thibodeaux, J. J.; Kilgore, W. A.; Balakrishna, S.

    1993-01-01

    A brief study was undertaken to demonstrate the feasibility of using a state-of-the-art, off-the-shelf, high-speed personal computer to simulate a microprocessor presently used for wind tunnel safety purposes at Langley Research Center's National Transonic Facility (NTF). Currently, there is no active display of tunnel alarm/alert safety information provided to the tunnel operators; instead, such information is periodically recorded on a process monitoring computer printout. This neither provides on-line situational information nor permits rapid identification of operational safety violations that can halt tunnel operations. It was therefore decided to simulate the existing algorithms and briefly evaluate a real-time display which could provide both position and troubleshooting information.

  6. Socio-technical issues and challenges in implementing safe patient handovers: insights from ethnographic case studies.

    PubMed

    Balka, Ellen; Tolar, Marianne; Coates, Shannon; Whitehouse, Sandra

    2013-12-01

    Ineffective handovers in patient care, including those where information loss occurs between care providers, have been identified as a risk to patient safety. Computerization of health information is often offered as a solution to improve the quality of care handovers and decrease adverse events related to patient safety. The purpose of this paper is to broaden our understanding of clinical handover as a patient safety issue, and to identify socio-technical issues which may come to bear on the success of computer-based handover tools. Three in-depth ethnographic case studies were undertaken. Field notes were transcribed and analyzed with the aid of qualitative data analysis software. Within-case analysis was performed on each case, followed by cross-case analyses. We identified five types of socio-technical issues which must be addressed if electronic handover tools are to succeed. The inter-dependencies of these issues are addressed in relation to arenas in which health care work takes place. We suggest that the contextual nature of information, ethical and medico-legal issues arising in relation to information handover, and issues related to data standards and system interoperability must be addressed if computerized health information systems are to achieve improvements in patient safety related to handovers in care. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  7. National Conference on Campus Safety (15th, University of Vermont, Burlington, June 21-26, 1968).

    ERIC Educational Resources Information Center

    Green, Jack N., Ed.

    Presentations made at the fifteenth National Conference on Campus Safety. The following topics are dealt with--(1) Occupational Health on Campus, (2) Teacher Liability in School Accidents, (3) Indoctrinating Students in Fire Safety, (4) Computer Installations Safety and Fire Protection, (5) The Design of Laboratory Buildings, (6) A Uniform System…

  8. Overview of Threats and Failure Models for Safety-Relevant Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This document presents a high-level overview of the threats to safety-relevant computer-based systems, including (1) a description of the introduction and activation of physical and logical faults; (2) the propagation of their effects; and (3) function-level and component-level error and failure mode models. These models can be used in the definition of fault hypotheses (i.e., assumptions) for threat-risk mitigation strategies. This document is a contribution to a guide currently under development that is intended to provide a general technical foundation for designers and evaluators of safety-relevant systems.

  9. Thermo-hydro-mechanical-chemical processes in fractured-porous media: Benchmarks and examples

    NASA Astrophysics Data System (ADS)

    Kolditz, O.; Shao, H.; Görke, U.; Kalbacher, T.; Bauer, S.; McDermott, C. I.; Wang, W.

    2012-12-01

    The book comprises an assembly of benchmarks and examples for porous media mechanics collected over the last twenty years. Analysis of thermo-hydro-mechanical-chemical (THMC) processes is essential to many applications in environmental engineering, such as geological waste deposition, geothermal energy utilisation, carbon capture and storage, water resources management, hydrology, and even climate change. In order to assess the feasibility as well as the safety of geotechnical applications, process-based modelling is the only tool for quantifying future scenarios. This places a huge responsibility on the reliability of computational tools. Benchmarking is an appropriate methodology to verify the quality of modelling tools based on best practices. Moreover, benchmarking and code comparison foster community efforts. The benchmark book is part of the OpenGeoSys initiative - an open source project to share knowledge and experience in environmental analysis and scientific computation.

  10. Systems Engineering and Integration (SE and I)

    NASA Technical Reports Server (NTRS)

    Chevers, ED; Haley, Sam

    1990-01-01

    The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are: interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; an enhanced automated code generation system tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low-cost avionics; cost estimation and benefits; computer-aided software engineering; computer systems and software safety; system testability; advanced avionics laboratories; and rapid prototyping. This presentation is represented by viewgraphs only.

  11. Liquefied natural gas (LNG) safety

    NASA Technical Reports Server (NTRS)

    Ordin, P. M.

    1977-01-01

    Bibliography, assembled from computer search of NASA Aerospace Safety Data Bank, including title of report, author, abstract, source, description of figures, key references, and key words or subject terms. Publication is indexed by key subjects and by authors. Items are relevant to design engineers and safety specialists.

  12. The Effectiveness of a Bicycle Safety Program for Improving Safety-Related Knowledge and Behavior in Young Elementary Students

    PubMed Central

    Glang, Ann

    2010-01-01

    Objective The purpose of this study was to evaluate the “Bike Smart” program, an eHealth software program that teaches bicycle safety behaviors to young children. Methods Participants were 206 elementary students in grades kindergarten to 3. A randomized controlled design was employed to evaluate the program, with students assigned to either the treatment condition (Bike Smart) or the control condition (a video on childhood safety). Outcome measures included computer-based knowledge items (safety rules, helmet placement, hazard discrimination) and a behavioral measure of helmet placement. Results Results demonstrated that regardless of gender, cohort, and grade, the participants in the treatment group showed greater gains than control participants in both the computer-presented knowledge items (p < .01) and the observational helmet measure (p < .05). Conclusions Findings suggest that the Bike Smart program can be a low-cost, effective component of safety training packages that include both skills-based and experiential training. PMID:19755497

  13. Road safety risk evaluation and target setting using data envelopment analysis and its extensions.

    PubMed

    Shen, Yongjun; Hermans, Elke; Brijs, Tom; Wets, Geert; Vanhoof, Koen

    2012-09-01

    Currently, comparison between countries in terms of their road safety performance is widely conducted in order to better understand one's own safety situation and to learn from those best-performing countries by indicating practical targets and formulating action programmes. In this respect, crash data such as the number of road fatalities and casualties are mostly investigated. However, the absolute numbers are not directly comparable between countries. Therefore, the concept of risk, which is defined as the ratio of road safety outcomes and some measure of exposure (e.g., the population size, the number of registered vehicles, or distance travelled), is often used in the context of benchmarking. Nevertheless, these risk indicators are not consistent in most cases. In other words, countries may have different evaluation results or ranking positions using different exposure information. In this study, data envelopment analysis (DEA) as a performance measurement technique is investigated to provide an overall perspective on a country's road safety situation, and further assess whether the road safety outcomes registered in a country correspond to the numbers that can be expected based on the level of exposure. In doing so, three model extensions are considered, which are the DEA-based road safety model (DEA-RS), the cross-efficiency method, and the categorical DEA model. Using the measures of exposure to risk as the model's input and the number of road fatalities as output, an overall road safety efficiency score is computed for the 27 European Union (EU) countries based on the DEA-RS model, and the ranking of countries in accordance with their cross-efficiency scores is evaluated. Furthermore, after applying clustering analysis to group countries with inherent similarity in their practices, the categorical DEA-RS model is adopted to identify best-performing and underperforming countries in each cluster, as well as the reference sets or benchmarks for those underperforming ones. More importantly, the extent to which each reference set could be learned from is specified, and practical yet challenging targets are given for each underperforming country, which enables policymakers to recognize the gap with those best-performing countries and further develop their own road safety policy. Copyright © 2012 Elsevier Ltd. All rights reserved.
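
A minimal sketch (not the paper's code) of the DEA intuition in the single-exposure special case: with one input (exposure) and one undesirable output (fatalities), a country's efficiency score reduces to the best (lowest) fatality rate on the frontier divided by its own rate. The full DEA-RS model combines several exposure measures at once via linear programming; all figures below are hypothetical.

```python
# Single-input road-safety efficiency: best (lowest) fatality rate in the
# sample divided by a country's own rate. A score of 1.0 means the country
# lies on the best-practice frontier. Data are hypothetical.
def road_safety_score(exposure, fatalities, o):
    """Efficiency score in (0, 1] for country index o."""
    rates = [f / e for e, f in zip(exposure, fatalities)]
    return min(rates) / rates[o]

exposure = [100.0, 50.0, 80.0]      # hypothetical exposure (e.g. billion veh-km)
fatalities = [500.0, 400.0, 400.0]  # hypothetical road fatality counts
scores = [road_safety_score(exposure, fatalities, o) for o in range(3)]
print(scores)  # the second country is dominated and scores below 1.0
```

With multiple exposure indicators the frontier is no longer a single best ratio, which is where the LP-based DEA-RS formulation comes in.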

  14. Large-Scale NASA Science Applications on the Columbia Supercluster

    NASA Technical Reports Server (NTRS)

    Brooks, Walter

    2005-01-01

    Columbia, NASA's newest 61 teraflops supercomputer that became operational late last year, is a highly integrated Altix cluster of 10,240 processors, and was named to honor the crew of the Space Shuttle lost in early 2003. Constructed in just four months, Columbia increased NASA's computing capability ten-fold, and revitalized the Agency's high-end computing efforts. Significant cutting-edge science and engineering simulations in the areas of space and Earth sciences, as well as aeronautics and space operations, are already occurring on this largest operational Linux supercomputer, demonstrating its capacity and capability to accelerate NASA's space exploration vision. The presentation will describe how an integrated environment consisting not only of next-generation systems, but also modeling and simulation, high-speed networking, parallel performance optimization, and advanced data analysis and visualization, is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions. The talk will conclude by discussing how NAS partnered with various NASA centers, other government agencies, computer industry, and academia, to create a national resource in large-scale modeling and simulation.

  15. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.

  16. Insights and Perspectives on Emerging Inputs to Weight of Evidence Determinations for Food Safety: Workshop Proceedings

    PubMed Central

    Bialk, Heidi; Llewellyn, Craig; Kretser, Alison; Canady, Richard; Lane, Richard; Barach, Jeffrey

    2013-01-01

    This workshop aimed to elucidate the contribution of computational and emerging in vitro methods to the weight of evidence used by risk assessors in food safety assessments. The following issues were discussed: using in silico and high-throughput screening (HTS) data to confirm the safety of approved food ingredients, applying in silico and HTS data in the process of assessing the safety of a new food ingredient, and utilizing in silico and HTS data in communicating the safety of food ingredients while enhancing the public’s trust in the food supply. Perspectives on integrating computational modeling and HTS assays as well as recommendations for optimizing predictive methods for risk assessment were also provided. Given the need to act quickly or proceed cautiously as new data emerge, this workshop also focused on effectively identifying a path forward in communicating in silico and in vitro data. PMID:24296863

  17. Information collection and processing of dam distortion in digital reservoir system

    NASA Astrophysics Data System (ADS)

    Liang, Yong; Zhang, Chengming; Li, Yanling; Wu, Qiulan; Ge, Pingju

    2007-06-01

    The "digital reservoir" is usually understood as describing the whole reservoir with digital information technology so that it can best serve human existence and development. Strictly speaking, the "digital reservoir" refers to describing the vast information of the reservoir across different dimensions and space-time by RS, GPS, GIS, telemetry, remote-control and virtual reality technology, based on computer, multi-media, large-scale memory and wide-band network technology, for human existence, development, daily work, life and entertainment. The core of the "digital reservoir" is to realize the intelligence and visibility of the reservoir's vast information through computers and networks. The dam is the main structure of a reservoir, and its safety concerns both the reservoir and people's safety. Safety monitoring is an important means of guaranteeing the dam's safety, controlling the dam's operation by collecting information on the dam's condition and its developing trends. Safety monitoring of the dam is the process leading from collection and processing of initial safety information to forming a safety concept in the brain. This paper mainly researches information collection and processing for the dam by digital means.

  18. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those requirements. This allows the projects leeway to meet these requirements in many forms that best suit a particular project's needs and safety risk. In other words, it tells the project what to do, not how to do it. This update also incorporated advances in the state of the practice of software safety from academia and private industry. It addresses some of the more common issues now facing software developers in the NASA environment such as the use of Commercial-Off-the-Shelf Software (COTS), Modified OTS (MOTS), Government OTS (GOTS), and reused software. A team from across NASA developed the update, and it has had NASA-wide internal reviews by software engineering, quality, safety, and project management, as well as expert external review. This presentation and paper will discuss the new NASA Software Safety Standard, its organization, and key features. It will start with a brief discussion of some NASA mission failures and incidents that had software as one of their root causes. It will then give a brief overview of the NASA Software Safety Process. This will include an overview of the key personnel responsibilities and functions that must be performed for safety-critical software.

  19. THE EFFECTS OF MAINTENANCE ACTIONS ON THE PFDavg OF SPRING OPERATED PRESSURE RELIEF VALVES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.; Gross, R.

    2014-04-01

    The safety integrity level (SIL) of equipment used in safety instrumented functions is determined by the average probability of failure on demand (PFDavg) computed at the time of periodic inspection and maintenance, i.e., the time of proof testing. The computation of PFDavg is generally based solely on predictions or estimates of the assumed constant failure rate of the equipment. However, PFDavg is also affected by maintenance actions (or lack thereof) taken by the end user. This paper shows how maintenance actions can affect the PFDavg of spring operated pressure relief valves (SOPRV) and how these maintenance actions may be accounted for in the computation of the PFDavg metric. The method provides a means for quantifying the effects of changes in maintenance practices and shows how these changes impact plant safety.

  20. The Effects of Maintenance Actions on the PFDavg of Spring Operated Pressure Relief Valves

    DOE PAGES

    Harris, S.; Gross, R.; Goble, W; ...

    2015-12-01

    The safety integrity level (SIL) of equipment used in safety instrumented functions is determined by the average probability of failure on demand (PFDavg) computed at the time of periodic inspection and maintenance, i.e., the time of proof testing. The computation of PFDavg is generally based solely on predictions or estimates of the assumed constant failure rate of the equipment. However, PFDavg is also affected by maintenance actions (or lack thereof) taken by the end user. This paper shows how maintenance actions can affect the PFDavg of spring operated pressure relief valves (SOPRV) and how these maintenance actions may be accounted for in the computation of the PFDavg metric. The method provides a means for quantifying the effects of changes in maintenance practices and shows how these changes impact plant safety.
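
A sketch of the standard low-demand PFDavg approximation these two records build on; the proof-test-coverage term is one common way maintenance quality enters the metric, not necessarily the papers' exact model, and all rates below are illustrative.

```python
# PFDavg for a single proof-tested device under the usual low-demand
# approximation: PFDavg ~= C*lambda*TI/2 + (1-C)*lambda*LT/2, where an
# imperfect proof test (coverage C < 1) leaves a residual term that
# accumulates over the mission lifetime LT. Values are illustrative.
def pfd_avg(lambda_du, proof_interval_h, proof_coverage=1.0, lifetime_h=None):
    """lambda_du: dangerous-undetected failure rate (per hour)."""
    pfd = proof_coverage * lambda_du * proof_interval_h / 2.0
    if proof_coverage < 1.0:
        if lifetime_h is None:
            raise ValueError("lifetime_h required when proof_coverage < 1")
        pfd += (1.0 - proof_coverage) * lambda_du * lifetime_h / 2.0
    return pfd

# 1e-6 /h dangerous-undetected rate, annual perfect proof test:
print(pfd_avg(1e-6, 8760))  # ~4.4e-3, i.e. within the SIL 2 band (1e-3 to 1e-2)
```

Degrading the proof-test coverage (e.g. a bench test that misses some failure modes) raises PFDavg, which is how a change in maintenance practice shows up in the metric.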

  1. An Assessment of Student Computer Ergonomic Knowledge.

    ERIC Educational Resources Information Center

    Alexander, Melody W.

    1997-01-01

    Business students (n=254) were assessed on their knowledge of computers, health and safety, radiation, workstations, and ergonomic techniques. Overall knowledge was low in all categories. In particular, they had not learned computer-use techniques. (SK)

  2. Vision 20/20: Automation and advanced computing in clinical radiation oncology.

    PubMed

    Moore, Kevin L; Kagadis, George C; McNutt, Todd R; Moiseenko, Vitali; Mutic, Sasa

    2014-01-01

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  3. Architecture-Led Safety Analysis of the Joint Multi-Role (JMR) Joint Common Architecture (JCA) Demonstration System

    DTIC Science & Technology

    2015-12-01

    relevant system components (i.e., their component type declarations) have been annotated with EMV2 error source or propagation declarations and hazard...contributors. They are recorded as EMV2 annotations for each of the ASSA. Figure 40 shows a sampling of potential hazard contributors by the functional...[2012] Leveson, N., Engineering a Safer World. MIT Press. 2012. [Parnas 1991] Parnas, D. & Madey, J. Functional Documentation for Computer Systems

  4. Strengthening National, Homeland, and Economic Security. Networking and Information Technology Research and Development Supplement to the President’s FY 2003 Budget

    DTIC Science & Technology

    2002-07-01

    Knowledge From Data ... HIGH-CONFIDENCE SOFTWARE AND SYSTEMS Reliability, Security, and Safety for...NOAA's Cessna Citation flew over the 16-acre World Trade Center site, scanning with an Optech ALSM unit. The system recorded data points from 33,000...provide the data storage and compute power for intelligence analysis, high-performance national defense systems, and critical scientific research • Large

  5. Verification Methodology of Fault-tolerant, Fail-safe Computers Applied to MAGLEV Control Computer Systems

    DOT National Transportation Integrated Search

    1993-05-01

    The Maglev control computer system should be designed to verifiably possess high reliability and safety as well as high availability to make Maglev a dependable and attractive transportation alternative to the public. A Maglev computer system has bee...

  6. Engineering the electronic health record for safety: a multi-level video-based approach to diagnosing and preventing technology-induced error arising from usability problems.

    PubMed

    Borycki, Elizabeth M; Kushniruk, Andre W; Kuwata, Shigeki; Kannry, Joseph

    2011-01-01

    Electronic health records (EHRs) promise to improve and streamline healthcare through electronic entry and retrieval of patient data. Furthermore, based on a number of studies showing their positive benefits, they promise to reduce medical error and make healthcare safer. However, a growing body of literature has clearly documented that if EHRs are not designed properly, with usability as an important goal in their design, then rather than reducing error, EHR deployment has the potential to actually increase medical error. In this paper we describe our approach to engineering (and reengineering) EHRs in order to increase their beneficial potential while at the same time improving their safety. The approach described in this paper involves an integration of the methods of usability analysis with video analysis of end users interacting with EHR systems and extends the evaluation of the usability of EHRs to include the assessment of the impact of these systems on work practices. Using clinical simulations, we analyze human-computer interaction in real healthcare settings (in a portable, low-cost and high-fidelity manner) and include both artificial and naturalistic data collection to identify potential usability problems and sources of technology-induced error prior to widespread system release. Two case studies where the methods we have developed and refined have been applied at different levels of user-computer interaction are described.

  7. Recent advances in computational structural reliability analysis methods

    NASA Astrophysics Data System (ADS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
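
The failure probabilities such methods estimate can be illustrated on the simplest limit state, g = R - S (capacity minus demand), where the linear-normal case has a closed form via the reliability index. This is a generic textbook sketch, not the paper's method, and the distribution parameters are illustrative:

```python
# Failure probability for the limit state g = R - S with R, S normal:
# exact result via the reliability index beta, cross-checked by crude
# Monte Carlo. All parameter values are illustrative.
import random
from statistics import NormalDist

mu_r, sd_r = 10.0, 1.0   # resistance (capacity), mean and std dev
mu_s, sd_s = 6.0, 1.0    # load (demand), mean and std dev

# Exact: g is normal, so Pf = Phi(-beta) with beta = E[g]/std(g)
beta = (mu_r - mu_s) / (sd_r**2 + sd_s**2) ** 0.5
pf_exact = NormalDist().cdf(-beta)

# Crude Monte Carlo estimate of P(g < 0)
rng = random.Random(1)
n = 500_000
fails = sum(rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0 for _ in range(n))
pf_mc = fails / n
print(pf_exact, pf_mc)   # both near 2.3e-3
```

For nonlinear limit states or non-normal variables the closed form disappears, which is what motivates the FORM/SORM and sampling methods the abstract surveys.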

  8. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  9. Take a Seat.

    ERIC Educational Resources Information Center

    Grant, Deborah R.

    1999-01-01

    Examines the factors involved in purchasing school furnishings that will help ensure their long-term use, safety, and ability to resist abuse. Cost and safety factors discussed include resisting trendy colors to reduce the cost of furniture matching, managing computer and office wiring for safety, considering ergonomics in the purchasing decision, and…

  10. Development and Validation of the Total HUman Model for Safety (THUMS) Toward Further Understanding of Occupant Injury Mechanisms in Precrash and During Crash.

    PubMed

    Iwamoto, Masami; Nakahira, Yuko; Kimpara, Hideyuki

    2015-01-01

    Active safety devices such as automatic emergency braking (AEB) and the precrash seat belt have the potential to further reduce the number of fatalities due to automotive accidents. However, their effectiveness should be investigated through more accurate estimation of their interaction with human bodies. Computational human body models are suitable for such investigation, especially when considering the effects of muscular tone on occupant motions and injury outcomes. However, the conventional modeling approaches, such as multibody models and detailed finite element (FE) models, have advantages and disadvantages in computational cost and injury prediction considering muscular tone effects. The objective of this study is to develop and validate a human body FE model with whole-body muscles, which can be used for detailed investigation of the interaction between human bodies and vehicular structures, including some safety devices, precrash and during a crash, at relatively low computational cost. In this study, we developed a human body FE model called THUMS (Total HUman Model for Safety) with the body size of a 50th percentile adult male (AM50) in a sitting posture. The model has anatomical structures of bones, ligaments, muscles, brain, and internal organs. The total number of elements is 281,260, which keeps computational costs relatively low. Deformable material models were assigned to all body parts. The muscle-tendon complexes were modeled by truss elements with Hill-type muscle material and seat belt elements with tension-only material. The THUMS was validated against 35 series of cadaver or volunteer test data on frontal, lateral, and rear impacts. Model validations for 15 series of cadaver test data associated with frontal impacts are presented in this article. The THUMS with a vehicle sled model was applied to investigate the effects of muscle activation on occupant kinematics and injury outcomes in specific frontal impact situations with AEB. In the validations using 5 series of cadaver test data, force-time curves predicted by the THUMS were quantitatively evaluated using correlation and analysis (CORA), which showed good or acceptable agreement with cadaver test data in most cases. The investigation of muscular effects showed that muscle activation levels and timing had significant effects on occupant kinematics and injury outcomes. Although further studies on accident injury reconstruction are needed, the THUMS has the potential to predict occupant kinematics and injury outcomes considering muscular tone effects at relatively low computational cost.
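
    CORA itself is a standardized rating tool; purely to illustrate the idea of scoring simulation-versus-test agreement of force-time curves, here is a much-simplified sketch with invented signals and an ad-hoc two-part score (this is not the CORA algorithm):

```python
import numpy as np

# Hypothetical force-time traces: simulation vs. test (units arbitrary).
t = np.linspace(0.0, 0.1, 201)
test_curve = np.exp(-((t - 0.05) / 0.012) ** 2)         # stand-in for a cadaver test pulse
sim_curve = 0.95 * np.exp(-((t - 0.052) / 0.013) ** 2)  # slightly shifted/scaled model output

def shape_score(a, b):
    """Normalized cross-correlation at zero lag: 1.0 means identical shape."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def magnitude_score(a, b):
    """Peak-magnitude ratio, folded into [0, 1]."""
    ra, rb = np.max(np.abs(a)), np.max(np.abs(b))
    return float(min(ra, rb) / max(ra, rb))

# Equal-weight combination of the two partial scores.
score = 0.5 * shape_score(sim_curve, test_curve) + 0.5 * magnitude_score(sim_curve, test_curve)
```

    The real CORA rating additionally uses inner/outer corridors and a phase-shift term, but the principle is the same: reduce "how well does the simulated curve track the test curve" to a number between 0 and 1.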

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This document comprises Pacific Northwest National Laboratory's report for Fiscal Year 1996 on research and development programs. The document contains 161 project summaries in 16 areas of research and development. The 16 areas of research and development reported on are: atmospheric sciences, biotechnology, chemical instrumentation and analysis, computer and information science, ecological science, electronics and sensors, health protection and dosimetry, hydrological and geologic sciences, marine sciences, materials science and engineering, molecular science, process science and engineering, risk and safety analysis, socio-technical systems analysis, statistics and applied mathematics, and thermal and energy systems. In addition, this report provides an overview of the research and development program, program management, program funding, and Fiscal Year 1997 projects.

  12. The U. S. Department of Energy SARP review training program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mauck, C.J.

    1988-01-01

    In support of its radioactive material packaging certification program, the U.S. Department of Energy (DOE) has established a special training workshop. The purpose of the two-week workshop is to develop skills in reviewing Safety Analysis Reports for Packagings (SARPs) and performing confirmatory analyses. The workshop, conducted by the Lawrence Livermore National Laboratory (LLNL) for DOE, is divided into two parts: methods of review and methods of analysis. The sessions covering methods of review are based on the DOE document "Packaging Review Guide for Reviewing Safety Analysis Reports for Packagings" (PRG). The sessions cover relevant DOE Orders and all areas of review in the applicable Nuclear Regulatory Commission (NRC) Regulatory Guides. The technical areas addressed include structural and thermal behavior, materials, shielding, criticality, and containment. The course sessions on methods of analysis provide hands-on experience in the use of calculational methods and codes for reviewing SARPs. Analytical techniques and computer codes are discussed and sample problems are worked. Homework is assigned each night and over the included weekend; at the conclusion, a comprehensive take-home examination is given, requiring six to ten hours to complete.

  13. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k-eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. With TSUNAMI, however, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for gaps in the validation data, or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
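
    The propagation described above can be sketched in a few lines: the response uncertainty is sqrt(S C S^T) and the system-to-system correlation coefficient (often called ck) is the covariance of the two propagated responses over the product of their standard deviations. The three-group sensitivity vectors and covariance matrix below are toy values, not real nuclear data:

```python
import numpy as np

# Toy 3-group sensitivities (relative dk/k per relative d-sigma) for an
# application and a benchmark, plus an invented relative covariance matrix C.
S_app = np.array([0.10, 0.25, 0.05])
S_bench = np.array([0.12, 0.20, 0.06])
C = np.array([[4.0, 1.0, 0.0],
              [1.0, 9.0, 0.5],
              [0.0, 0.5, 1.0]]) * 1e-4

def response_sd(S, C):
    """Propagated relative response uncertainty: sqrt(S C S^T)."""
    return float(np.sqrt(S @ C @ S))

def ck(S1, S2, C):
    """Correlation coefficient between two systems' computed responses."""
    return float((S1 @ C @ S2) / (response_sd(S1, C) * response_sd(S2, C)))

sigma_app = response_sd(S_app, C)   # shared-data uncertainty in the application
c_ab = ck(S_app, S_bench, C)        # near 1.0 here: a highly similar benchmark
```

    In the TSUNAMI workflow, benchmarks with ck near 1.0 are the ones whose observed biases are most relevant for trending to the application.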

  14. ADAM: An Accident Diagnostic, Analysis and Management System - Applications to Severe Accident Simulation and Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zavisca, M.J.; Khatib-Rahbar, M.; Esmaili, H.

    2002-07-01

    The Accident Diagnostic, Analysis and Management (ADAM) computer code has been developed as a tool for on-line applications to accident diagnostics, simulation, management and training. ADAM's severe accident simulation capabilities incorporate a balance of mechanistic, phenomenologically based models with simple parametric approaches for elements including (but not limited to) thermal hydraulics; heat transfer; fuel heatup, meltdown, and relocation; fission product release and transport; combustible gas generation and combustion; and core-concrete interaction. The overall model is defined by a relatively coarse spatial nodalization of the reactor coolant and containment systems and is advanced explicitly in time. The result is to enable much faster than real-time execution (i.e., 100 to 1000 times faster than real time on a personal computer) for on-line investigations and/or accident management training. Other features of the simulation module include provision for activation of water injection, including the Engineered Safety Features, as well as other mechanisms for the assessment of accident management and recovery strategies and the evaluation of PSA success criteria. The accident diagnostics module of ADAM uses on-line access to selected plant parameters (as measured by plant sensors) to compute the thermodynamic state of the plant and to predict various margins to safety (e.g., times to pressure vessel saturation and steam generator dryout). Rule-based logic is employed to classify the measured data as belonging to one of a number of likely scenarios based on symptoms, and a number of 'alarms' are generated to signal the state of the reactor and containment. This paper will address the features and limitations of ADAM with particular focus on accident simulation and management. (authors)
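
    The symptom-based classification can be pictured as a small rule table that maps sensor readings to alarms; the parameter names and thresholds below are entirely hypothetical, not ADAM's actual rules:

```python
# Hypothetical symptom-based rules in the spirit of ADAM's diagnostics module:
# classify plant state from a few measured parameters (names/thresholds invented).
def classify(p):
    alarms = []
    if p["rcs_pressure_mpa"] < 5.0 and p["sg_level_pct"] < 10.0:
        alarms.append("LOSS_OF_HEAT_SINK")
    if p["containment_pressure_kpa"] > 200.0:
        alarms.append("CONTAINMENT_CHALLENGE")
    if p["core_exit_temp_c"] > 650.0:
        alarms.append("INADEQUATE_CORE_COOLING")
    return alarms or ["NORMAL"]

# One snapshot of (invented) sensor data: low RCS pressure plus dry steam
# generators trips the loss-of-heat-sink alarm.
reading = {"rcs_pressure_mpa": 4.2, "sg_level_pct": 6.0,
           "containment_pressure_kpa": 120.0, "core_exit_temp_c": 420.0}
state = classify(reading)
```

    The production system would evaluate many more symptoms against validated setpoints and attach confidence to each scenario, but the control flow is this same if-then cascade over live sensor values.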

  15. An approach to model reactor core nodalization for deterministic safety analysis

    NASA Astrophysics Data System (ADS)

    Salim, Mohd Faiz; Samsudin, Mohd Rafie; Mamat @ Ibrahim, Mohd Rizal; Roslan, Ridha; Sadri, Abd Aziz; Farid, Mohd Fairus Abd

    2016-01-01

    Adopting a good nodalization strategy is essential to produce an accurate, high-quality input model for Deterministic Safety Analysis (DSA) using a System Thermal-Hydraulic (SYS-TH) computer code. The purpose of such analysis is to demonstrate compliance with regulatory requirements and to verify that the reactor behaves during normal and accident conditions as it was originally designed to. Numerous past studies have been devoted to the development of nodalization strategies for research reactors ranging from small (e.g., 250 kW) to larger ones (e.g., 30 MW). As such, this paper aims to discuss the state-of-the-art thermal-hydraulics channels to be employed in the nodalization of the RTP-TRIGA Research Reactor, specifically for the reactor core. At present, the required thermal-hydraulic parameters for the reactor core, such as core geometrical data (length, coolant flow area, hydraulic diameters, and axial power profile) and material properties (including the UZrH1.6 fuel, stainless steel clad, and graphite reflector), have been collected, analyzed, and consolidated in the Reference Database of RTP using a standardized methodology, mainly derived from the available technical documentation. Based on the available information in the database, the assumptions made in the nodalization approach and the calculations performed will be discussed and presented. The development and identification of the thermal-hydraulics channels for the reactor core will be implemented during the SYS-TH calculation using the RELAP5-3D® computer code. The activity presented in this paper is part of the development of the overall nodalization description for the RTP-TRIGA Research Reactor under the IAEA Norwegian Extra-Budgetary Programme (NOKEBP) mentoring project on Expertise Development through the Analysis of Reactor Thermal-Hydraulics for Malaysia, denoted EARTH-M.
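
    One of the geometric inputs listed above, the hydraulic diameter, follows directly from the flow area and wetted perimeter via D_h = 4A/P_w. The sketch below uses an invented triangular-lattice subchannel around a cylindrical fuel rod, not actual RTP core data:

```python
import math

# Illustrative subchannel geometry (values invented, not RTP data):
# one triangular unit cell of a rod lattice contains half a rod cross-section
# (three 60-degree sectors) and half a rod circumference of wetted wall.
pitch = 0.043   # m, rod center-to-center spacing
rod_d = 0.036   # m, fuel rod outer diameter

flow_area = (math.sqrt(3.0) / 4.0) * pitch**2 - (math.pi / 8.0) * rod_d**2
wetted_perimeter = (math.pi / 2.0) * rod_d
hydraulic_d = 4.0 * flow_area / wetted_perimeter   # D_h = 4A / P_w
```

    Values like this one, together with channel length and the axial power profile, are what each SYS-TH channel card in the nodalization ultimately encodes.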

  16. An approach to model reactor core nodalization for deterministic safety analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salim, Mohd Faiz, E-mail: mohdfaizs@tnb.com.my; Samsudin, Mohd Rafie, E-mail: rafies@tnb.com.my; Mamat Ibrahim, Mohd Rizal, E-mail: m-rizal@nuclearmalaysia.gov.my

    Adopting a good nodalization strategy is essential to produce an accurate, high-quality input model for Deterministic Safety Analysis (DSA) using a System Thermal-Hydraulic (SYS-TH) computer code. The purpose of such analysis is to demonstrate compliance with regulatory requirements and to verify that the reactor behaves during normal and accident conditions as it was originally designed to. Numerous past studies have been devoted to the development of nodalization strategies for research reactors ranging from small (e.g., 250 kW) to larger ones (e.g., 30 MW). As such, this paper aims to discuss the state-of-the-art thermal-hydraulics channels to be employed in the nodalization of the RTP-TRIGA Research Reactor, specifically for the reactor core. At present, the required thermal-hydraulic parameters for the reactor core, such as core geometrical data (length, coolant flow area, hydraulic diameters, and axial power profile) and material properties (including the UZrH1.6 fuel, stainless steel clad, and graphite reflector), have been collected, analyzed, and consolidated in the Reference Database of RTP using a standardized methodology, mainly derived from the available technical documentation. Based on the available information in the database, the assumptions made in the nodalization approach and the calculations performed will be discussed and presented. The development and identification of the thermal-hydraulics channels for the reactor core will be implemented during the SYS-TH calculation using the RELAP5-3D® computer code. The activity presented in this paper is part of the development of the overall nodalization description for the RTP-TRIGA Research Reactor under the IAEA Norwegian Extra-Budgetary Programme (NOKEBP) mentoring project on Expertise Development through the Analysis of Reactor Thermal-Hydraulics for Malaysia, denoted EARTH-M.

  17. 28 CFR 32.2 - Computation of time; filing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Computation of time; filing. 32.2 Section 32.2 Judicial Administration DEPARTMENT OF JUSTICE PUBLIC SAFETY OFFICERS' DEATH, DISABILITY, AND EDUCATIONAL ASSISTANCE BENEFIT CLAIMS General Provisions § 32.2 Computation of time; filing. (a) In computing...

  18. 28 CFR 32.2 - Computation of time; filing.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Computation of time; filing. 32.2 Section 32.2 Judicial Administration DEPARTMENT OF JUSTICE PUBLIC SAFETY OFFICERS' DEATH, DISABILITY, AND EDUCATIONAL ASSISTANCE BENEFIT CLAIMS General Provisions § 32.2 Computation of time; filing. (a) In computing...

  19. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing of the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  20. Negative Stress Margins - Are They Real?

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Lee, Darlene S.; Mohaghegh, Michael

    2011-01-01

    Advances in modeling and simulation, new finite element software, modeling engines, and powerful computers are providing opportunities to interrogate designs in a very different manner, and in more detail, than ever before. Margins of safety for various design concepts and design parameters are also often evaluated quickly from local stresses once analysis models are defined and developed. This paper suggests that not all of the negative margins of safety so evaluated are real. The structural areas where negative margins are frequently encountered are often near stress concentrations, point loads, and load discontinuities; near locations of stress singularities; in areas having large gradients but insufficient mesh density; in areas with modeling issues and errors; and in areas with connections and interfaces, two-dimensional (2D) to three-dimensional (3D) transitions, bolts and bolt modeling, and boundary conditions. Now, more than ever, structural analysts need to examine and interrogate their analysis results and perform basic sanity checks to determine whether these negative margins are real.
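
    A margin of safety in this sense is just a ratio test on stress. The sketch below (allowables, stresses, and factor of safety are all illustrative) shows how a mesh-sensitive peak stress read at a singularity can flip the sign of the margin while the far-field margin stays positive:

```python
# Margin of safety as commonly defined in aerospace stress analysis:
# MS = allowable / (factor_of_safety * applied) - 1; negative implies failure.
def margin_of_safety(allowable, applied, fs=1.4):
    return allowable / (fs * applied) - 1.0

# A peak stress sampled at a sharp re-entrant corner of a linear-elastic FE
# model is mesh-dependent: refining the mesh drives it up (and the "margin"
# down) even though the far-field stress state is unchanged.
ms_farfield = margin_of_safety(allowable=400.0, applied=250.0)  # positive
ms_singular = margin_of_safety(allowable=400.0, applied=310.0)  # negative
```

    The sanity check the paper argues for is precisely to ask whether the `applied` value feeding a negative margin is a converged, physically meaningful stress or an artifact of a singularity, point load, or coarse mesh.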

  1. Lecture Notes on Criticality Safety Validation Using MCNP & Whisper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    Training classes for nuclear criticality safety; MCNP documentation. The need for, and problems surrounding, validation of computer codes and data are considered first. Then some background for MCNP and Whisper is given: best practices for Monte Carlo criticality calculations, neutron spectra, S(α,β) thermal neutron scattering data, nuclear data sensitivities, covariance data, and correlation coefficients. Whisper is computational software designed to assist the nuclear criticality safety analyst with validation studies using the Monte Carlo radiation transport package MCNP. Whisper's methodology (benchmark selection via correlation coefficients ck and weights; extreme value theory for bias and bias uncertainty; a margin of subcriticality for nuclear data uncertainty via GLLS) and usage are discussed.

  2. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation

    PubMed Central

    Tang, Liang; Cheng, Pengle

    2017-01-01

    Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic–plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis. PMID:28746390

  3. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation.

    PubMed

    Tang, Liang; Zhang, Jinjie; Cheng, Pengle

    2017-01-01

    Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic-plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis.

  4. Divided attention in computer game play: analysis utilizing unobtrusive health monitoring.

    PubMed

    McKanna, James A; Jimison, Holly; Pavel, Misha

    2009-01-01

    Divided attention is a vital cognitive ability used in important daily activities (e.g., driving), which tends to deteriorate with age. As with Alzheimer's and other neurodegenerative conditions, treatment for divided attention problems is likely to be more effective the earlier the decline is detected. Thus, it is important to find a method for detecting changes in divided attention early in the process, for both safety and health care reasons. We present here a new method for detecting divided attention unobtrusively, using performance on a computer game designed to force players to attend to different dimensions simultaneously in order to succeed. Should this model prove to predict scores on a standard test for divided attention, it could help to detect cognitive decline earlier in our increasingly computer-involved aging population, providing treatment efficacy benefits to those who will experience cognitive decline.

  5. 2011 Annual Meeting of the Safety Pharmacology Society: an overview.

    PubMed

    Cavero, Icilio

    2012-03-01

    The keynote address of the 2011 Annual Meeting of the Safety Pharmacology Society examined the known and the still to be known on drug-induced nephrotoxicity. The nominee of the Distinguished Service Award Lecture gave an account of his career achievements, particularly in the domain of chronically instrumented animals for assessing cardiovascular safety. The value of Safety Pharmacology resides in the benefits delivered to Pharma organizations, regulators, payers and patients. Meticulous due diligence concerning compliance of Safety Pharmacology studies with best practices is an effective means to ensure that equally stringent safety criteria are applied to both in-licensed and in-house compounds. Innovative technologies of great potential for Safety Pharmacology presented at the meeting are organs on chips (lung, heart, intestine) displaying mechanical and biochemical features of native organs, electrical field potential (MEA) or impedance (xCELLigence Cardio) measurements in human induced pluripotent stem cell-derived cardiomyocytes for unveiling cardiac electrophysiological and mechanical liabilities, and functional human airway epithelium (MucilAir™) preparations with a unique 1-year shelf life for acute and chronic in vitro evaluation of drug efficacy and toxicity. Custom-designed in silico and in vitro assay platforms defining the receptorome space occupied by chemical entities facilitate, throughout the drug discovery phase, the selection of candidates with an optimized safety profile on organ function. These approaches can now be complemented by advanced computational analysis allowing the identification of compounds with receptorome, or clinically adverse effect, profiles similar to those of the drug candidate under scrutiny, extending the safety assessment to potential liability targets not captured by classical approaches. Nonclinical data supporting safety can be quite reassuring for drugs with a discovered signal of risk. However, for marketing authorization this information should be complemented by clear clinical proof of safety. The ongoing outsourcing of Regulatory Safety Pharmacology activities from large Pharmas to contract research organizations should be taken as an opportunity to establish long-overdue in-house Exploratory Safety Pharmacology units fully dedicated to the optimization of clinical candidates on organ safety.

  6. Anatomical analysis of medial branches of dorsal rami of cervical nerves for radiofrequency thermocoagulation.

    PubMed

    Kweon, Tae Dong; Kim, Ji Young; Lee, Hye Yeon; Kim, Myung Hwa; Lee, Youn-Woo

    2014-01-01

    Cervical medial branch blocks are used to treat patients with chronic neck pain. The aim of this study was to clarify the anatomical aspects of the cervical medial branches to improve the accuracy and safety of radiofrequency denervation. Twenty cervical specimens were harvested from 20 adult cadavers. The anatomical parameters of the C4-C7 cervical medial branches were measured. The 3-dimensional computed tomography reconstruction images of the bone were also analyzed. Based on cadaveric analysis, most of the cervical dorsal rami gave off 1 medial branch; however, the cervical dorsal rami gave off 2 medial branches in 27%, 15%, 2%, and 0% at the vertebral level C4, C5, C6, and C7, respectively. The diameters of the medial branches varied from 1.0 to 1.2 mm, and the average distance from the notch of inferior articular process to the medial branches was about 2 mm. Most of the bifurcation sites were located at the medial side of the posterior tubercle of the transverse process. On the analysis of 3-dimensional computed tomography reconstruction images, cervical medial branches (C4 to C6) passed through the upper 49% to 53% of a line between the tips of 2 consecutive superior articular processes (anterior line). Also, cervical medial branches passed through the upper 28% to 35% of a line between the midpoints of 2 consecutive facet joints (midline). The present anatomical study may help improve accuracy and safety during radiofrequency denervation of the cervical medial branches.

  7. 4D Animation Reconstruction from Multi-Camera Coordinates Transformation

    NASA Astrophysics Data System (ADS)

    Jhan, J. P.; Rau, J. Y.; Chou, C. M.

    2016-06-01

    Reservoir dredging issues are important for extending the life of a reservoir. The most effective and cost-reducing approach is to construct a tunnel to desilt the bottom sediment. The conventional technique is to construct a cofferdam to separate the water, construct the intake of the tunnel inside it, and remove the cofferdam afterwards. In Taiwan, the ZengWen reservoir dredging project will install an Elephant-trunk Steel Pipe (ETSP) in the water to connect the desilting tunnel without building a cofferdam. Since the installation is critical to the whole project, a 1:20 model was built to simulate the installation steps in a towing tank, i.e., launching, dragging, water injection, and sinking. To increase construction safety, a photogrammetric technique is adopted to record images during the simulation, compute the transformation parameters for dynamic analysis, and reconstruct the 4D animations. In this study, several Australis coded targets are fixed on the surface of the ETSP for automatic recognition and measurement. The camera orientations are computed by space resection, where the 3D coordinates of the coded targets are measured. Two approaches for motion parameter computation are proposed: performing a 3D conformal transformation from the coordinates of the cameras, and relative orientation computation from the orientation of a single camera. Experimental results show that the 3D conformal transformation can achieve sub-mm simulation results, and that the relative orientation computation offers flexibility for dynamic motion analysis, being easier and more efficient.
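
    The 3D conformal transformation between two sets of camera coordinates is, setting the scale term aside, the classic least-squares rigid-body fit. The sketch below is a standard SVD-based (Kabsch-style) solution on synthetic points, not the authors' implementation:

```python
import numpy as np

def rigid_transform(A, B):
    """Best-fit rotation R and translation t with B ~ R @ A + t.

    A, B: 3xN arrays of corresponding 3-D points (Kabsch/Horn via SVD)."""
    ca, cb = A.mean(axis=1, keepdims=True), B.mean(axis=1, keepdims=True)
    H = (A - ca) @ (B - cb).T                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

# Synthetic check: rotate/translate a point cloud and recover the motion.
rng = np.random.default_rng(1)
A = rng.random((3, 8))
theta = np.deg2rad(20.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([[0.1], [-0.2], [0.3]])
B = R_true @ A + t_true
R_est, t_est = rigid_transform(A, B)
```

    Applied frame by frame to the tracked coded-target coordinates, the recovered (R, t) sequence is exactly the kind of motion-parameter time history needed to drive the 4D animation.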

  8. Computed tomography, magnetic resonance, and ultrasound imaging: basic principles, glossary of terms, and patient safety.

    PubMed

    Cogbill, Thomas H; Ziegelbein, Kurt J

    2011-02-01

    The basic principles underlying computed tomography, magnetic resonance, and ultrasound are reviewed to promote better understanding of the properties and appropriate applications of these 3 common imaging modalities. A glossary of frequently used terms for each technique is appended for convenience. Risks to patient safety, including contrast-induced nephropathy, radiation-induced malignancy, and nephrogenic systemic fibrosis, are discussed.

  9. Massive parallelization of serial inference algorithms for a complex generalized linear model

    PubMed Central

    Suchard, Marc A.; Simpson, Shawn E.; Zorych, Ivan; Ryan, Patrick; Madigan, David

    2014-01-01

    Following a series of high-profile drug safety disasters in recent years, many countries are redoubling their efforts to ensure the safety of licensed medical products. Large-scale observational databases such as claims databases or electronic health record systems are attracting particular attention in this regard, but present significant methodological and computational concerns. In this paper we show how high-performance statistical computation, including graphics processing units (GPUs), relatively inexpensive and highly parallel computing devices, can enable complex methods in large databases. We focus on optimization and massive parallelization of cyclic coordinate descent approaches to fit a conditioned generalized linear model involving tens of millions of observations and thousands of predictors in a Bayesian context. We find orders-of-magnitude improvement in overall run-time. Coordinate descent approaches are ubiquitous in high-dimensional statistics, and the algorithms we propose open up exciting new methodological possibilities with the potential to significantly improve drug safety. PMID:25328363
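
    The cyclic coordinate descent idea, one coefficient updated at a time against a running residual, can be sketched for a ridge-penalized least-squares problem (a much simpler objective than the paper's conditioned GLM; the data and penalty below are invented):

```python
import numpy as np

# Minimal cyclic coordinate descent for ridge regression:
# minimize 0.5*||y - X b||^2 + 0.5*lam*||b||^2 one coordinate at a time.
rng = np.random.default_rng(2)
n, p, lam = 200, 5, 1.0
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.0, 0.5, 3.0])
y = X @ beta_true + 0.01 * rng.normal(size=n)

beta = np.zeros(p)
r = y - X @ beta                            # running residual
for _ in range(100):                        # full cycles over the coordinates
    for j in range(p):
        xj = X[:, j]
        # exact 1-D minimizer for coordinate j given the partial residual
        bj_new = xj @ (r + xj * beta[j]) / (xj @ xj + lam)
        r += xj * (beta[j] - bj_new)        # incremental residual update
        beta[j] = bj_new
```

    The per-coordinate update touches only one column of X, which is what makes the method amenable to the massive GPU parallelization across observations that the paper describes.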

  11. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Moiseenko, Vitali; Kagadis, George C.

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  12. Non-developmental item computer systems and the malicious software threat

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    The following subject areas are covered: a DOD development system - the Army Secure Operating System; non-development commercial computer systems; security, integrity, and assurance of service (SI and A); post delivery SI and A and malicious software; computer system unique attributes; positive feedback to commercial computer systems vendors; and NDI (Non-Development Item) computers and software safety.

  13. Selected computations of transonic cavity flows

    NASA Technical Reports Server (NTRS)

    Atwood, Christopher A.

    1993-01-01

    An efficient diagonal scheme implemented in an overset mesh framework has permitted the analysis of geometrically complex cavity flows via the Reynolds-averaged Navier-Stokes equations. Use of rapid hyperbolic and algebraic grid methods has allowed simple specification of critical turbulent regions with an algebraic turbulence model. Comparisons between numerical and experimental results are made in two dimensions for the following problems: a backward-facing step; a resonating cavity; and two quieted cavity configurations. In three dimensions, the flow about three early concepts of the Stratospheric Observatory for Infrared Astronomy (SOFIA) is compared to wind-tunnel data. Shedding frequencies of resolved shear layer structures are compared against experiment for the quieted cavities. The results demonstrate the progress of computational assessment of configuration safety and performance.

  14. System cost performance analysis (study 2.3). Volume 1: Executive summary. [unmanned automated payload programs and program planning

    NASA Technical Reports Server (NTRS)

    Campbell, B. H.

    1974-01-01

    A study is described which was initiated to identify and quantify the interrelationships between and within the performance, safety, cost, and schedule parameters for unmanned, automated payload programs. The result of the investigation was a systems cost/performance model which was implemented as a digital computer program and could be used to perform initial program planning, cost/performance tradeoffs, and sensitivity analyses for mission model and advanced payload studies. Program objectives and results are described briefly.

  15. Computationally optimized ECoG stimulation with local safety constraints.

    PubMed

    Guler, Seyhmus; Dannhauer, Moritz; Roig-Solvas, Biel; Gkogkidis, Alexis; Macleod, Rob; Ball, Tonio; Ojemann, Jeffrey G; Brooks, Dana H

    2018-06-01

    Direct stimulation of the cortical surface is used clinically for cortical mapping and modulation of local activity. Future applications of cortical modulation and brain-computer interfaces may also use cortical stimulation methods. One common method to deliver current is through electrocorticography (ECoG) stimulation, in which a dense array of electrodes is placed subdurally or epidurally to stimulate the cortex. However, proximity to cortical tissue limits the amount of current that can be delivered safely. It may be desirable to deliver higher current to a specific local region of interest (ROI) while limiting current to other local areas more stringently than is guaranteed by global safety limits. Two commonly used global safety constraints bound the total injected current and individual electrode currents. However, these two sets of constraints may not be sufficient to prevent high current density locally (hot-spots). In this work, we propose an efficient approach that prevents current density hot-spots in the entire brain while optimizing ECoG stimulus patterns for targeted stimulation. Specifically, we maximize the current along a particular desired directional field in the ROI while respecting three safety constraints: one on the total injected current, one on individual electrode currents, and the third on the local current density magnitude in the brain. This third set of constraints creates a computational barrier due to the huge number of constraints needed to bound the current density at every point in the entire brain. We overcome this barrier by adopting an efficient two-step approach. In the first step, the proposed method identifies the safe brain region, which cannot contain any hot-spots solely based on the global bounds on total injected current and individual electrode currents. In the second step, the proposed algorithm iteratively adjusts the stimulus pattern to arrive at a solution that exhibits no hot-spots in the remaining brain. 
We report on simulations on a realistic finite element (FE) head model with five anatomical ROIs and two desired directional fields. We also report on the effect of ROI depth and desired directional field on the focality of the stimulation. Finally, we provide an analysis of optimization runtime as a function of different safety and modeling parameters. Our results suggest that optimized stimulus patterns tend to differ from those used in clinical practice. Copyright © 2018 Elsevier Inc. All rights reserved.
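    The first two (global) safety constraints already define a small optimization problem on their own. The sketch below solves a drastically simplified, hypothetical version as a linear program: it maximizes a linearized directional objective over electrode currents subject to per-electrode bounds, a total injected current bound, and current balance. It ignores the finite element head model and the local current density constraints that are the paper's main contribution; the sensitivity vector is made up.

```python
import numpy as np
from scipy.optimize import linprog

def optimal_pattern(a, i_max, i_total):
    """Maximize a . x (current along a desired direction, linearized) over
    electrode currents x, subject to:
      sum(x) = 0                 (injected current returns through the array)
      |x_k| <= i_max             (per-electrode safety bound)
      sum(max(x, 0)) <= i_total  (total injected current bound)
    Solved as an LP via the split x = p - q with p, q >= 0."""
    n = len(a)
    c = np.concatenate([-a, a])  # linprog minimizes, so negate the objective
    A_eq = np.concatenate([np.ones(n), -np.ones(n)])[None, :]
    A_ub = np.concatenate([np.ones(n), np.zeros(n)])[None, :]
    res = linprog(c, A_ub=A_ub, b_ub=[i_total], A_eq=A_eq, b_eq=[0.0],
                  bounds=[(0.0, i_max)] * (2 * n))
    p, q = res.x[:n], res.x[n:]
    return p - q

# hypothetical sensitivity of the ROI field to each of 5 electrodes
a = np.array([0.9, 0.4, -0.1, -0.7, 0.2])
x = optimal_pattern(a, i_max=1.0, i_total=2.0)
```

    The optimum saturates the most effective source and sink electrodes, which is one reason optimized patterns can look quite different from the bipolar pairs used in clinical practice.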

  16. Redistricting Is Less Torturous When a Computer Does the Nitty-Gritty for You.

    ERIC Educational Resources Information Center

    Rust, Albert O.; Judd, Frank F.

    1984-01-01

    Describes "optimization" computer programming to aid in school redistricting. Using diverse demographic data, the computer plots district boundaries to minimize children's walking distance and maximize safety, improve racial balance, and keep enrollment within school capacity. (TE)

  17. 46 CFR 15.818 - Global Maritime Distress and Safety System (GMDSS) at-sea maintainer.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 1 2014-10-01 2014-10-01 false Global Maritime Distress and Safety System (GMDSS) at-sea maintainer. 15.818 Section 15.818 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY MERCHANT MARINE OFFICERS AND SEAMEN MANNING REQUIREMENTS Computations § 15.818 Global Maritime Distress and Safety...

  18. 46 CFR 15.817 - Global Maritime Distress and Safety System (GMDSS) radio operator.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 1 2014-10-01 2014-10-01 false Global Maritime Distress and Safety System (GMDSS) radio operator. 15.817 Section 15.817 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY MERCHANT MARINE OFFICERS AND SEAMEN MANNING REQUIREMENTS Computations § 15.817 Global Maritime Distress and Safety System...

  19. Surrogate safety measures from traffic simulation models

    DOT National Transportation Integrated Search

    2003-01-01

    This project investigates the potential for deriving surrogate measures of safety from existing microscopic traffic simulation models for intersections. The process of computing the measures in the simulation, extracting the required data, and summar...
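    Once trajectories are extracted, surrogate measures such as time-to-collision (TTC) reduce to simple kinematics. A minimal sketch for a same-lane car-following case (the function and values are illustrative, not from the report):

```python
def time_to_collision(gap_m, v_follower_ms, v_leader_ms):
    """TTC for a follower closing on a leader in the same lane.
    Defined only while the follower is faster; otherwise no collision course."""
    closing_speed = v_follower_ms - v_leader_ms
    if closing_speed <= 0:
        return float("inf")
    return gap_m / closing_speed

# follower at 20 m/s, leader at 15 m/s, 25 m apart -> 5 s to collision
ttc = time_to_collision(25.0, 20.0, 15.0)
```

    In practice the simulation would evaluate this at every time step for every interacting pair and flag conflicts below a threshold (1.5 s is a commonly used value).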

  20. Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea

    2015-09-01

    The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs is required. An important aspect is that even though computational power is steadily growing, the overall computational cost of a RISMC analysis may not be viable for certain cases. A solution being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by reducing the number of simulation runs to perform and employing surrogate models instead of the actual simulation codes. This report focuses on reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze and visualize data. In particular, we focus on surrogate models that approximate the simulation results in a much shorter time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
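    The surrogate-model idea can be illustrated with a toy stand-in for an expensive code: run the full model a few times, fit a cheap approximation, and query the approximation thereafter. The sketch below uses a one-dimensional polynomial surrogate purely for illustration; the reduced order models in RAVEN are considerably more sophisticated.

```python
import numpy as np

def expensive_simulation(x):
    """Stand-in for a long-running physics code (e.g., one RELAP-7 run)."""
    return np.sin(3.0 * x) + 0.5 * x**2

# Step 1: a small number of full simulation runs at sampled inputs
x_train = np.linspace(0.0, 2.0, 12)
y_train = expensive_simulation(x_train)

# Step 2: fit a cheap surrogate (here, a degree-6 polynomial)
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=6))

# Step 3: query the surrogate instead of the code for the many PRA samples
x_query = np.linspace(0.0, 2.0, 1000)
y_surr = surrogate(x_query)
max_err = np.max(np.abs(y_surr - expensive_simulation(x_query)))
```

    The trade the report describes is exactly this one: 12 expensive runs buy 1000 nearly free evaluations, at the price of a bounded approximation error that must be quantified before the surrogate is trusted in a safety analysis.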

  1. National survey on dose data analysis in computed tomography.

    PubMed

    Heilmaier, Christina; Treier, Reto; Merkle, Elmar Max; Alkhadi, Hatem; Weishaupt, Dominik; Schindera, Sebastian

    2018-05-28

    A nationwide survey was performed assessing current practice of dose data analysis in computed tomography (CT). All radiological departments in Switzerland were asked to participate in the on-line survey composed of 19 questions (16 multiple choice, 3 free text). It consisted of four sections: (1) general information on the department, (2) dose data analysis, (3) use of a dose management software (DMS) and (4) radiation protection activities. In total, 152 out of 241 Swiss radiological departments filled in the whole questionnaire (return rate, 63%). Seventy-nine per cent of the departments (n = 120/152) analyse dose data on a regular basis, with considerable heterogeneity in the frequency (1-2 times per year, 45%, n = 54/120; every month, 35%, n = 42/120) and method of analysis. Manual analysis is carried out by 58% (n = 70/120), compared with 42% (n = 50/120) of departments using a DMS. Purchase of a DMS is planned by 43% (n = 30/70) of the departments with manual analysis. Real-time analysis of dose data is performed by 42% (n = 21/50) of the departments with a DMS; however, residents can access the DMS in clinical routine in only 20% (n = 10/50) of the departments. An interdisciplinary dose team, which among other things communicates dose data internally (63%, n = 76/120) and externally, is already implemented in 57% (n = 68/120) of the departments. Swiss radiological departments are committed to radiation safety. However, there is high heterogeneity among them regarding the frequency and method of dose data analysis as well as the use of a DMS and radiation protection activities. • Swiss radiological departments are committed to and interested in radiation safety, as shown by the 63% return rate of the survey. • Seventy-nine per cent of departments analyse dose data on a regular basis, with differences in the frequency and method of analysis: 42% use a dose management software, while 58% currently perform manual dose data analysis; of the latter, 43% plan to buy a dose management software. • Currently, only 25% of the departments add radiation exposure data to the final CT report.

  2. Safety Assessment for the Kozloduy National Disposal Facility in Bulgaria - 13507

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biurrun, E.; Haverkamp, B.; Lazaro, A.

    2013-07-01

    Due to the early decommissioning of four Water-Water Energy Reactor (WWER) 440-V230 reactors at the Nuclear Power Plant (NPP) near the city of Kozloduy in Bulgaria, large amounts of low- and intermediate-level radioactive waste will arise much earlier than initially scheduled. In order to manage the radioactive waste from the early decommissioning, Bulgaria has intensified its efforts to provide a near surface disposal facility at Radiana with the required capacity. To this end, a project was launched and assigned in international competition to a German-Spanish consortium to provide the complete technical planning, including the preparation of the Intermediate Safety Assessment Report. Preliminary results on operational and long-term safety show compliance with the Bulgarian regulatory requirements. The long-term calculations carried out for the Radiana site are also a good example of how analysis of safety assessment results can be used for iterative improvement of the assessment, by pointing out uncertainties and areas of future investigation to reduce such uncertainties with regard to the potential radiological impact. The computer model used to estimate the long-term evolution of the future repository at Radiana predicted a maximum total annual dose for members of the critical group, approximately 80% of which is due to C-14 through a specific ingestion pathway. Based on this result and the outcome of the sensitivity analysis, existing uncertainties were evaluated and areas for reasonable future investigation to reduce these uncertainties were identified. (authors)

  3. Relating voltage and thermal safety in Li-ion battery cathodes: a high-throughput computational study.

    PubMed

    Jain, Anubhav; Hautier, Geoffroy; Ong, Shyue Ping; Dacek, Stephen; Ceder, Gerbrand

    2015-02-28

    High voltage and high thermal safety are desirable characteristics of cathode materials, but difficult to achieve simultaneously. This work uses high-throughput density functional theory computations to evaluate the link between voltage and safety (as estimated by thermodynamic O2 release temperatures) for over 1400 cathode materials. Our study indicates that a strong inverse relationship exists between voltage and safety: just over half the variance in O2 release temperature can be explained by voltage alone. We examine the effect of polyanion group, redox couple, and ratio of oxygen to counter-cation on both voltage and safety. As expected, our data demonstrates that polyanion groups improve safety when comparing compounds with similar voltages. However, a counterintuitive result of our study is that polyanion groups produce either no benefit or reduce safety when comparing compounds with the same redox couple. Using our data set, we tabulate voltages and oxidation potentials for over 105 combinations of redox couple/anion, which can be used towards the design and rationalization of new cathode materials. Overall, only a few compounds in our study, representing limited redox couple/polyanion combinations, exhibit both high voltage and high safety. We discuss these compounds in more detail as well as the opportunities for designing safe, high-voltage cathodes.
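    "Variance explained by voltage alone" corresponds to the R² of a simple regression of O2-release temperature on voltage. The sketch below computes R² on synthetic data with an inverse trend; it does not reproduce the paper's 1400-compound dataset, and the numbers are made up.

```python
import numpy as np

# Synthetic inverse voltage-safety trend plus scatter (illustrative only)
rng = np.random.default_rng(1)
voltage = rng.uniform(2.5, 5.0, size=200)
t_release = 900.0 - 120.0 * voltage + rng.normal(0.0, 80.0, size=200)

# Simple linear regression and its coefficient of determination
slope, intercept = np.polyfit(voltage, t_release, 1)
pred = slope * voltage + intercept
r2 = 1.0 - np.sum((t_release - pred) ** 2) / np.sum((t_release - np.mean(t_release)) ** 2)
```

    A negative slope with R² around one half would mirror the paper's finding that voltage alone accounts for just over half the variance in O2-release temperature.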

  4. Implicit time-integration method for simultaneous solution of a coupled non-linear system

    NASA Astrophysics Data System (ADS)

    Watson, Justin Kyle

    Historically, large physical problems have been divided into smaller problems based on the physics involved. Reactor safety analysis is no different. The problem of analyzing a nuclear reactor for design basis accidents is performed by a handful of computer codes, each solving a portion of the problem. The reactor thermal hydraulic response to an event is determined using a system code like the TRAC/RELAP Advanced Computational Engine (TRACE). The core power response to the same accident scenario is determined using a core physics code like the Purdue Advanced Core Simulator (PARCS). Containment response to the reactor depressurization in a Loss Of Coolant Accident (LOCA) type event is calculated by a separate code. Sub-channel analysis is performed with yet another computer code. This is just a sample of the computer codes used to solve the overall problem of nuclear reactor design basis accidents. Traditionally, each of these codes operates independently of the others, using only the global results from one calculation as boundary conditions for another. Industry's drive to uprate reactor power has motivated analysts to move from a conservative approach to design basis accidents towards best estimate methods. To achieve a best estimate calculation, efforts have been aimed at coupling the individual physics models to improve the accuracy of the analysis and reduce margins. The current coupling techniques are sequential in nature. During a calculation time-step, data are passed between the two codes. The individual codes solve their portion of the calculation and converge to a solution before the calculation is allowed to proceed to the next time-step. This thesis presents a fully implicit method of simultaneously solving the neutron balance equations, heat conduction equations and the constitutive fluid dynamics equations. 
It discusses the problems involved in coupling different physics phenomena within multi-physics codes and presents a solution to these problems. The thesis also outlines the basic concepts behind the nodal balance equations, heat transfer equations and thermal hydraulic equations, which are coupled to form a fully implicit nonlinear system of equations. Coupling separate physics models to solve a larger problem and improve the accuracy and efficiency of a calculation is not a new idea; however, implementing the models implicitly and solving the system simultaneously is. The application to reactor safety codes is also new, and has not previously been done with thermal hydraulics and neutronics codes on realistic applications. The coupling technique described in this thesis is applicable to other similar coupled thermal hydraulic and core physics reactor safety codes. The technique is demonstrated using coupled input decks to show that the system is solved correctly, and is then verified using two derivative test problems based on international benchmarks: the OECD/NRC Three Mile Island (TMI) Main Steam Line Break (MSLB) problem (representative of pressurized water reactor analysis) and the OECD/NRC Peach Bottom (PB) Turbine Trip (TT) benchmark (representative of boiling water reactor analysis).
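    The contrast between sequential and simultaneous coupling can be made concrete with a toy two-field system: a "power" equation with temperature feedback and a steady "temperature" energy balance, solved together by Newton's method with a finite-difference Jacobian. All equations and parameters below are illustrative, not those of TRACE or PARCS.

```python
import numpy as np

# Toy coupled "neutronics + thermal" system (illustrative only):
#   power:       P = P0 * (1 - alpha * (T - T0))   (temperature feedback)
#   temperature: T = T_in + P / (m_dot * c_p)      (steady energy balance)
P0, alpha, T0 = 3000.0, 2.0e-3, 300.0
T_in, m_dot, c_p = 290.0, 1.2e4, 4.2

def residual(u):
    P, T = u
    return np.array([P - P0 * (1.0 - alpha * (T - T0)),
                     T - (T_in + P / (m_dot * c_p))])

def newton(u, tol=1e-10, max_iter=50):
    """Fully implicit simultaneous solve: each Newton iteration updates both
    fields at once, using a finite-difference Jacobian of the coupled residual."""
    for _ in range(max_iter):
        f = residual(u)
        if np.linalg.norm(f) < tol:
            break
        J = np.empty((2, 2))
        h = 1e-6
        for j in range(2):
            du = np.zeros(2)
            du[j] = h
            J[:, j] = (residual(u + du) - f) / h
        u = u - np.linalg.solve(J, f)
    return u

P, T = newton(np.array([P0, T_in]))
```

    A sequential (operator-split) scheme would instead alternate between the two equations, holding the other field fixed; the simultaneous Newton solve drives the full coupled residual to zero in one nonlinear iteration loop.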

  5. Probabilistic Surface Characterization for Safe Landing Hazard Detection and Avoidance (HDA)

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E. (Inventor); Ivanov, Tonislav I. (Inventor); Huertas, Andres (Inventor)

    2015-01-01

    Apparatuses, systems, computer programs and methods for performing hazard detection and avoidance for landing vehicles are provided. Hazard assessment takes into consideration the geometry of the lander. Safety probabilities are computed for a plurality of pixels in a digital elevation map. The safety probabilities are combined for pixels associated with one or more aim points and orientations. A worst case probability value is assigned to each of the one or more aim points and orientations.
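    One plausible reading of the combination step can be sketched as follows: combine per-pixel safety probabilities over the pixels a lander footprint touches, then take the worst case over candidate aim points. The independence assumption and the 2x2 footprint below are illustrative, not taken from the patent.

```python
import numpy as np

def aim_point_safety(p_safe, footprint_offsets, aim_rows, aim_cols):
    """For each candidate aim point, combine per-pixel safety probabilities
    under the lander footprint (independence assumed), then return the
    worst-case (minimum) combined probability over the candidates."""
    scores = []
    for r, c in zip(aim_rows, aim_cols):
        pixels = [p_safe[r + dr, c + dc] for dr, dc in footprint_offsets]
        scores.append(np.prod(pixels))  # probability all contact pixels are safe
    return scores, min(scores)

# per-pixel safety probabilities from a digital elevation map (made up)
p_safe = np.array([[0.99, 0.98, 0.97],
                   [0.95, 0.99, 0.90],
                   [0.99, 0.99, 0.99]])
footprint = [(0, 0), (0, 1), (1, 0), (1, 1)]  # 2x2 pad contact patch
scores, worst = aim_point_safety(p_safe, footprint, [0, 1], [0, 1])
```

    Taking the geometry of the lander into account amounts to choosing the footprint offsets (and rotating them per candidate orientation) rather than scoring single pixels in isolation.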

  6. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of the increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis proposes a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans control safety-critical functions. A new systems accident model is developed, based upon modern systems theory and human cognitive processes, to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  7. [The economics of preventing psycho-social risks].

    PubMed

    Golzio, Luigi

    2014-01-01

    The aim of this essay is to present the SHIELD methodology, which helps firm management improve its risk prevention policy. It has been tested in the field with positive results. SHIELD is a cost-benefit analysis application that compares prevention and non-prevention costs arising from non-market risks. From an economic perspective, safety risks (which include psycho-social risks) are non-market risks, as they cause injuries to workers on the job. SHIELD (Social Health Indicators for Economic Labour Decisions) is the original method proposed by the author. It is a cost-benefit analysis application that compares safety prevention and non-prevention costs. The comparison allows top management to evaluate the efficiency of the current safety prevention policy, helping it answer the policy question: how much should be invested in prevention? The cost comparison is obtained by reclassifying safety costs into prevention and non-prevention costs (the latter comprising claim damages and penalty sanction costs). SHIELD has been tested empirically in four companies operating in the agribusiness sector, during research financed by the Assessorato all'Agricoltura and the regional INAIL office of the Emilia Romagna Region. Results are positive: the increase in prevention costs led to a reduction in non-prevention costs in all companies investigated, as predicted by high reliability organization theory. SHIELD can be applied to any company that must keep an accounting system by law, regardless of industry. Its application has limited costs, as SHIELD does not require changes to the accounting system: safety costs sustained by the company are simply reclassified into prevention and non-prevention costs. The comparison of these two cost categories was appreciated by the top management of the companies investigated as useful support for deciding the company's risk prevention policy. The original feature of SHIELD, compared with other cost-benefit analysis applications, is that it computes costs registered in the company's accounting system.
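    The prevention versus non-prevention comparison is simple arithmetic once the costs have been reclassified; a minimal sketch with made-up figures:

```python
def shield_comparison(prevention, claim_damages, penalties):
    """Reclassify safety costs and compare prevention vs non-prevention totals,
    in the spirit of SHIELD (all figures are illustrative, not from the study)."""
    non_prevention = claim_damages + penalties
    return {"prevention": prevention,
            "non_prevention": non_prevention,
            "net_benefit_of_prevention": non_prevention - prevention}

report = shield_comparison(prevention=120_000, claim_damages=150_000, penalties=40_000)
```

    A positive net benefit indicates that the avoided non-prevention costs exceed the prevention spend, supporting the decision to maintain or increase the prevention budget.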

  8. Cyclic Symmetry Finite Element Forced Response Analysis of a Distortion-Tolerant Fan with Boundary Layer Ingestion

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Reddy, T. S. R.; Bakhle, M. A.; Coroneos, R. M.; Stefko, G. L.; Provenza, A. J.; Duffy, K. P.

    2018-01-01

    Accurate prediction of blade vibration stress is required to determine the overall durability of a fan blade design under Boundary Layer Ingestion (BLI) distorted flow environments. The traditional single-blade modeling technique cannot accurately represent the entire rotor blade system subject to the complex dynamic loading and vibration of distorted flow conditions. A particular objective of our work was to develop a high-fidelity full-rotor aeromechanics analysis capability for a system subjected to a distorted inlet flow by applying a cyclic symmetry finite element modeling methodology. This reduction method allows computationally efficient analysis using a small periodic section of the full rotor blade system. Experimental testing using the 8-foot by 6-foot Supersonic Wind Tunnel facility at NASA Glenn Research Center was also carried out for the system designated as the Boundary Layer Ingesting Inlet/Distortion-Tolerant Fan (BLI2DTF) technology development. The results obtained from the present numerical modeling technique were evaluated against those of the wind tunnel experiment, toward establishing a computationally efficient aeromechanics analysis tool for full rotor blade systems subjected to distorted inlet flow conditions. Fairly good correlations were achieved; hence, our computational modeling techniques were fully demonstrated. The analysis showed that the safety margin requirement set in the BLI2DTF fan blade design provided a sufficient margin with respect to the operating speed range.

  9. Wenxin Keli for atrial fibrillation

    PubMed Central

    He, Zhuogen; Zheng, Minan; Xie, Pingchang; Wang, Yuanping; Yan, Xia; Deng, Dingwei

    2018-01-01

    Background: Atrial fibrillation (AF) is one of the most common cardiac arrhythmias in clinical practice. In China, Wenxin Keli (WXKL) therapy is a common treatment for AF, but its effects and safety remain uncertain. This protocol describes the methods that will be used to assess the effectiveness and safety of WXKL for the treatment of patients with AF. Methods: In March 2018, we will comprehensively search four English databases (EMBASE, the Cochrane Central Register of Controlled Trials (Cochrane Library), PubMed, and Medline) and three Chinese databases (China National Knowledge Infrastructure (CNKI), Chinese Biomedical Literature Database (CBM), and Chinese Science and Technology Periodical Database (VIP)) for randomized controlled trials (RCTs) of WXKL for AF. Therapeutic effects, assessed by restoration of sinus rhythm and P-wave dispersion (Pwd), will be the primary outcomes. We will use RevMan V.5.3 software for data synthesis when a meta-analysis is feasible. Results: This study will provide a high-quality synthesis of current evidence on WXKL for AF. Conclusion: The conclusion of our systematic review will provide evidence to judge whether WXKL is an effective intervention for patients with AF. PROSPERO registration number: PROSPERO CRD 42018082045. PMID:29702984

  10. A Simulation Framework for Battery Cell Impact Safety Modeling Using LS-DYNA

    DOE PAGES

    Marcicki, James; Zhu, Min; Bartlett, Alexander; ...

    2017-02-04

    The development process of electrified vehicles can benefit significantly from computer-aided engineering tools that predict the multiphysics response of batteries during abusive events. A coupled structural, electrical, electrochemical, and thermal model framework has been developed within the commercially available LS-DYNA software. The finite element model leverages a three-dimensional mesh structure that fully resolves the unit cell components. The mechanical solver predicts the distributed stress and strain response, with failure thresholds leading to the onset of an internal short circuit. In this implementation, an arbitrary compressive strain criterion is applied locally to each unit cell. A spatially distributed equivalent circuit model provides an empirical representation of the electrochemical response with minimal computational complexity. The thermal model provides state information to index the electrical model parameters, while simultaneously accepting irreversible and reversible sources of heat generation. The spatially distributed models of the electrical and thermal dynamics allow for the localization of current density and corresponding temperature response. The ability to predict the distributed thermal response of the cell as its stored energy is completely discharged through the short circuit enables an engineering safety assessment. A parametric analysis of an exemplary model is used to demonstrate the simulation capabilities.
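    The equivalent-circuit-plus-heating idea can be sketched with a lumped (zero-dimensional) model: discharge the cell through a short-circuit resistance while accumulating Joule heat. All parameters and the open-circuit-voltage curve below are illustrative and are not LS-DYNA inputs.

```python
# Lumped equivalent-circuit sketch of a cell discharging through an internal
# short (all parameters are made up for illustration):
#   I = V_oc(soc) / (R_int + R_short),  d(soc)/dt = -I / (3600 * C_ah)
#   dT/dt = I^2 * (R_int + R_short) / (m * c_p)   (adiabatic: all heat retained)
R_int, R_short = 0.02, 0.01        # ohm
C_ah, m, c_p = 20.0, 0.5, 900.0    # amp-hours, kg, J/(kg*K)

def v_oc(soc):
    """Crude linear open-circuit-voltage curve (illustrative)."""
    return 3.0 + 1.2 * soc

def simulate_short(dt=1.0, t_end=3600.0):
    """Explicit Euler integration of state of charge and cell temperature."""
    soc, temp = 1.0, 298.15
    for _ in range(int(t_end / dt)):
        if soc <= 0.0:
            break
        i = v_oc(soc) / (R_int + R_short)
        soc = max(soc - i * dt / (3600.0 * C_ah), 0.0)
        temp += i**2 * (R_int + R_short) * dt / (m * c_p)
    return soc, temp

soc_end, temp_end = simulate_short()
```

    Even this crude model reproduces the qualitative safety concern: fully discharging the stored energy through a low-resistance short yields a temperature rise of several hundred kelvin, which is why the spatial localization of current and temperature in the full framework matters.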

  11. Test Facilities and Experience on Space Nuclear System Developments at the Kurchatov Institute

    NASA Astrophysics Data System (ADS)

    Ponomarev-Stepnoi, Nikolai N.; Garin, Vladimir P.; Glushkov, Evgeny S.; Kompaniets, George V.; Kukharkin, Nikolai E.; Madeev, Vicktor G.; Papin, Vladimir K.; Polyakov, Dmitry N.; Stepennov, Boris S.; Tchuniyaev, Yevgeny I.; Tikhonov, Lev Ya.; Uksusov, Yevgeny I.

    2004-02-01

    The complexity of space fission systems and the rigid requirements on minimizing weight and dimensions, along with the desire to reduce development expenditures, demand experimental work whose results are used in design, safety substantiation, and licensing procedures. Experimental facilities are intended to solve the following tasks: obtaining benchmark data for computer code validation, substantiating design solutions when computational efforts are too expensive, quality control in the production process, and "iron" substantiation of criticality safety design solutions for licensing and public relations. The NARCISS and ISKRA critical facilities, and the unique ORM shielding-investigation facility at the operating OR nuclear research reactor, were created at the Kurchatov Institute to solve these tasks. The range of activities performed at these facilities within previous Russian nuclear power system programs is briefly described in the paper. This experience should be analyzed in terms of the methodological approach to developing future space nuclear systems (that analysis is beyond the scope of this paper). Because these facilities are available for experiments, a brief description of their critical assemblies and characteristics is given in this paper.

  12. Advanced Computational Thermal Fluid Physics (CTFP) and Its Assessment for Light Water Reactors and Supercritical Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.M. McEligot; K. G. Condie; G. E. McCreery

    2005-10-01

    Background: The ultimate goal of the study is the improvement of predictive methods for safety analyses and design of Generation IV reactor systems, such as supercritical water reactors (SCWR), for higher efficiency, improved performance and operation, design simplification, enhanced safety and reduced waste and cost. The objective of this Korean/US laboratory/university collaboration of coupled fundamental computational and experimental studies is to develop the supporting knowledge needed for improved predictive techniques for use in the technology development of Generation IV reactor concepts and their passive safety systems. The present study emphasizes SCWR concepts in the Generation IV program.

  13. 10 CFR 830.204 - Documented safety analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Documented safety analysis. 830.204 Section 830.204 Energy DEPARTMENT OF ENERGY NUCLEAR SAFETY MANAGEMENT Safety Basis Requirements § 830.204 Documented safety analysis... approval from DOE for the methodology used to prepare the documented safety analysis for the facility...

  14. Large Scale Experiments on Spacecraft Fire Safety

    NASA Technical Reports Server (NTRS)

    Urban, David L.; Ruff, Gary A.; Minster, Olivier; Toth, Balazs; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Legros, Guillaume; Eigenbrod, Christian; hide

    2012-01-01

    Full scale fire testing complemented by computer modelling has provided significant know-how about the risk, prevention and suppression of fire in terrestrial systems (cars, ships, planes, buildings, mines, and tunnels). In comparison, no such testing has been carried out for manned spacecraft due to the complexity, cost and risk associated with operating a long duration fire safety experiment of a relevant size in microgravity. Therefore, there is currently a gap in knowledge of fire behaviour in spacecraft. The entire body of low-gravity fire research has either been conducted in short duration ground-based microgravity facilities or has been limited to very small fuel samples. Still, the work conducted to date has shown that fire behaviour in low-gravity is very different from that in normal-gravity, with differences observed for flammability limits, ignition delay, flame spread behaviour, flame colour and flame structure. As a result, the prediction of the behaviour of fires in reduced gravity is at present not validated. To address this gap in knowledge, a collaborative international project, Spacecraft Fire Safety, has been established with its cornerstone being the development of an experiment (Fire Safety 1) to be conducted on an ISS resupply vehicle, such as the Automated Transfer Vehicle (ATV) or Orbital Cygnus after it leaves the ISS and before it enters the atmosphere. A computer modelling effort will complement the experimental effort. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew removes the need for strict containment of combustion products. This will facilitate the possibility of examining fire behaviour on a scale that is relevant to spacecraft fire safety and will provide unique data for fire model validation. This unprecedented opportunity will expand the understanding of the fundamentals of fire behaviour in spacecraft.
The experiment is being developed by an international topical team that is collaboratively defining the experiment requirements and performing supporting analysis, experimentation and technology development. This paper presents the objectives, status and concept of this project.

  15. Large Scale Experiments on Spacecraft Fire Safety

    NASA Technical Reports Server (NTRS)

    Urban, David; Ruff, Gary A.; Minster, Olivier; Fernandez-Pello, A. Carlos; Tien, James S.; Torero, Jose L.; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; hide

    2012-01-01

    Full scale fire testing complemented by computer modelling has provided significant know-how about the risk, prevention and suppression of fire in terrestrial systems (cars, ships, planes, buildings, mines, and tunnels). In comparison, no such testing has been carried out for manned spacecraft due to the complexity, cost and risk associated with operating a long duration fire safety experiment of a relevant size in microgravity. Therefore, there is currently a gap in knowledge of fire behaviour in spacecraft. The entire body of low-gravity fire research has either been conducted in short duration ground-based microgravity facilities or has been limited to very small fuel samples. Still, the work conducted to date has shown that fire behaviour in low-gravity is very different from that in normal gravity, with differences observed for flammability limits, ignition delay, flame spread behaviour, flame colour and flame structure. As a result, the prediction of the behaviour of fires in reduced gravity is at present not validated. To address this gap in knowledge, a collaborative international project, Spacecraft Fire Safety, has been established with its cornerstone being the development of an experiment (Fire Safety 1) to be conducted on an ISS resupply vehicle, such as the Automated Transfer Vehicle (ATV) or Orbital Cygnus after it leaves the ISS and before it enters the atmosphere. A computer modelling effort will complement the experimental effort. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew removes the need for strict containment of combustion products. This will facilitate the possibility of examining fire behaviour on a scale that is relevant to spacecraft fire safety and will provide unique data for fire model validation. This unprecedented opportunity will expand the understanding of the fundamentals of fire behaviour in spacecraft.
The experiment is being developed by an international topical team that is collaboratively defining the experiment requirements and performing supporting analysis, experimentation and technology development. This paper presents the objectives, status and concept of this project.

  16. System Risk Assessment and Allocation in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)

    2003-01-01

    As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools, which synthesize multidisciplinary integration, probabilistic analysis, and optimization, are needed to facilitate design decisions allowing trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high fidelity analysis of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.
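    As an illustration of the kind of probabilistic analysis such studies build on (the record does not give its models, and the limit state and numbers below are hypothetical), a minimal Monte Carlo estimate of a failure probability might look like:

```python
import math
import random

def mc_failure_probability(g, sampler, n=200_000, seed=0):
    """Crude Monte Carlo estimate of the failure probability P[g(X) <= 0]."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    failures = sum(1 for _ in range(n) if g(sampler(rng)) <= 0.0)
    return failures / n

# Hypothetical limit state: capacity R ~ N(200, 20) versus demand S ~ N(150, 10).
def sample_rs(rng):
    return (rng.gauss(200.0, 20.0), rng.gauss(150.0, 10.0))

def g_rs(x):
    r, s = x
    return r - s  # failure when demand exceeds capacity

pf = mc_failure_probability(g_rs, sample_rs)
# Exact value for this linear-normal case: Phi(-50/sqrt(500)), about 1.27e-2.
pf_exact = 0.5 * math.erfc((50.0 / math.sqrt(500.0)) / math.sqrt(2.0))
```

    Sampling is the brute-force baseline; the decoupling strategies the abstract mentions aim precisely to avoid re-running expensive coupled analyses inside such loops.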

  17. Implementation guide for monitoring work zone safety and mobility impacts

    DOT National Transportation Integrated Search

    2009-01-01

    This implementation guide describes the conceptual framework, data requirements, and computational procedures for determining the safety and mobility impacts of work zones in Texas. Researchers designed the framework and procedures to assist district...

  18. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydom, Gerhard; Bostelmann, F.

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling, and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies, and results obtained.) SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed to quantify the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies.
In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on the HTGR Uncertainty Analysis in Modelling (UAM) be implemented. This CRP is a continuation of the previous IAEA and Organization for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) international activities on Verification and Validation (V&V) of available analytical capabilities for HTGR simulation for design and safety evaluations. Within the framework of these activities, different numerical and experimental benchmark problems were performed and insight was gained about specific physics phenomena and the adequacy of analysis methods.

  19. Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Williams, Mark L

    2007-01-01

    Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.
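    TSUNAMI itself obtains sensitivity coefficients from adjoint-based perturbation theory; purely as an illustration of what a relative sensitivity coefficient S = (dk/k)/(dp/p) means, the sketch below uses central differences on a toy one-group infinite-medium model with hypothetical cross-section values:

```python
def k_inf(nu_sigma_f, sigma_a):
    """Toy one-group infinite-medium multiplication factor."""
    return nu_sigma_f / sigma_a

def rel_sensitivity(f, params, name, rel_step=1e-6):
    """Central-difference estimate of S = (dk/k) / (dp/p) for parameter `name`."""
    k0 = f(**params)
    up = dict(params); up[name] = params[name] * (1.0 + rel_step)
    dn = dict(params); dn[name] = params[name] * (1.0 - rel_step)
    return (f(**up) - f(**dn)) / (2.0 * rel_step * k0)

# hypothetical one-group constants
params = {"nu_sigma_f": 0.0350, "sigma_a": 0.0250}
S_f = rel_sensitivity(k_inf, params, "nu_sigma_f")  # analytically +1 for this model
S_a = rel_sensitivity(k_inf, params, "sigma_a")     # analytically -1 for this model
```

    For a realistic code the forward model is far too expensive to difference parameter by parameter, which is why adjoint methods (one extra solve for all sensitivities) are preferred.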

  20. Structural Reliability Analysis and Optimization: Use of Approximations

    NASA Technical Reports Server (NTRS)

    Grandhi, Ramana V.; Wang, Liping

    1999-01-01

    This report is intended for the demonstration of function approximation concepts and their applicability in reliability analysis and design. Particularly, approximations in the calculation of the safety index, failure probability and structural optimization (modification of design variables) are developed. With this scope in mind, extensive details on probability theory are avoided. Definitions relevant to the stated objectives have been taken from standard text books. The idea of function approximations is to minimize the repetitive use of computationally intensive calculations by replacing them with simpler closed-form equations, which could be nonlinear. Typically, the approximations provide good accuracy around the points where they are constructed, and they need to be periodically updated to extend their utility. There are approximations in calculating the failure probability of a limit state function. The first one, which is most commonly discussed, is how the limit state is approximated at the design point. Most of the time this could be a first-order Taylor series expansion, also known as the First Order Reliability Method (FORM), or a second-order Taylor series expansion (paraboloid), also known as the Second Order Reliability Method (SORM). From the computational procedure point of view, this step comes after the design point identification; however, the order of approximation for the probability of failure calculation is discussed first, and it is denoted by either FORM or SORM. The other approximation of interest is how the design point, or the most probable failure point (MPP), is identified. For iteratively finding this point, again the limit state is approximated. 
The accuracy and efficiency of the approximations make the search process quite practical for analysis intensive approaches such as the finite element methods; therefore, the crux of this research is to develop excellent approximations for MPP identification and also different approximations including the higher-order reliability methods (HORM) for representing the failure surface. This report is divided into several parts to emphasize different segments of the structural reliability analysis and design. Broadly, it consists of mathematical foundations, methods and applications. Chapter I discusses the fundamental definitions of the probability theory, which are mostly available in standard text books. Probability density function descriptions relevant to this work are addressed. In Chapter 2, the concept and utility of function approximation are discussed for a general application in engineering analysis. Various forms of function representations and the latest developments in nonlinear adaptive approximations are presented with comparison studies. Research work accomplished in reliability analysis is presented in Chapter 3. First, the definition of safety index and most probable point of failure are introduced. Efficient ways of computing the safety index with a fewer number of iterations is emphasized. In chapter 4, the probability of failure prediction is presented using first-order, second-order and higher-order methods. System reliability methods are discussed in chapter 5. Chapter 6 presents optimization techniques for the modification and redistribution of structural sizes for improving the structural reliability. The report also contains several appendices on probability parameters.
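    As a rough illustration of the safety-index search described above (not the report's own code), a minimal Hasofer-Lind/HL-RF iteration in standard normal space, applied to a hypothetical linear R-S limit state, might look like:

```python
import math

def hlrf_beta(g, n, tol=1e-8, max_iter=100):
    """Hasofer-Lind safety index via the HL-RF iteration, for a limit
    state g(u) defined in standard normal space (u has n components)."""
    u = [0.0] * n
    for _ in range(max_iter):
        gu = g(u)
        h = 1e-6  # forward-difference step for the gradient
        grad = []
        for i in range(n):
            up = list(u)
            up[i] += h
            grad.append((g(up) - gu) / h)
        norm2 = sum(gi * gi for gi in grad)
        scale = (sum(gi * ui for gi, ui in zip(grad, u)) - gu) / norm2
        u_new = [scale * gi for gi in grad]  # projection onto the linearized surface
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            u = u_new
            break
        u = u_new
    beta = math.sqrt(sum(ui * ui for ui in u))  # distance from origin to the MPP
    return beta, u

# Hypothetical linear limit state g = R - S with R ~ N(200, 20), S ~ N(150, 10),
# mapped to standard normal space; analytic beta = 50 / sqrt(500).
g = lambda u: (200.0 + 20.0 * u[0]) - (150.0 + 10.0 * u[1])
beta, u_star = hlrf_beta(g, 2)
pf = 0.5 * math.erfc(beta / math.sqrt(2.0))  # FORM estimate Phi(-beta)
```

    For a linear limit state FORM is exact; the report's concern is precisely how to approximate nonlinear limit states (SORM, HORM) where a single linearization at the MPP is no longer sufficient.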

  1. The Role of Computer-Assisted Technology in Post-Traumatic Orbital Reconstruction: A PRISMA-driven Systematic Review.

    PubMed

    Wan, Kelvin H; Chong, Kelvin K L; Young, Alvin L

    2015-12-08

    Post-traumatic orbital reconstruction remains a surgical challenge and requires careful preoperative planning, sound anatomical knowledge and good intraoperative judgment. Computer-assisted technology has the potential to reduce error and subjectivity in the management of these complex injuries. A systematic review of the literature was conducted to explore the emerging role of computer-assisted technologies in post-traumatic orbital reconstruction, in terms of functional and safety outcomes. We searched for articles comparing computer-assisted procedures with conventional surgery and studied outcomes on diplopia, enophthalmos, or procedure-related complications. Six observational studies with 273 orbits at a mean follow-up of 13 months were included. Three out of 4 studies reported significantly fewer patients with residual diplopia in the computer-assisted group, while only 1 of the 5 studies reported better improvement in enophthalmos in the assisted group. Types and incidence of complications were comparable. Study heterogeneity limiting statistical comparison by meta-analysis will be discussed. This review highlights the scarcity of data on computer-assisted technology in orbital reconstruction. The results suggest that computer-assisted technology may offer a potential advantage in treating diplopia, while its role in enophthalmos remains to be confirmed. Additional well-designed and powered randomized controlled trials are much needed.

  2. Computer-assisted spinal osteotomy: a technical note and report of four cases.

    PubMed

    Fujibayashi, Shunsuke; Neo, Masashi; Takemoto, Mitsuru; Ota, Masato; Nakayama, Tomitaka; Toguchida, Junya; Nakamura, Takashi

    2010-08-15

    A report of 4 cases of spinal osteotomy performed under the guidance of a computer-assisted navigation system and a technical note about the use of the navigation system for spinal osteotomy. To document the surgical technique and usefulness of computer-assisted surgery for spinal osteotomy. A computer-assisted navigation system provides accurate 3-dimensional (3D) real-time surgical information during the operation. Although there are many reports on the accuracy and usefulness of a navigation system for pedicle screw placement, there are few reports on the application for spinal osteotomy. We report on 4 complex cases including 3 solitary malignant spinal tumors and 1 spinal kyphotic deformity of ankylosing spondylitis, which were treated surgically using a computer-assisted spinal osteotomy. The surgical technique and postoperative clinical and radiologic results are presented. 3D spinal osteotomy under the guidance of a computer-assisted navigation system was performed successfully in 4 patients. All malignant tumors were resected en bloc, and the spinal deformity was corrected precisely according to the preoperative plan. Pathologic analysis confirmed the en bloc resection without tumor exposure in the 3 patients with a spinal tumor. The use of a computer-assisted navigation system will help ensure the safety and efficacy of a complex 3D spinal osteotomy.

  3. 48 CFR 1852.223-75 - Major breach of safety or security.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... of safeguarding against espionage, sabotage, crime (including computer crime), or attack. A major... of safeguarding against espionage, sabotage, crime (including computer crime), or attack. A major...

  4. 48 CFR 1852.223-75 - Major breach of safety or security.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... of safeguarding against espionage, sabotage, crime (including computer crime), or attack. A major... of safeguarding against espionage, sabotage, crime (including computer crime), or attack. A major...

  5. 48 CFR 1852.223-75 - Major breach of safety or security.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... of safeguarding against espionage, sabotage, crime (including computer crime), or attack. A major... of safeguarding against espionage, sabotage, crime (including computer crime), or attack. A major...

  6. 48 CFR 1852.223-75 - Major breach of safety or security.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... of safeguarding against espionage, sabotage, crime (including computer crime), or attack. A major... of safeguarding against espionage, sabotage, crime (including computer crime), or attack. A major...

  7. Regulatory Management: Communication About Technology-Based Innovations Can Be Improved

    DTIC Science & Technology

    2001-02-01

    locations and was built to accommodate a variety of users’ computing environments. • FDA’s Center for Food Safety and Applied Nutrition’s Voluntary...transportation communities. Food Safety Initiative Ensuring the safety of the nation’s food supply is the responsibility of an interlocking monitoring system...that watches over food production and distribution at every level of government—local, state, and national. Given the complex set of food safety laws

  8. Long-term safety assessment of trench-type surface repository at Chernobyl, Ukraine - computer model and comparison with results from simplified models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haverkamp, B.; Krone, J.; Shybetskyi, I.

    2013-07-01

    The Radioactive Waste Disposal Facility (RWDF) Buryakovka was constructed in 1986 as part of the intervention measures after the accident at Chernobyl NPP (ChNPP). Today, the surface repository for solid low and intermediate level waste (LILW) is still being operated but its maximum capacity is nearly reached. Long-existing plans for increasing the capacity of the facility shall be implemented in the framework of the European Commission INSC Programme (Instrument for Nuclear Safety Co-operation). Within the first phase of this project, DBE Technology GmbH prepared a safety analysis report of the facility in its current state (SAR) and a preliminary safety analysis report (PSAR) for a future extended facility based on the planned enlargement. In addition to a detailed mathematical model, simplified models have also been developed to verify the results of the former and enhance confidence in them. Comparison of the results shows that - depending on the boundary conditions - simplifications like modeling the multi-trench repository as one generic trench might have very limited influence on the overall results compared to the general uncertainties associated with respective long-term calculations. In addition to their value for verification of more complex models, which is important to increase confidence in the overall results, such simplified models can also offer the possibility to carry out time-consuming calculations like probabilistic calculations or detailed sensitivity analysis in an economic manner. (authors)

  9. Mobile Computing for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Swietek, Gregory E. (Technical Monitor)

    1994-01-01

    The use of commercial computer technology in specific aerospace mission applications can reduce the cost and project cycle time required for the development of special-purpose computer systems. Additionally, the pace of technological innovation in the commercial market has made new computer capabilities available for demonstrations and flight tests. Three areas of research and development being explored by the Portable Computer Technology Project at NASA Ames Research Center are the application of commercial client/server network computing solutions to crew support and payload operations, the analysis of requirements for portable computing devices, and testing of wireless data communication links as extensions to the wired network. This paper will present computer architectural solutions to portable workstation design including the use of standard interfaces, advanced flat-panel displays and network configurations incorporating both wired and wireless transmission media. It will describe the design tradeoffs used in selecting high-performance processors and memories, interfaces for communication and peripheral control, and high resolution displays. The packaging issues for safe and reliable operation aboard spacecraft and aircraft are presented. The current status of wireless data links for portable computers is discussed from a system design perspective. An end-to-end data flow model for payload science operations from the experiment flight rack to the principal investigator is analyzed using capabilities provided by the new generation of computer products. A future flight experiment onboard the Russian Mir space station will be described in detail including system configuration and function, the characteristics of the spacecraft operating environment, the flight qualification measures needed for safety review, and the specifications of the computing devices to be used in the experiment. The chosen software architecture will also be presented.
An analysis of the performance characteristics of wireless data links in the spacecraft environment will be discussed. Network performance and operation will be modeled and preliminary test results presented. A crew support application will be demonstrated in conjunction with the network metrics experiment.

  10. The home electronic media environment and parental safety concerns: relationships with outdoor time after school and over the weekend among 9-11 year old children.

    PubMed

    Wilkie, Hannah J; Standage, Martyn; Gillison, Fiona B; Cumming, Sean P; Katzmarzyk, Peter T

    2018-04-05

    Time spent outdoors is associated with higher physical activity levels among children, yet it may be threatened by parental safety concerns and the attraction of indoor sedentary pursuits. The purpose of this study was to explore the relationships between these factors and outdoor time during children's discretionary periods (i.e., after school and over the weekend). Data from 462 children aged 9-11 years old were analysed using generalised linear mixed models. The odds of spending > 1 h outdoors after school, and > 2 h outdoors on a weekend, were computed according to demographic variables, screen-based behaviours, media access, and parental safety concerns. Interactions with sex and socioeconomic status (SES) were explored. Boys, low SES participants, and children who played on their computer for < 2 h on a school day had higher odds of spending > 1 h outside after school than girls, high SES children and those playing on a computer for ≥2 h, respectively. Counterintuitive results were found for access to media devices and crime-related safety concerns, as both of these were positively associated with time spent outdoors after school. A significant interaction for traffic-related concerns*sex was found; higher road safety concerns were associated with lower odds of outdoor time after school in boys only. Age was associated with weekend outdoor time, which interacted with sex and SES; older children were more likely to spend > 2 h outside on weekends, but this was only significant among girls and high SES participants. Our results suggest that specific groups of children are less likely to spend their free time outside, and it would seem that only prolonged recreational computer use has a negative association with children's outdoor time after school. Further research is needed to explore potential underlying mechanisms, and parental safety concerns in more detail.
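    As a small aside on the statistic being modelled (the study itself fitted generalised linear mixed models), odds and odds ratios can be computed directly from group proportions; the proportions below are invented purely for illustration:

```python
def odds(p):
    """Odds corresponding to a probability p (0 < p < 1)."""
    return p / (1.0 - p)

def odds_ratio(p_group, p_reference):
    """Odds ratio of one group relative to a reference group."""
    return odds(p_group) / odds(p_reference)

# Invented illustration: suppose 60% of boys but only 40% of girls
# spend more than an hour outdoors after school.
or_boys_vs_girls = odds_ratio(0.60, 0.40)  # (0.6/0.4) / (0.4/0.6) = 2.25
```

    A mixed model adjusts such ratios for covariates and clustering, but the reported "odds of spending > 1 h outdoors" are interpreted on exactly this scale.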

  11. Space Shuttle Communications Coverage Analysis for Thermal Tile Inspection

    NASA Technical Reports Server (NTRS)

    Kroll, Quin D.; Hwu, Shian U.; Upanavage, Matthew; Boster, John P.; Chavez, Mark A.

    2009-01-01

    The space shuttle ultra-high frequency Space-to-Space Communication System has to provide adequate communication coverage for astronauts who are performing thermal tile inspection and repair on the underside of the space shuttle orbiter (SSO). Careful planning and quantitative assessment are necessary to ensure successful system operations and mission safety in this work environment. This study assesses communication systems performance for astronauts who are working in the underside, non-line-of-sight shadow region on the space shuttle. All of the space shuttle and International Space Station (ISS) transmitting antennas are blocked by the SSO structure. To ensure communication coverage at planned inspection worksites, the signal strength and link margin between the SSO/ISS antennas and the extravehicular activity astronauts, whose line-of-sight is blocked by vehicle structure, was analyzed. Investigations were performed using rigorous computational electromagnetic modeling techniques. Signal strength was obtained by computing the reflected and diffracted fields along the signal propagation paths between transmitting and receiving antennas. Radio frequency (RF) coverage was determined for thermal tile inspection and repair missions using the results of this computation. Analysis results from this paper are important in formulating the limits on reliable communication range and RF coverage at planned underside inspection and repair worksites.
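    The study relied on rigorous computational electromagnetics (reflected and diffracted fields) precisely because line-of-sight is blocked; as a baseline only, the textbook free-space link budget that such analyses refine is sketched below, with illustrative numbers not taken from the paper:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB."""
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / C)

def link_margin_db(p_tx_dbm, g_tx_dbi, g_rx_dbi, distance_m, freq_hz,
                   rx_sensitivity_dbm, extra_loss_db=0.0):
    """Received power minus receiver sensitivity; positive means the link closes."""
    p_rx = (p_tx_dbm + g_tx_dbi + g_rx_dbi
            - fspl_db(distance_m, freq_hz) - extra_loss_db)
    return p_rx - rx_sensitivity_dbm

# Illustrative UHF numbers: 1 W (30 dBm) transmitter, 100 m range at 400 MHz,
# with 20 dB of lumped blockage/diffraction loss standing in for the structure.
margin = link_margin_db(p_tx_dbm=30.0, g_tx_dbi=3.0, g_rx_dbi=0.0,
                        distance_m=100.0, freq_hz=400e6,
                        rx_sensitivity_dbm=-100.0, extra_loss_db=20.0)
```

    The hard part of the actual analysis is estimating that `extra_loss_db` term site by site, which is what the ray-tracing/diffraction computation provides.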

  12. Trends in radiology and experimental research.

    PubMed

    Sardanelli, Francesco

    2017-01-01

    European Radiology Experimental, the new journal launched by the European Society of Radiology, is placed in the context of three general and seven radiology-specific trends. After describing the impact of population aging, personalized/precision medicine, and information technology development, the article considers the following trends: the tension between subspecialties and the unity of the discipline; attention to patient safety; the challenge of reproducibility for quantitative imaging; standardized and structured reporting; search for higher levels of evidence in radiology (from diagnostic performance to patient outcome); the increasing relevance of interventional radiology; and continuous technological evolution. The new journal will publish not only studies on phantoms, cells, or animal models but also those describing development steps of imaging biomarkers or those exploring secondary end-points of large clinical trials. Moreover, consideration will be given to studies regarding: computer modelling and computer-aided detection and diagnosis; contrast materials, tracers, and theranostics; advanced image analysis; optical, molecular, hybrid and fusion imaging; radiomics and radiogenomics; three-dimensional printing, information technology, image reconstruction and post-processing, big data analysis, teleradiology, clinical decision support systems; radiobiology; radioprotection; and physics in radiology. The journal aims to establish a forum for basic science, computer and information technology, radiology, and other medical subspecialties.

  13. Comparison of computational results of the SABRE LMFBR pin bundle blockage code with data from well-instrumented out-of-pile test bundles (THORS bundles 3A and 5A)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dearing, J.F.

    The Subchannel Analysis of Blockages in Reactor Elements (SABRE) computer code, developed by the United Kingdom Atomic Energy Authority, is currently the only practical tool available for performing detailed analyses of velocity and temperature fields in the recirculating flow regions downstream of blockages in liquid-metal fast breeder reactor (LMFBR) pin bundles. SABRE is a subchannel analysis code; that is, it accurately represents the complex geometry of nuclear fuel pins arranged on a triangular lattice. The results of SABRE computational models are compared here with temperature data from two out-of-pile 19-pin test bundles from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) Facility at Oak Ridge National Laboratory. One of these bundles has a small central flow blockage (bundle 3A), while the other has a large edge blockage (bundle 5A). Values that give best agreement with experiment for the empirical thermal mixing correlation factor, FMIX, in SABRE are suggested. These values of FMIX are Reynolds-number dependent, however, indicating that the coded turbulent mixing correlation is not appropriate for wire-wrap pin bundles.

  14. 14 CFR Appendix B to Part 415 - Safety Review Document Outline

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ....0Flight Safety (§ 415.115) 4.1Initial Flight Safety Analysis 4.1.1Flight Safety Sub-Analyses, Methods, and... Analysis Data 4.2Radionuclide Data (where applicable) 4.3Flight Safety Plan 4.3.1Flight Safety Personnel 4... Safety (§ 415.117) 5.1Ground Safety Analysis Report 5.2Ground Safety Plan 6.0Launch Plans (§ 415.119 and...

  15. 14 CFR Appendix B to Part 415 - Safety Review Document Outline

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ....0Flight Safety (§ 415.115) 4.1Initial Flight Safety Analysis 4.1.1Flight Safety Sub-Analyses, Methods, and... Analysis Data 4.2Radionuclide Data (where applicable) 4.3Flight Safety Plan 4.3.1Flight Safety Personnel 4... Safety (§ 415.117) 5.1Ground Safety Analysis Report 5.2Ground Safety Plan 6.0Launch Plans (§ 415.119 and...

  16. Ensuring the validity of calculated subcritical limits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, H.K.

    1977-01-01

    The care taken at the Savannah River Laboratory and Plant to ensure the validity of calculated subcritical limits is described. Close attention is given to ANSI N16.1-1975, "Validation of Calculational Methods for Nuclear Criticality Safety." The computer codes used for criticality safety computations, which are listed and briefly described, have been placed in the SRL JOSHUA system to facilitate calculation and to reduce input errors. A driver module, KOKO, simplifies and standardizes input and links the codes together in various ways. For any criticality safety evaluation, correlations of the calculational methods are made with experiment to establish bias. Occasionally, subcritical experiments are performed expressly to provide benchmarks. Calculated subcritical limits contain an adequate but not excessive margin to allow for uncertainty in the bias. The final step in any criticality safety evaluation is the writing of a report describing the calculations and justifying the margin.
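
    The "bias plus margin" step described above can be sketched roughly as follows. A calculated k-effective is penalized by the non-conservative part of the bias and by the bias uncertainty, then compared against an administrative upper subcritical limit. The function name and every number here are invented for illustration; they are not values from the SRL report.

```python
# Hypothetical sketch of applying a calculational bias and margin to a
# computed multiplication factor before comparing it with an upper
# subcritical limit. All names and numbers are invented.

def apply_bias_and_margin(k_calc, bias, bias_uncertainty, admin_margin=0.05):
    """Return the adjusted k-effective and whether it stays subcritical."""
    # Take no credit for a conservative (negative) bias; penalize otherwise.
    k_adjusted = k_calc + max(bias, 0.0) + bias_uncertainty
    upper_subcritical_limit = 1.0 - admin_margin
    return k_adjusted, k_adjusted <= upper_subcritical_limit

k_adj, is_subcritical = apply_bias_and_margin(
    k_calc=0.92, bias=0.005, bias_uncertainty=0.01)
# k_adj = 0.935, compared against an upper subcritical limit of 0.95
```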

  17. Systems cost/performance analysis (study 2.3). Volume 2: Systems cost/performance model. [unmanned automated payload programs and program planning

    NASA Technical Reports Server (NTRS)

    Campbell, B. H.

    1974-01-01

    A methodology developed for the balanced design of spacecraft subsystems, interrelating cost, performance, safety, and schedule considerations, was refined. The methodology is a two-step process: the first step selects all hardware designs that satisfy the given performance and safety requirements; the second estimates the cost and schedule required to design, build, and operate each spacecraft design. Using this methodology to develop a systems cost/performance model allows the user of such a model to establish specific designs and the related costs and schedules. The user is able to determine the sensitivity of designs, costs, and schedules to changes in requirements. The resulting systems cost/performance model is described and implemented as a digital computer program.

  18. Evaluation of liquefaction potential for building code

    NASA Astrophysics Data System (ADS)

    Nunziata, C.; De Nisco, G.; Panza, G. F.

    2008-07-01

    The standard approach to evaluating liquefaction susceptibility is based on the estimation of a safety factor between the cyclic shear resistance to liquefaction and the earthquake-induced shear stress. Recently, an updated procedure based on shear-wave velocities (Vs) has been proposed which can be applied more easily. These methods have been applied at La Plaja beach in Catania, which experienced liquefaction during the 1693 earthquake. The detailed geotechnical and Vs information and the realistic ground motion computed for the 1693 event allowed us to compare the two approaches. The successful application of the Vs procedure, slightly modified to fit historical and safety-factor information, even though additional field performance data are needed, encourages the development of a guide for liquefaction potential analysis, based on well-defined Vs profiles, to be included in the Italian seismic code.
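
    The safety-factor approach described above can be sketched minimally as follows: liquefaction is flagged when the soil's cyclic resistance ratio (CRR) does not exceed the earthquake-induced cyclic stress ratio (CSR). A simplified Seed-Idriss-type CSR formula is used; every input number is an illustrative assumption, not data from the Catania study.

```python
# Minimal sketch of the liquefaction safety-factor check.
# FS = CRR / CSR; FS < 1 indicates susceptibility. Inputs are invented.

def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, rd=0.95):
    """Simplified CSR = 0.65 * (a_max/g) * (total/effective stress) * rd."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

csr = cyclic_stress_ratio(a_max_g=0.25, sigma_v=100.0, sigma_v_eff=60.0)  # kPa
safety_factor = 0.20 / csr          # CRR / CSR, with an assumed CRR of 0.20
liquefiable = safety_factor < 1.0   # FS below unity -> susceptible
```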

  19. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  20. 47 CFR 80.771 - Method of computing coverage.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 5 2012-10-01 2012-10-01 false Method of computing coverage. 80.771 Section 80.771 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES STATIONS IN THE MARITIME SERVICES Standards for Computing Public Coast Station VHF Coverage § 80.771 Method...

  1. 29 CFR 1921.22 - Computation of time.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Computation of time. 1921.22 Section 1921.22 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR... WORKERS' COMPENSATION ACT Miscellaneous § 1921.22 Computation of time. Sundays and holidays shall be...

  2. Application of fuzzy fault tree analysis based on modified fuzzy AHP and fuzzy TOPSIS for fire and explosion in the process industry.

    PubMed

    Yazdi, Mohammad; Korhan, Orhan; Daneshvar, Sahand

    2018-05-09

    This study aimed at establishing fault tree analysis (FTA) using expert opinion to compute the probability of an event. To find the probability of the top event (TE), the probabilities of all basic events (BEs) must be available when the FTA is drawn. When such failure data are lacking, expert judgment can be used as an alternative. The fuzzy analytical hierarchy process, as a standard technique, is used to give a specific weight to each expert, and fuzzy set theory is engaged for aggregating expert opinions. In this regard, the probabilities of the BEs are computed and, consequently, the probability of the TE is obtained using Boolean algebra. Additionally, to reduce the probability of the TE in terms of three parameters (safety consequences, cost and benefit), the importance measurement technique and a modified TOPSIS were employed. The effectiveness of the proposed approach is demonstrated with a real-life case study.
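
    The Boolean-algebra step mentioned above can be sketched for a toy fault tree with independent basic events. The tree structure TE = OR(AND(BE1, BE2), BE3) and all probabilities below are invented for illustration; they are not from the case study.

```python
# Toy sketch of propagating basic-event (BE) probabilities up a fault
# tree to the top event (TE), assuming independent BEs.
# AND gate: product of inputs; OR gate: 1 minus product of complements.

from math import prod

def and_gate(probs):
    return prod(probs)

def or_gate(probs):
    return 1.0 - prod(1.0 - p for p in probs)

p_be = {"BE1": 0.10, "BE2": 0.20, "BE3": 0.05}
p_te = or_gate([and_gate([p_be["BE1"], p_be["BE2"]]), p_be["BE3"]])
# p_te = 1 - (1 - 0.02) * (1 - 0.05) = 0.069
```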

  3. A Novel Interdisciplinary Approach to Socio-Technical Complexity

    NASA Astrophysics Data System (ADS)

    Bassetti, Chiara

    The chapter presents a novel interdisciplinary approach that integrates micro-sociological analysis into computer-vision and pattern-recognition modeling and algorithms, the purpose being to tackle socio-technical complexity at a systemic yet micro-grounded level. The approach is empirically grounded and both theoretically and analytically driven, yet systemic and multidimensional, semi-supervised and computable, and oriented towards large-scale applications. The chapter describes the proposed approach with particular attention to its sociological foundations and to its application to the analysis of a particular setting, i.e., sport-spectator crowds. Crowds, better defined as large gatherings, are almost ever-present in our societies, and capturing their dynamics is crucial. From social sciences to public safety management and emergency response, modeling and predicting large gatherings' presence and dynamics, and thus possibly preventing critical situations and being able to react to them properly, is fundamental. This is where semi-automated technologies can make the difference. The work presented in this chapter is intended as a scientific step towards such an objective.

  4. Software Design Improvements. Part 1; Software Benefits and Limitations

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The questions are: how can this hardware and software be made more reliable, and how can software quality be improved? What methodology is needed, for both large and small software products, to improve the design, and how can software be verified?

  5. Reducing Wrong Patient Selection Errors: Exploring the Design Space of User Interface Techniques

    PubMed Central

    Sopan, Awalin; Plaisant, Catherine; Powsner, Seth; Shneiderman, Ben

    2014-01-01

    Wrong patient selection errors are a major issue for patient safety; from ordering medication to performing surgery, the stakes are high. Widespread adoption of Electronic Health Record (EHR) and Computerized Provider Order Entry (CPOE) systems makes patient selection using a computer screen a frequent task for clinicians. Careful design of the user interface can help mitigate the problem by helping providers recall their patients’ identities, accurately select their names, and spot errors before orders are submitted. We propose a catalog of twenty-seven distinct user interface techniques, organized according to a task analysis. An associated video demonstrates eighteen of those techniques. EHR designers who consider a wider range of human-computer interaction techniques could reduce selection errors, but verification of efficacy is still needed. PMID:25954415

  6. On-Line Safe Flight Envelope Determination for Impaired Aircraft

    NASA Technical Reports Server (NTRS)

    Lombaerts, Thomas; Schuet, Stefan; Acosta, Diana; Kaneshige, John

    2015-01-01

    The design and simulation of an on-line algorithm which estimates the safe maneuvering envelope of aircraft is discussed in this paper. The trim envelope is estimated using probabilistic methods and efficient high-fidelity model based computations of attainable equilibrium sets. From this trim envelope, a robust reachability analysis provides the maneuverability limitations of the aircraft through an optimal control formulation. Both envelope limits are presented to the flight crew on the primary flight display. In the results section, scenarios are considered where this adaptive algorithm is capable of computing online changes to the maneuvering envelope due to impairment. Furthermore, corresponding updates to display features on the primary flight display are provided to potentially inform the flight crew of safety critical envelope alterations caused by the impairment.

  7. Reducing wrong patient selection errors: exploring the design space of user interface techniques.

    PubMed

    Sopan, Awalin; Plaisant, Catherine; Powsner, Seth; Shneiderman, Ben

    2014-01-01

    Wrong patient selection errors are a major issue for patient safety; from ordering medication to performing surgery, the stakes are high. Widespread adoption of Electronic Health Record (EHR) and Computerized Provider Order Entry (CPOE) systems makes patient selection using a computer screen a frequent task for clinicians. Careful design of the user interface can help mitigate the problem by helping providers recall their patients' identities, accurately select their names, and spot errors before orders are submitted. We propose a catalog of twenty-seven distinct user interface techniques, organized according to a task analysis. An associated video demonstrates eighteen of those techniques. EHR designers who consider a wider range of human-computer interaction techniques could reduce selection errors, but verification of efficacy is still needed.

  8. PARTIAL RESTRAINING FORCE INTRODUCTION METHOD FOR DESIGNING CONSTRUCTION COUNTERMEASURE ON ΔB METHOD

    NASA Astrophysics Data System (ADS)

    Nishiyama, Taku; Imanishi, Hajime; Chiba, Noriyuki; Ito, Takao

    Landslide or slope failure is a three-dimensional movement phenomenon, so a three-dimensional treatment makes stability easier to understand. The ΔB method (a simplified three-dimensional slope stability analysis method) is based on the limit equilibrium method and is equivalent to an approximate three-dimensional slope stability analysis that extends two-dimensional cross-section stability analysis results to assess stability. This analysis can be conducted using conventional spreadsheets or two-dimensional slope stability software. This paper describes the concept of the partial restraining force introduction method for designing construction countermeasures using the distribution of the restraining force found along survey lines, which is based on the distribution of survey-line safety factors derived from the above-stated analysis. This paper also presents the transverse distributive method of restraining force used for planning ground stabilization works on the basis of the example analysis.

  9. Evaluation of Mid-Size Male Hybrid III Models for use in Spaceflight Occupant Protection Analysis

    NASA Technical Reports Server (NTRS)

    Putnam, J.; Somers, J.; Wells, J.; Newby, N.; Currie-Gregg, N.; Lawrence, C.

    2016-01-01

    Introduction: In an effort to improve occupant safety during dynamic phases of spaceflight, the National Aeronautics and Space Administration (NASA) has worked to develop occupant protection standards for future crewed spacecraft. One key aspect of these standards is the identification of injury mechanisms through anthropometric test devices (ATDs). Within this analysis, both physical and computational ATD evaluations are required to reasonably encompass the vast range of loading conditions any spaceflight crew may encounter. In this study the accuracy of publicly available mid-size male HIII ATD finite element (FE) models is evaluated within applicable loading conditions against extensive sled testing performed on their physical counterparts. Methods: A series of sled tests were performed at Wright-Patterson Air Force Base (WPAFB) employing variations of magnitude, duration, and impact direction to encompass the dynamic loading range for expected spaceflight. FE simulations were developed to the specifications of the test setup and driven using measured acceleration profiles. Both fast and detailed FE models of the mid-size male HIII were run to quantify differences in their accuracy and thus assess the applicability of each within this field. Results: Preliminary results identify the dependence of model accuracy on loading direction, magnitude, and rate. Additionally, the accuracy of individual response metrics is shown to vary across each model within the evaluated test conditions. Causes for model inaccuracy are identified based on the observed relationships. Discussion: Computational modeling provides an essential component of the ATD injury metric evaluation used to ensure the safety of future spaceflight occupants. The assessment of current ATD models lays the groundwork for how these models can be used appropriately in the future. Identification of limitations and possible paths for improvement aids in the development of these effective analysis tools.

  10. Evaluation of Mid-Size Male Hybrid III Models for use in Spaceflight Occupant Protection Analysis

    NASA Technical Reports Server (NTRS)

    Putnam, Jacob B.; Sommers, Jeffrey T.; Wells, Jessica A.; Newby, Nathaniel J.; Currie-Gregg, Nancy J.; Lawrence, Chuck

    2016-01-01

    In an effort to improve occupant safety during dynamic phases of spaceflight, the National Aeronautics and Space Administration (NASA) has worked to develop occupant protection standards for future crewed spacecraft. One key aspect of these standards is the identification of injury mechanisms through anthropometric test devices (ATDs). Within this analysis, both physical and computational ATD evaluations are required to reasonably encompass the vast range of loading conditions any spaceflight crew may encounter. In this study the accuracy of publicly available mid-size male HIII ATD finite element (FE) models is evaluated within applicable loading conditions against extensive sled testing performed on their physical counterparts. Methods: A series of sled tests were performed at Wright-Patterson Air Force Base (WPAFB) employing variations of magnitude, duration, and impact direction to encompass the dynamic loading range for expected spaceflight. FE simulations were developed to the specifications of the test setup and driven using measured acceleration profiles. Both fast and detailed FE models of the mid-size male HIII were run to quantify differences in their accuracy and thus assess the applicability of each within this field. Results: Preliminary results identify the dependence of model accuracy on loading direction, magnitude, and rate. Additionally, the accuracy of individual response metrics is shown to vary across each model within the evaluated test conditions. Causes for model inaccuracy are identified based on the observed relationships. Discussion: Computational modeling provides an essential component of the ATD injury metric evaluation used to ensure the safety of future spaceflight occupants. The assessment of current ATD models lays the groundwork for how these models can be used appropriately in the future. Identification of limitations and possible paths for improvement aids in the development of these effective analysis tools.

  11. Games that "work": using computer games to teach alcohol-affected children about fire and street safety.

    PubMed

    Coles, Claire D; Strickland, Dorothy C; Padgett, Lynne; Bellmoff, Lynnae

    2007-01-01

    Unintentional injuries are a leading cause of death and disability for children. Those with developmental disabilities, including children affected by prenatal alcohol exposure, are at highest risk for injuries. Although teaching safety skills is recommended to prevent injury, cognitive limitations and behavioral problems characteristic of children with fetal alcohol spectrum disorder make teaching these skills challenging for parents and teachers. In the current study, 32 children, ages 4-10, diagnosed with fetal alcohol syndrome (FAS) and partial FAS, learned fire and street safety through computer games that employed "virtual worlds" to teach recommended safety skills. Children were pretested on verbal knowledge of four safety elements for both fire and street safety conditions and then randomly assigned to one condition. After playing the game until mastery, children were retested verbally and asked to "generalize" their newly acquired skills in a behavioral context. They were retested after 1 week follow-up. Children showed significantly better knowledge of the game to which they were exposed, immediately and at follow-up, and the majority (72%) was able to generalize all four steps within a behavioral setting. Results suggested that this is a highly effective method for teaching safety skills to high-risk children who have learning difficulties.

  12. Discounting the value of safety: effects of perceived risk and effort.

    PubMed

    Sigurdsson, Sigurdur O; Taylor, Matthew A; Wirth, Oliver

    2013-09-01

    Although falls from heights remain the most prevalent cause of fatalities in the construction industry, the factors affecting safety-related choices associated with work at heights are not completely understood, and better tools are needed to identify and study them. Using a computer-based task within a behavioral economics paradigm, college students were presented with a choice between two hypothetical scenarios that differed in working height and in the effort associated with retrieving and donning a safety harness. Participants were instructed to choose the scenario in which they were more likely to wear the safety harness. Based on choice patterns, switch points were identified, indicating when the perceived risk in both scenarios was equivalent. Switch points were a systematic function of working height and effort, and the quantified relation between perceived risk and effort was described well by a hyperbolic equation. Choice patterns revealed that the perceived risk of working at heights decreased as the effort to retrieve and don a safety harness increased. These results contribute to the development of a computer-based procedure for assessing risk discounting within a behavioral economics framework. Such a procedure can be used as a research tool to study factors that influence safety-related decision making, with the goal of informing more effective prevention and intervention strategies.
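
    The hyperbolic relation reported above can be sketched as V(E) = V0 / (1 + k·E): the subjective value V of donning the harness decays hyperbolically with the effort E of retrieving it. V0 and k below are hypothetical constants, not fitted values from the study.

```python
# Sketch of hyperbolic discounting of safety value with effort.
# V(E) = V0 / (1 + k * E); V0 and k are invented for illustration.

def discounted_value(v0, k, effort):
    return v0 / (1.0 + k * effort)

efforts = (0, 1, 2, 4)
values = [discounted_value(v0=1.0, k=0.5, effort=e) for e in efforts]
# value falls monotonically from 1.0 as retrieval/donning effort grows
```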

  13. The development of an information system and installation of an Internet web database for the purposes of the occupational health and safety management system.

    PubMed

    Mavrikakis, I; Mantas, J; Diomidous, M

    2007-01-01

    This paper is based on research into the possible structure of an information system for the purposes of occupational health and safety management. We initiated a questionnaire in order to gauge the potential interest of prospective users in the subject of occupational health and safety. Depicting this potential interest is vital both for the software analysis cycle and for development according to previous models. The evaluation of the results tends to create pilot applications among different enterprises. Documentation and process improvement, assured quality of services, operational support, and occupational health and safety advice are the basis of these applications. Communication and codified information among interested parties is the other target of the survey regarding health issues. Computer networks can offer such services. The network will consist of certain nodes responsible for informing executives on occupational health and safety. A web database has been installed for inserting and searching documents. The submission of files to a server and the answering of questionnaires through the web help the experts perform their activities. Based on the requirements of enterprises, we have constructed a web file server to which files are submitted so that users can retrieve those they need. Access is limited to authorized users. Digital watermarks authenticate and protect digital objects.

  14. VISPA2: a scalable pipeline for high-throughput identification and annotation of vector integration sites.

    PubMed

    Spinozzi, Giulio; Calabria, Andrea; Brasca, Stefano; Beretta, Stefano; Merelli, Ivan; Milanesi, Luciano; Montini, Eugenio

    2017-11-25

    Bioinformatics tools designed to identify lentiviral or retroviral vector insertion sites in the genome of host cells are used to address the safety and long-term efficacy of hematopoietic stem cell gene therapy applications and to study the clonal dynamics of hematopoietic reconstitution. The increasing number of gene therapy clinical trials, combined with the increasing amount of Next Generation Sequencing data aimed at identifying integration sites, requires highly accurate and efficient computational software able to correctly process "big data" in a reasonable computational time. Here we present VISPA2 (Vector Integration Site Parallel Analysis, version 2), the latest optimized computational pipeline for integration site identification and analysis, with the following features: (1) the sequence analysis for integration site processing is fully compliant with paired-end reads and includes a sequence quality filter before and after the alignment on the target genome; (2) a heuristic algorithm that reduces false-positive integration sites at the nucleotide level, limiting the impact of Polymerase Chain Reaction or trimming/alignment artifacts; (3) a classification and annotation module for integration sites; (4) a user-friendly web interface as the researcher front-end to perform integration site analyses without computational skills; (5) the speedup of all steps through parallelization (Hadoop free). We tested VISPA2 performance using simulated and real datasets of lentiviral vector integration sites, previously obtained from patients enrolled in a hematopoietic stem cell gene therapy clinical trial, and compared the results with other preexisting tools for integration site analysis. On the computational side, VISPA2 showed a > 6-fold speedup and improved precision and recall metrics (1 and 0.97, respectively) compared to previously developed computational pipelines. These performances indicate that VISPA2 is a fast, reliable and user-friendly tool for integration site analysis, which allows gene therapy integration data to be handled in a cost- and time-effective fashion. Moreover, web access to VISPA2 ( http://openserver.itb.cnr.it/vispa/ ) ensures that this complex analytical tool is accessible and easy to use for researchers. We released the source code of VISPA2 in a public repository ( https://bitbucket.org/andreacalabria/vispa2 ).
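
    The precision and recall figures quoted above (1 and 0.97) follow from standard confusion-matrix definitions. The counts below are invented so that the ratios match the published metrics; they are not taken from the paper.

```python
# Precision = TP / (TP + FP); recall = TP / (TP + FN).
# Counts are hypothetical, chosen only to reproduce the reported ratios.

def precision_recall(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

prec, rec = precision_recall(tp=97, fp=0, fn=3)
# prec = 1.0 (no false-positive sites), rec = 0.97 (3 of 100 true sites missed)
```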

  15. Food Safety Education Using an Interactive Multimedia Kiosk in a WIC Setting: Correlates of Client Satisfaction and Practical Issues

    ERIC Educational Resources Information Center

    Trepka, Mary Jo; Newman, Frederick L.; Huffman, Fatma G.; Dixon, Zisca

    2010-01-01

    Objective: To assess acceptability of food safety education delivered by interactive multimedia (IMM) in a Supplemental Nutrition Program for Women, Infants and Children Program (WIC) clinic. Methods: Female clients or caregivers (n = 176) completed the food-handling survey; then an IMM food safety education program on a computer kiosk.…

  16. Delivering Food Safety Education to Middle School Students Using a Web-Based, Interactive, Multimedia, Computer Program

    ERIC Educational Resources Information Center

    Lynch, Rebecca A.; Steen, M. Dale; Pritchard, Todd J.; Buzzell, Paul R.; Pintauro, Stephen J.

    2008-01-01

    More than 76 million persons become ill from foodborne pathogens in the United States each year. To reduce these numbers, food safety education efforts need to be targeted at not only adults, but school children as well. The middle school grades are ideal for integrating food safety education into the curriculum while simultaneously contributing…

  17. 75 FR 17604 - Federal Motor Vehicle Safety Standards; Roof Crush Resistance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-07

    ... Safety Analysis & Forensic Engineering, LLC (SAFE) brought to our attention errors in the preamble that incorrectly attributed to it the comments of another organization, Safety Analysis, Inc. Both of these... Safety Analysis, Inc. SAFE noted that there is no affiliation between SAFE and Safety Analysis, Inc. and...

  18. 77 FR 50720 - Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Test Documentation for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1207, ``Test Documentation for Digital... practices for test documentation for software and computer systems as described in the Institute of...

  19. 20 CFR 410.510 - Computation of benefits.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Computation of benefits. 410.510 Section 410.510 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL COAL MINE HEALTH AND SAFETY ACT OF 1969, TITLE IV-BLACK LUNG BENEFITS (1969- ) Payment of Benefits § 410.510 Computation of benefits. (a) Basic...

  20. 14 CFR 415.115 - Flight safety.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Flight safety. 415.115 Section 415.115... From a Non-Federal Launch Site § 415.115 Flight safety. (a) Flight safety analysis. An applicant's safety review document must describe each analysis method employed to meet the flight safety analysis...

  1. 14 CFR 415.115 - Flight safety.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight safety. 415.115 Section 415.115... From a Non-Federal Launch Site § 415.115 Flight safety. (a) Flight safety analysis. An applicant's safety review document must describe each analysis method employed to meet the flight safety analysis...

  2. 14 CFR 415.115 - Flight safety.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Flight safety. 415.115 Section 415.115... From a Non-Federal Launch Site § 415.115 Flight safety. (a) Flight safety analysis. An applicant's safety review document must describe each analysis method employed to meet the flight safety analysis...

  3. 14 CFR 415.115 - Flight safety.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Flight safety. 415.115 Section 415.115... From a Non-Federal Launch Site § 415.115 Flight safety. (a) Flight safety analysis. An applicant's safety review document must describe each analysis method employed to meet the flight safety analysis...

  4. 14 CFR 415.115 - Flight safety.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Flight safety. 415.115 Section 415.115... From a Non-Federal Launch Site § 415.115 Flight safety. (a) Flight safety analysis. An applicant's safety review document must describe each analysis method employed to meet the flight safety analysis...

  5. Development of a medical information system that minimizes staff workload and secures system safety at a small medical institution

    NASA Astrophysics Data System (ADS)

    Haneda, Kiyofumi; Koyama, Tadashi

    2005-04-01

    We developed a secure system that minimizes staff workload and secures the safety of a medical information system. In this study, we assess the legal security requirements and the risks arising from the use of digitized data. We then analyze security measures for ways of reducing these risks. In the analysis, not only safety but also the cost of security measures and ease of operability are taken into consideration. Finally, we assess the effectiveness of the security measures by employing our system in a small-sized medical institution. As a result of the current study, we developed and implemented several security measures in our system, such as authentication, cryptography, data backup, and the secure sockets layer protocol (SSL). In conclusion, the cost of introducing and maintaining a system is one of the primary difficulties with its employment by a small-sized institution. However, with recent reductions in the price of computers, and certain advantages of small-sized medical institutions, the development of an efficient system configuration has become possible.

  6. A polynomial chaos approach to the analysis of vehicle dynamics under uncertainty

    NASA Astrophysics Data System (ADS)

    Kewlani, Gaurav; Crawford, Justin; Iagnemma, Karl

    2012-05-01

    The ability of ground vehicles to quickly and accurately analyse their dynamic response to a given input is critical to their safety and efficient autonomous operation. In field conditions, significant uncertainty is associated with terrain and/or vehicle parameter estimates, and this uncertainty must be considered in the analysis of vehicle motion dynamics. Here, polynomial chaos approaches that explicitly consider parametric uncertainty during modelling of vehicle dynamics are presented. They are shown to be computationally more efficient than the standard Monte Carlo scheme, and experimental results compared with the simulation results performed on ANVEL (a vehicle simulator) indicate that the method can be utilised for efficient and accurate prediction of vehicle motion in realistic scenarios.
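
    The efficiency claim above can be illustrated with a minimal non-intrusive polynomial-chaos-flavoured sketch: the mean of a model response with one Gaussian-uncertain parameter is recovered from a handful of Gauss-Hermite quadrature points, where plain Monte Carlo needs many samples. The quadratic "response" model is an invented stand-in, not the paper's vehicle model or ANVEL.

```python
# Compare a 5-point probabilists' Gauss-Hermite estimate of E[f(X)],
# X ~ N(0, 1), against a large Monte Carlo sample. Model is invented.

import numpy as np

def response(x):
    # toy response, e.g. a load metric vs. a normalized terrain parameter
    return 2.0 + 0.5 * x + 0.3 * x ** 2

# probabilists' Gauss-Hermite rule (weight exp(-x^2 / 2))
pts, wts = np.polynomial.hermite_e.hermegauss(5)
mean_quad = np.sum(wts * response(pts)) / np.sqrt(2.0 * np.pi)
# exact mean is 2.0 + 0.3 * E[x^2] = 2.3, hit with only 5 model evaluations

rng = np.random.default_rng(0)
mean_mc = response(rng.standard_normal(100_000)).mean()  # ~1e5 samples needed
```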

  7. Deterministic Local Sensitivity Analysis of Augmented Systems - II: Applications to the QUENCH-04 Experiment Using the RELAP5/MOD3.2 Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ionescu-Bujor, Mihaela; Jin Xuezhou; Cacuci, Dan G.

    2005-09-15

    The adjoint sensitivity analysis procedure for augmented systems for application to the RELAP5/MOD3.2 code system is illustrated. Specifically, the adjoint sensitivity model corresponding to the heat structure models in RELAP5/MOD3.2 is derived and subsequently augmented to the two-fluid adjoint sensitivity model (ASM-REL/TF). The end product, called ASM-REL/TFH, comprises the complete adjoint sensitivity model for the coupled fluid dynamics/heat structure packages of the large-scale simulation code RELAP5/MOD3.2. The ASM-REL/TFH model is validated by computing sensitivities to the initial conditions for various time-dependent temperatures in the test bundle of the Quench-04 reactor safety experiment. This experiment simulates the reflooding with water of uncovered, degraded fuel rods, clad with material (Zircaloy-4) that has the same composition and size as that used in typical pressurized water reactors. The most important response for the Quench-04 experiment is the time evolution of the cladding temperature of heated fuel rods. The ASM-REL/TFH model is subsequently used to perform an illustrative sensitivity analysis of this and other time-dependent temperatures within the bundle. The results computed by using the augmented adjoint sensitivity system, ASM-REL/TFH, highlight the reliability, efficiency, and usefulness of the adjoint sensitivity analysis procedure for computing time-dependent sensitivities.
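The adjoint idea can be sketched on a toy heat-transfer model: integrate a discrete adjoint backward in time to obtain the sensitivity of a final temperature to the initial condition, then cross-check against a finite difference on the forward model. The cooling equation and all parameter values below are illustrative stand-ins, not the RELAP5 heat-structure equations.

```python
# Forward model (explicit Euler for dT/dt = -k (T - T_env)):
#   T[n+1] = T[n] + dt * (-k) * (T[n] - T_env)
# Response R = T[N]; we want dR/dT0, the sensitivity to the initial condition.
k, T_env, dt, N = 0.5, 300.0, 0.01, 200

def forward(T0):
    T = T0
    for _ in range(N):
        T = T + dt * (-k) * (T - T_env)
    return T

# Discrete adjoint: lam[n] = dR/dT[n], with lam[N] = 1 and the backward
# recursion lam[n] = (1 - k*dt) * lam[n+1]; lam[0] is the sought sensitivity.
lam = 1.0
for _ in range(N):
    lam = (1.0 - k * dt) * lam
adjoint_sens = lam  # dR/dT0 = (1 - k*dt)^N

# Cross-check with a finite difference (exact here, since the model is linear in T0).
eps = 1e-3
fd_sens = (forward(600.0 + eps) - forward(600.0)) / eps
print(adjoint_sens, fd_sens)
```

For a single scalar sensitivity the two routes cost the same; the adjoint's payoff, as in the abstract, is that one backward solve yields sensitivities to many initial conditions and parameters at once.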

  8. Safety of multidetector computed tomography pulmonary angiography to exclude pulmonary embolism in patients with a likely pretest clinical probability.

    PubMed

    Robert-Ebadi, H; Glauser, F; Planquette, B; Moumneh, T; Le Gal, G; Righini, M

    2017-08-01

    Essentials: Safety of computed tomography pulmonary angiography (CTPA) to exclude pulmonary embolism (PE) in all patients is debated. We analysed the outcome of PE-likely outpatients left untreated after negative CTPA alone. The 3-month venous thromboembolic risk in these patients was very low (0.6%; 95% CI 0.2-2.3). Multidetector CTPA alone safely excludes PE in patients with likely clinical probability. Background: In patients with suspected pulmonary embolism (PE) classified as having a likely or high pretest clinical probability, the need to perform additional testing after a negative multidetector computed tomography pulmonary angiography (CTPA) finding remains a matter of debate. Objectives: To assess the safety of excluding PE by CTPA without additional imaging in patients with a likely pretest probability of PE. Patients/Methods: We retrospectively analyzed patients included in two multicenter management outcome studies that assessed diagnostic algorithms for PE diagnosis. Results: Two thousand five hundred and twenty-two outpatients with suspected PE were available for analysis. Of these 2522 patients, 845 had a likely clinical probability as assessed by use of the simplified revised Geneva score. Of these, 314 had the diagnosis of PE excluded by a negative CTPA finding alone without additional testing, and were left without anticoagulant treatment and followed up for 3 months. Two patients presented with a venous thromboembolism (VTE) during follow-up. Therefore, the 3-month VTE risk after a negative CTPA finding alone in likely-probability patients was 2/314 (0.6%; 95% confidence interval [CI] 0.2-2.3%). Conclusions: In outpatients with suspected PE and a likely clinical probability as assessed by use of the simplified revised Geneva score, CTPA alone seems able to safely exclude PE, with a low 3-month VTE rate similar to that following the gold standard, i.e. pulmonary angiography.
© 2017 International Society on Thrombosis and Haemostasis.
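The reported risk and interval can be reproduced from the raw counts. The abstract does not state which interval method the authors used; a Wilson score interval, a standard choice for small event counts, matches the published 0.2-2.3% bounds:

```python
import math

# 3-month VTE risk: 2 events in 314 patients left untreated after negative CTPA.
def wilson_ci(events, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = events / n
    denom = 1.0 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return center - half, center + half

lo, hi = wilson_ci(2, 314)
point = 2 / 314
print(round(point * 100, 1), round(lo * 100, 1), round(hi * 100, 1))  # 0.6 0.2 2.3
```

The simple Wald interval would misbehave here (its lower bound goes negative with so few events), which is why score-type intervals are preferred at low event rates.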

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bates, T.R. Jr.; Tait, S.; Mumford, G.

    The authors discuss how rig safety can be improved through equipment, regulations, and stable personnel levels. With regard to equipment, exposure to material handling must be reduced through automation, and well-control technology must be improved through enhanced use of computers and better systems for handling gas. According to this analysis, regulations are needed that are global in scope and whose costs and benefits have been fully and fairly assessed. Self-regulation must be used effectively throughout the industry. Job security and wages should be made adequate to maintain an experienced, motivated, and safe work force.

  10. Development of a Software Safety Process and a Case Study of Its Use

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1996-01-01

    Research in the year covered by this reporting period has been primarily directed toward: continued development of mock-ups of computer screens for the operator of a digital reactor control system; development of a reactor simulation to permit testing of various elements of the control system; formal specification of user interfaces; fault-tree analysis including software; evaluation of formal verification techniques; and continued development of a software documentation system. Technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers.

  11. FDA toxicity databases and real-time data entry.

    PubMed

    Arvidson, Kirk B

    2008-11-15

    Structure-searchable electronic databases are valuable new tools that are assisting the FDA in its mission to promptly and efficiently review incoming submissions for regulatory approval of new food additives and food contact substances. The Center for Food Safety and Applied Nutrition's Office of Food Additive Safety (CFSAN/OFAS), in collaboration with Leadscope, Inc., is consolidating genetic toxicity data submitted in food additive petitions from the 1960s to the present day. The Center for Drug Evaluation and Research, Office of Pharmaceutical Science's Informatics and Computational Safety Analysis Staff (CDER/OPS/ICSAS) is separately gathering similar information from their submissions. Presently, these data are distributed in various locations such as paper files, microfiche, and non-standardized toxicology memoranda. The organization of the data into a consistent, searchable format will reduce paperwork, expedite the toxicology review process, and provide valuable information to industry that is currently available only to the FDA. Furthermore, by combining chemical structures with genetic toxicity information, biologically active moieties can be identified and used to develop quantitative structure-activity relationship (QSAR) modeling and testing guidelines. Additionally, chemicals devoid of toxicity data can be compared to known structures, allowing for improved safety review through the identification and analysis of structural analogs. Four database frameworks have been created: bacterial mutagenesis, in vitro chromosome aberration, in vitro mammalian mutagenesis, and in vivo micronucleus. Controlled vocabularies for these databases have been established. The four separate genetic toxicity databases are compiled into a single, structurally-searchable database for easy accessibility of the toxicity information. 
Beyond the genetic toxicity databases described here, additional databases for subchronic, chronic, and teratogenicity studies have been prepared.

  12. A Multi-Stakeholder Perspective on the Use of Alternative Test Strategies for Nanomaterial Safety Assessment

    PubMed Central

    Nel, Andre E.; Nasser, Elina; Godwin, Hilary; Avery, David; Bahadori, Tina; Bergeson, Lynn; Beryt, Elizabeth; Bonner, James C.; Boverhof, Darrell; Carter, Janet; Castranova, Vince; DeShazo, J. R.; Hussain, Saber M.; Kane, Agnes B.; Klaessig, Fred; Kuempel, Eileen; Lafranconi, Mark; Landsiedel, Robert; Malloy, Timothy; Miller, Mary Beth; Morris, Jeffery; Moss, Kenneth; Oberdorster, Gunter; Pinkerton, Kent; Pleus, Richard C.; Shatkin, Jo Anne; Thomas, Rusty; Tolaymat, Thabet; Wang, Amy; Wong, Jeffrey

    2014-01-01

    There has been a conceptual shift in toxicological studies from describing what happens to explaining how the adverse outcome occurs, thereby enabling a deeper and improved understanding of how biomolecular and mechanistic profiling can inform hazard identification and improve risk assessment. Compared to traditional toxicology methods, which have a heavy reliance on animals, new approaches to generate toxicological data are becoming available for the safety assessment of chemicals, including high-throughput and high-content screening (HTS, HCS). With the emergence of nanotechnology, the exponential increase in the total number of engineered nanomaterials (ENMs) in research, development, and commercialization requires a robust scientific approach to screen ENM safety in humans and the environment rapidly and efficiently. Spurred by the developments in chemical testing, a promising new toxicological paradigm for ENMs is to use alternative test strategies (ATS), which reduce reliance on animal testing through the use of in vitro and in silico methods such as HTS, HCS, and computational modeling. Furthermore, this allows for the comparative analysis of large numbers of ENMs simultaneously and for hazard assessment at various stages of the product development process and overall life cycle. Using carbon nanotubes as a case study, a workshop bringing together national and international leaders from government, industry, and academia was convened at the University of California, Los Angeles to discuss the utility of ATS for decision-making analyses of ENMs. After lively discussions, a short list of generally shared viewpoints on this topic was generated, including a general view that ATS approaches for ENMs can significantly benefit chemical safety analysis. PMID:23924032

  13. Application of failure mode and effects analysis (FMEA) to pretreatment phases in tomotherapy.

    PubMed

    Broggi, Sara; Cantone, Marie Claire; Chiara, Anna; Di Muzio, Nadia; Longobardi, Barbara; Mangili, Paola; Veronese, Ivan

    2013-09-06

    The aim of this paper was the application of the failure mode and effects analysis (FMEA) approach to assess the risks for patients undergoing radiotherapy treatments performed by means of a helical tomotherapy unit. FMEA was applied to the preplanning imaging, volume determination, and treatment planning stages of the tomotherapy process and consisted of three steps: 1) identification of the involved subprocesses; 2) identification and ranking of the potential failure modes, together with their causes and effects, using the risk probability number (RPN) scoring system; and 3) identification of additional safety measures to be proposed for process quality and safety improvement. RPN upper threshold for little concern of risk was set at 125. A total of 74 failure modes were identified: 38 in the stage of preplanning imaging and volume determination, and 36 in the stage of planning. The threshold of 125 for RPN was exceeded in four cases: one case only in the phase of preplanning imaging and volume determination, and three cases in the stage of planning. The most critical failures appeared related to (i) the wrong or missing definition and contouring of the overlapping regions, (ii) the wrong assignment of the overlap priority to each anatomical structure, (iii) the wrong choice of the computed tomography calibration curve for dose calculation, and (iv) the wrong (or not performed) choice of the number of fractions in the planning station. On the basis of these findings, in addition to the safety strategies already adopted in the clinical practice, novel solutions have been proposed for mitigating the risk of these failures and to increase patient safety.
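The RPN scoring described above multiplies three ordinal scores (commonly occurrence, severity, and detectability, each on a 1-10 scale) and flags failure modes above the paper's threshold of 125. A minimal sketch, with hypothetical scores loosely modeled on the failure modes the study identified:

```python
# FMEA risk ranking sketch. Scores below are invented for illustration,
# not the values assigned in the tomotherapy study.
RPN_THRESHOLD = 125  # the paper's upper threshold for "little concern"

failure_modes = [
    # (description, occurrence, severity, detectability), each scored 1-10
    ("missing contouring of overlap regions", 4, 8, 5),
    ("wrong CT calibration curve selected",   3, 9, 5),
    ("wrong number of fractions entered",     2, 7, 4),
]

def rpn(occurrence, severity, detectability):
    """Risk priority number: the product of the three ordinal scores."""
    return occurrence * severity * detectability

critical = [(name, rpn(o, s, d)) for name, o, s, d in failure_modes
            if rpn(o, s, d) > RPN_THRESHOLD]
for name, score in critical:
    print(f"RPN {score} > {RPN_THRESHOLD}: {name}")
```

Only the first two hypothetical modes exceed the threshold and would trigger additional safety measures, mirroring how the study singled out four of its 74 modes.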

  14. Nuclear criticality safety: 5-day training course

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlesser, J.A.

    1992-11-01

    This compilation of notes is presented as a source reference for the criticality safety course. It represents the contributions of many people, particularly Tom McLaughlin, the course's primary instructor. At the completion of this training course, the attendee will: be able to define terms commonly used in nuclear criticality safety; be able to appreciate the fundamentals of nuclear criticality safety; be able to identify factors which affect nuclear criticality safety; be able to identify examples of criticality controls as used at Los Alamos; be able to identify examples of circumstances present during criticality accidents; be able to identify examples of computer codes used by the nuclear criticality safety specialist; and be able to identify examples of the safety consciousness required in nuclear criticality safety.

  15. Computer program for afterheat temperature distribution for mobile nuclear power plant

    NASA Technical Reports Server (NTRS)

    Parker, W. G.; Vanbibber, L. E.

    1972-01-01

    ESATA computer program was developed to analyze thermal safety aspects of post-impacted mobile nuclear power plants. Program is written in FORTRAN 4 and designed for IBM 7094/7044 direct coupled system.

  16. Selective cultivation and rapid detection of Staphylococcus aureus by computer vision.

    PubMed

    Wang, Yong; Yin, Yongguang; Zhang, Chaonan

    2014-03-01

    In this paper, we developed a selective growth medium and a more rapid detection method based on computer vision for the selective isolation and identification of Staphylococcus aureus from foods. The selective medium consisted of tryptic soy broth basal medium, 3 inhibitors (NaCl, K2TeO3, and phenethyl alcohol), and 2 accelerators (sodium pyruvate and glycine). After 4 h of selective cultivation, bacterial detection was accomplished using computer vision, for a total analysis time of 5 h. Compared to the Baird-Parker plate count method, which requires 4 to 5 days, this new detection method offers great time savings. Moreover, our novel method had a correlation coefficient greater than 0.998 when compared with the Baird-Parker plate count method. The detection range for S. aureus was 10 to 10^7 CFU/mL. Our new, rapid detection method for microorganisms in foods has great potential for routine food safety control and microbiological detection applications. © 2014 Institute of Food Technologists®
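The identification step in a vision pipeline like this typically reduces to thresholding the image and counting connected blobs. A minimal sketch with a toy intensity grid; the threshold and data are invented for illustration, not taken from the study:

```python
# Toy grayscale "image" of a culture plate (small integers = background,
# large integers = colony pixels). Purely illustrative data.
image = [
    [0, 9, 9, 0, 0, 0, 7],
    [0, 9, 0, 0, 0, 0, 7],
    [0, 0, 0, 8, 8, 0, 0],
    [0, 0, 0, 8, 8, 0, 0],
]
THRESHOLD = 5  # intensities above this are treated as colony pixels

binary = [[1 if px > THRESHOLD else 0 for px in row] for row in image]

def count_colonies(grid):
    """Count 4-connected components in a binary grid via flood fill."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]
                while stack:  # flood-fill one component
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and grid[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return count

print(count_colonies(binary))  # 3 separate colonies
```

Colony counts obtained this way can then be regressed against plate counts, which is how a correlation coefficient against the Baird-Parker method would be established.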

  17. EFFECT OF A ROAD SAFETY EDUCATION INTERVENTION ON ROAD SAFETY KNOWLEDGE OF UNIVERSITY DRIVERS IN IBADAN, NIGERIA.

    PubMed

    Olumide, A O; Owoaje, E T

    2016-06-01

    It is essential for drivers employed in the formal sector to have good knowledge of road safety in order to safeguard their lives and those of the staff they are employed to drive. The study was conducted to determine the effect of a road safety education intervention on the road safety knowledge of drivers employed by the University of Ibadan, Nigeria. A quasi-experimental study of 98 intervention and 78 control drivers selected using a cluster sampling technique was conducted. The intervention comprised a two-day training on road safety and first aid. The drivers' knowledge of road safety was measured at baseline, immediately post-intervention, and 4 months post-intervention. Aggregate road safety knowledge scores were computed, giving minimum and maximum obtainable scores of 0 and 16, respectively. Change in mean scores over the three measurement periods was assessed using repeated measures analysis of variance (ANOVA). An independent t-test was used to compare the scores between intervention and control drivers at each assessment period. Twenty-nine drivers did not complete the study (attrition rate = 16.5%). At baseline, mean road safety knowledge scores for the intervention and control drivers were 12.7±2.2 and 12.9±2.3 (p = 0.510), respectively. Immediately and four months post-intervention, the scores of the intervention drivers were 13.8±1.9 and 12.8±1.6, while scores for the controls were 13.3±2.0 and 13.2±1.8. Repeated measures ANOVA revealed that the increase in knowledge over the three assessment periods was not statistically significant. The intervention resulted in an initial increase in the road safety knowledge of the intervention drivers; however, this was not sustained through the fourth month post-intervention. This finding suggests that periodic refresher training is needed to sustain the knowledge acquired.

  18. 3D Simulation of External Flooding Events for the RISMC Pathway

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad

    2015-09-01

    Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have recently become more important; they can be analyzed with existing, validated simulated-physics toolkits. In this report, we describe flooding-based analysis using an approach called smoothed particle hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis adds a spatial and visual aspect to the design, improves the realism of results, and provides visual understanding that helps validate the flooding analysis.
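The core of smoothed particle hydrodynamics is estimating field quantities as kernel-weighted sums over particles. A minimal 1D density estimate with the standard cubic-spline kernel; the particle setup below is a toy example, not a configuration from the report:

```python
import math

def cubic_spline_1d(r, h):
    """Standard SPH cubic-spline kernel in 1D (support radius 2h)."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1D normalization so the kernel integrates to 1
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q * q + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

# Uniformly spaced particles of a fluid with reference density rho0 = 1.
dx = 0.1
h = 1.2 * dx                       # smoothing length
xs = [i * dx for i in range(101)]  # particle positions on [0, 10]
m = 1.0 * dx                       # particle mass = rho0 * dx

# SPH density estimate: rho_i = sum_j m_j * W(x_i - x_j, h)
def density(i):
    return sum(m * cubic_spline_1d(xs[i] - xj, h) for xj in xs)

rho_mid = density(50)  # interior particle: should recover rho0 ~= 1
print(rho_mid)
```

An interior particle recovers the reference density to well under 1%, which is the basic consistency check before the method is trusted for free-surface flooding flows.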

  19. A web-based information system for management and analysis of patient data after refractive eye surgery.

    PubMed

    Zuberbuhler, Bruno; Galloway, Peter; Reddy, Aravind; Saldana, Manuel; Gale, Richard

    2007-12-01

    The aim was to develop a software tool for refractive surgeons using a standard, user-friendly web-based interface, providing the user with a secure environment to protect large volumes of patient data. The software application, named "Internet-based refractive analysis" (IBRA), was programmed in PHP, HTML, and JavaScript on top of the open-source MySQL database. IBRA facilitated internationally accepted presentation methods including the stability chart, the predictability chart, and the safety chart, and it was able to perform vector analysis for the course of a single patient or for group data. With the integrated nomogram calculation, treatment could be customised to reduce the postoperative refractive error. Multicenter functions permitted quality-control comparisons between different surgeons and laser units.

  20. ``Carbon Credits'' for Resource-Bounded Computations Using Amortised Analysis

    NASA Astrophysics Data System (ADS)

    Jost, Steffen; Loidl, Hans-Wolfgang; Hammond, Kevin; Scaife, Norman; Hofmann, Martin

    Bounding resource usage is important for a number of areas, notably real-time embedded systems and safety-critical systems. In this paper, we present a fully automatic static type-based analysis for inferring upper bounds on resource usage for programs involving general algebraic datatypes and full recursion. Our method can easily be used to bound any countable resource, without needing to revisit proofs. We apply the analysis to the important metrics of worst-case execution time, stack- and heap-space usage. Our results from several realistic embedded control applications demonstrate good matches between our inferred bounds and measured worst-case costs for heap and stack usage. For time usage we infer good bounds for one application. Where we obtain less tight bounds, this is due to the use of software floating-point libraries.
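The amortised-analysis idea underlying the paper's credit metaphor can be sketched with the textbook doubling-array example: charge each operation a fixed number of credits and verify that the bank of unspent credits never goes negative, even across occasional expensive resize steps. The constant below is the classic accounting-method value, not a bound inferred by the paper's type system:

```python
# Accounting-method amortised analysis of append with array doubling.
# Each append is charged 3 credits: 1 for the write itself, 2 banked
# toward copying this element (and one older element) at the next resize.
CREDIT_PER_APPEND = 3

capacity, size, bank = 1, 0, 0
min_bank = 0
for _ in range(1000):
    bank += CREDIT_PER_APPEND
    if size == capacity:   # resize: pay 1 credit per element copied
        bank -= capacity
        capacity *= 2
    bank -= 1              # pay for the write itself
    size += 1
    min_bank = min(min_bank, bank)

print(size, capacity, min_bank)  # the bank never goes negative
```

Because the bank stays non-negative for every prefix of operations, 3 credits per append is a sound worst-case amortised bound; a type-based analysis like the paper's infers such constants statically instead of checking them by simulation.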

  1. THE EFFECTS OF COMPUTER-BASED FIRE SAFETY TRAINING ON THE KNOWLEDGE, ATTITUDES, AND PRACTICES OF CAREGIVERS

    PubMed Central

    Harrington, Susan S.; Walker, Bonnie L.

    2010-01-01

    Background Older adults in small residential board and care facilities are at a particularly high risk of fire death and injury because of their characteristics and environment. Methods The authors investigated computer-based instruction as a way to teach fire emergency planning to owners, operators, and staff of small residential board and care facilities. Participants (N = 59) were randomly assigned to a treatment or control group. Results Study participants who completed the training significantly improved their scores from pre- to posttest when compared to a control group. Participants indicated on the course evaluation that the computers were easy to use for training (97%) and that they would like to use computers for future training courses (97%). Conclusions This study demonstrates the potential for using interactive computer-based training as a viable alternative to instructor-led training to meet the fire safety training needs of owners, operators, and staff of small board and care facilities for the elderly. PMID:19263929

  2. 16 CFR 1115.14 - Time computations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT REGULATIONS SUBSTANTIAL... spend a reasonable time for investigation and evaluation. (See § 1115.14(d).) (d) Time for investigation and evaluation. A subject firm may conduct a reasonably expeditious investigation in order to evaluate...

  3. Navigating through the domains of biology and chemistry

    EPA Science Inventory

    Developing computational toxicology methods to assist the risk assessment process has recently gained much attention both in regulatory agencies and industries. The FDA Center for Food Safety and Applied Nutrition’s Office of Food Additive Safety (CFSAN OFAS) uses (Q)SAR approach...

  4. The Microphysiology Systems Database for Analyzing and Modeling Compound Interactions with Human and Animal Organ Models

    PubMed Central

    Vernetti, Lawrence; Bergenthal, Luke; Shun, Tong Ying; Taylor, D. Lansing

    2016-01-01

    Microfluidic human organ models, microphysiology systems (MPS), are currently being developed as predictive models of drug safety and efficacy in humans. To design and validate MPS as predictive of human safety liabilities requires safety data for a reference set of compounds, combined with in vitro data from the human organ models. To address this need, we have developed an internet database, the MPS database (MPS-Db), as a powerful platform for experimental design, data management, and analysis, and to combine experimental data with reference data, to enable computational modeling. The present study demonstrates the capability of the MPS-Db in early safety testing using a human liver MPS to relate the effects of tolcapone and entacapone in the in vitro model to human in vivo effects. These two compounds were chosen to be evaluated as a representative pair of marketed drugs because they are structurally similar, have the same target, and were found safe or had an acceptable risk in preclinical and clinical trials, yet tolcapone induced unacceptable levels of hepatotoxicity while entacapone was found to be safe. Results demonstrate the utility of the MPS-Db as an essential resource for relating in vitro organ model data to the multiple biochemical, preclinical, and clinical data sources on in vivo drug effects. PMID:28781990

  5. Identifying black swans in NextGen: predicting human performance in off-nominal conditions.

    PubMed

    Wickens, Christopher D; Hooey, Becky L; Gore, Brian F; Sebok, Angelia; Koenicke, Corey S

    2009-10-01

    The objective is to validate a computational model of visual attention against empirical data--derived from a meta-analysis--of pilots' failure to notice safety-critical unexpected events. Many aircraft accidents have resulted, in part, because of failure to notice nonsalient unexpected events outside of foveal vision, illustrating the phenomenon of change blindness. A model of visual noticing, N-SEEV (noticing-salience, expectancy, effort, and value), was developed to predict these failures. First, 25 studies that reported objective data on miss rate for unexpected events in high-fidelity cockpit simulations were identified, and their miss rate data pooled across five variables (phase of flight, event expectancy, event location, presence of a head-up display, and presence of a highway-in-the-sky display). Second, the parameters of the N-SEEV model were tailored to mimic these dichotomies. The N-SEEV model output predicted variance in the obtained miss rate (r = .73). The individual miss rates of all six dichotomous conditions were predicted within 14%, and four of these were predicted within 7%. The N-SEEV model, developed on the basis of an independent data set, was able to successfully predict variance in this safety-critical measure of pilot response to abnormal circumstances, as collected from the literature. As new technology and procedures are envisioned for the future airspace, it is important to predict if these may compromise safety in terms of pilots' failing to notice unexpected events. Computational models such as N-SEEV support cost-effective means of making such predictions.
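The published N-SEEV parameterization is not given in this abstract, but the general shape of such a model (noticing probability rising with salience, expectancy, and value, and falling with the effort of re-fixating) can be sketched with an invented logistic scoring function. The weights, bias, and inputs below are purely illustrative assumptions, not the model's fitted parameters:

```python
import math

# Hypothetical attention-capture score in the spirit of N-SEEV
# (salience, expectancy, effort, value). All constants are our assumptions.
def p_notice(salience, expectancy, effort, value,
             w=(1.5, 1.0, 1.2, 0.8), bias=-1.0):
    ws, we, wf, wv = w
    # effort enters negatively: re-fixating a distant location costs attention
    score = bias + ws * salience + we * expectancy - wf * effort + wv * value
    return 1.0 / (1.0 + math.exp(-score))  # logistic link to a probability

# A salient, expected event near fixation vs. a non-salient, unexpected,
# peripheral one -- the "change blindness" case discussed in the abstract.
p_easy = p_notice(salience=0.9, expectancy=0.8, effort=0.1, value=0.7)
p_hard = p_notice(salience=0.1, expectancy=0.1, effort=0.9, value=0.7)
print(round(p_easy, 2), round(p_hard, 2))
```

Even this crude sketch reproduces the qualitative prediction the study validated: miss rates are far higher for non-salient, unexpected, peripheral events.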

  6. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

    Safety limits are required to maintain the integrity of the physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached at which protective action must begin. In keeping with nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, whether an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been performed to ensure that the analytical limit requirement is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both design basis events and beyond design basis events. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the analytical limit requirement alone. Despite the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are scarce. For APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a plant-specific procedure. The test technique has the drawback that completeness of the timing test is difficult to demonstrate.
The analysis technique has the demerit of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter, demonstrating that the total analyzed response time does not exceed the requirement. The proposed response time test technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test, demonstrating that the total test result does not exceed the requirement. The total response time should be verified in a single test covering the instrument channel from the sensor to the final actuation device. When the total channel is not tested in a single test, separate tests on groups of components or on single components spanning the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and the final actuation device.
As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the proposed methodology plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
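The analysis step described, allocating a response time to each component on the critical signal path and comparing the sum against the analytical requirement, can be sketched directly. The component names and millisecond budgets below are hypothetical placeholders, not APR1400/OPR1000 design values:

```python
# Response time budget check along one instrument channel's critical path.
# All names and times are illustrative placeholders.
REQUIREMENT_MS = 1000.0  # assumed analytical response time from the safety analysis

signal_path = [
    ("pressure transmitter",   350.0),
    ("signal conditioning",     50.0),
    ("bistable trip logic",    100.0),
    ("coincidence logic",      100.0),
    ("final actuation device", 300.0),
]

# Total analyzed response time is the sum of the allocated component times.
total_ms = sum(t for _, t in signal_path)
margin_ms = REQUIREMENT_MS - total_ms
assert total_ms <= REQUIREMENT_MS, "allocated times exceed the requirement"
print(f"total {total_ms} ms, margin {margin_ms} ms")
```

In the combined methodology, each allocation would then be confirmed by a per-component or whole-channel test (ramp tests for transmitters, step tests for the logic and actuation equipment), closing the loop between analysis and measurement.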

  7. 14 CFR 417.221 - Time delay analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.221 Time delay analysis. (a) General. A flight safety analysis must include a time delay analysis that establishes the mean elapsed time between the violation of a flight termination rule and the time when the flight safety system is...

  8. 14 CFR 417.221 - Time delay analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.221 Time delay analysis. (a) General. A flight safety analysis must include a time delay analysis that establishes the mean elapsed time between the violation of a flight termination rule and the time when the flight safety system is...

  9. 10 CFR 1703.112 - Computation of time.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Computation of time. 1703.112 Section 1703.112 Energy DEFENSE NUCLEAR FACILITIES SAFETY BOARD PUBLIC INFORMATION AND REQUESTS § 1703.112 Computation of time. In... until the end of the next working day. Whenever a person has the right or is required to take some...

  10. 10 CFR 1703.112 - Computation of time.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Computation of time. 1703.112 Section 1703.112 Energy DEFENSE NUCLEAR FACILITIES SAFETY BOARD PUBLIC INFORMATION AND REQUESTS § 1703.112 Computation of time. In... until the end of the next working day. Whenever a person has the right or is required to take some...

  11. 10 CFR 1703.112 - Computation of time.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Computation of time. 1703.112 Section 1703.112 Energy DEFENSE NUCLEAR FACILITIES SAFETY BOARD PUBLIC INFORMATION AND REQUESTS § 1703.112 Computation of time. In... until the end of the next working day. Whenever a person has the right or is required to take some...

  12. 10 CFR 1703.112 - Computation of time.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Computation of time. 1703.112 Section 1703.112 Energy DEFENSE NUCLEAR FACILITIES SAFETY BOARD PUBLIC INFORMATION AND REQUESTS § 1703.112 Computation of time. In... until the end of the next working day. Whenever a person has the right or is required to take some...

  13. 10 CFR 1703.112 - Computation of time.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Computation of time. 1703.112 Section 1703.112 Energy DEFENSE NUCLEAR FACILITIES SAFETY BOARD PUBLIC INFORMATION AND REQUESTS § 1703.112 Computation of time. In... until the end of the next working day. Whenever a person has the right or is required to take some...

  14. Are Handheld Computers Dependable? A New Data Collection System for Classroom-Based Observations

    ERIC Educational Resources Information Center

    Adiguzel, Tufan; Vannest, Kimberly J.; Parker, Richard I.

    2009-01-01

    Very little research exists on the dependability of handheld computers used in public school classrooms. This study addresses four dependability criteria--reliability, maintainability, availability, and safety--to evaluate a data collection tool on a handheld computer. Data were collected from five sources: (1) time-use estimations by 19 special…

  15. Results of comparative RBMK neutron computation using VNIIEF codes (cell computation, 3D statics, 3D kinetics). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A.

    1995-12-31

In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western Codes", VNIIEF performed a series of neutronics computations to compare Western and VNIIEF codes and to assess whether VNIIEF codes are suitable for safety assessment computations for RBMK-type reactors. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), covering cell, polycell, and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results obtained with the NESTLE code (USA); these computations were performed in the geometry and with the neutron constants provided by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. The kinetics computations were performed in two formulations, both developed in collaboration with NIKIET. The first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole, with imitation of control and protection system (CPS) control movement in the core.

  16. RDS - A systematic approach towards system thermal hydraulics input code development for a comprehensive deterministic safety analysis

    NASA Astrophysics Data System (ADS)

    Salim, Mohd Faiz; Roslan, Ridha; Ibrahim, Mohd Rizal Mamat @

    2014-02-01

Deterministic Safety Analysis (DSA) is one of the mandatory requirements of the nuclear power plant licensing process, with the aim of ensuring safety compliance with the relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements is applied to the design and operation of facilities or activities. Computer codes are normally used to assist in performing all analyses required under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED) developed by the University of Pisa. Based on this methodology, the use of a Reference Data Set (RDS) as a prerequisite reference document for developing input nodalization was proposed. This paper describes the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed for the LOBI-MOD2 Integral Test Facility and the associated Test A1-83. Data and information from various reports and drawings were referred to in preparing the RDS. The results showed that developing an RDS made it possible to consolidate all relevant information in a single document. This is beneficial as it enables preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes the resolution of contradictions, and finally assists in developing thermal hydraulic input regardless of which code is selected. However, some disadvantages were also recognized, such as the need for experience in making engineering judgments, the language barrier in accessing foreign information, and limited resources. Some possible improvements are suggested to overcome these challenges.

  17. Transportation systems safety hazard analysis tool (SafetyHAT) user guide (version 1.0)

    DOT National Transportation Integrated Search

    2014-03-24

This is a user guide for the transportation system Safety Hazard Analysis Tool (SafetyHAT) Version 1.0. SafetyHAT is a software tool that facilitates System Theoretic Process Analysis (STPA). This user guide provides instructions on how to download, ...

  18. 10 CFR 830.206 - Preliminary documented safety analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Preliminary documented safety analysis. 830.206 Section 830.206 Energy DEPARTMENT OF ENERGY NUCLEAR SAFETY MANAGEMENT Safety Basis Requirements § 830.206 Preliminary documented safety analysis. If construction begins after December 11, 2000, the contractor...

  19. [The use of fenspiride for the combined treatment of exacerbation of chronic laryngitis].

    PubMed

    Ryabova, M A

The present study was carried out at the Department of Otorhinolaryngology of I.P. Pavlov First State Medical University of Saint Petersburg. The objective of this work was to elucidate the efficacy and safety of fenspiride therapy for the treatment of exacerbations of chronic laryngitis associated with an acute respiratory infection. The patients in the main group received fenspiride (Eurespal, 'Servier', France) at the standard dose in addition to conventional therapy with antibiotics, inhalation, and voice rest. The patients in the comparison group were treated following the conventional protocol without fenspiride. The clinical symptoms evaluated by a scoring system, the results of videolaryngoscopy, and computer-assisted voice analysis were compared before and after treatment in both groups. The results of the study confirmed the high effectiveness and safety of fenspiride therapy for exacerbations of chronic laryngitis.

  20. Air Traffic Control: Weak Computer Security Practices Jeopardize Flight Safety

    DOT National Transportation Integrated Search

    1998-05-01

Given the paramount importance of computer security of Air Traffic Control (ATC) systems, Congress asked the General Accounting Office to determine (1) whether the Federal Aviation Administration (FAA) is effectively managing physical security at ATC...

  1. RELAP-7 Development Updates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hongbin; Zhao, Haihua; Gleicher, Frederick Nathan

RELAP-7 is a nuclear systems safety analysis code being developed at the Idaho National Laboratory, and is the next generation tool in the RELAP reactor safety/systems analysis application series. RELAP-7 development began in 2011 to support the Risk Informed Safety Margins Characterization (RISMC) Pathway of the Light Water Reactor Sustainability (LWRS) program. The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical methods, and physical models in order to provide capabilities needed for the RISMC methodology and to support nuclear power safety analysis. The code is being developed based on Idaho National Laboratory's modern scientific software development framework, MOOSE (the Multi-Physics Object-Oriented Simulation Environment). The initial development goal of the RELAP-7 approach focused primarily on an implicit algorithm capable of strong (nonlinear) coupling of the dependent hydrodynamic variables contained in the 1-D/2-D flow models with the various 0-D system reactor components that compose boiling water reactor (BWR) and pressurized water reactor nuclear power plants (NPPs). During Fiscal Year (FY) 2015, the RELAP-7 code was further improved with expanded capability to support BWR and pressurized water reactor NPP analysis. The accumulator model was developed. The code was also coupled with other MOOSE-based applications, such as the neutronics code RattleSnake and the fuel performance code BISON, to perform multiphysics analysis. A major design requirement for the implicit algorithm in RELAP-7 is that it be capable of second-order discretization accuracy in both space and time, which eliminates the traditional first-order approximation errors.
Second-order temporal accuracy is achieved by a second-order backward temporal difference, and second-order accurate one-dimensional spatial discretization is achieved with a Galerkin approximation on Lagrange finite elements. During FY-2015, numerical verification work confirmed that the RELAP-7 code indeed achieves second-order accuracy in both time and space for single-phase models at the system level.
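The second-order temporal claim can be checked on a scalar test problem. The sketch below is not RELAP-7 code; it is an illustrative BDF2 (second-order backward difference) solver for y' = -y, with the two-level history bootstrapped from the exact solution so that the observed error is purely that of the scheme, and the convergence order estimated from errors at two step sizes:

```python
import math

def bdf2_solve(lam, y0, t_end, n):
    """Integrate y' = -lam*y to t_end in n steps with BDF2.

    The first step uses the exact solution to bootstrap the two-level
    history (acceptable here because this is a verification exercise).
    """
    h = t_end / n
    y_prev = y0
    y_curr = y0 * math.exp(-lam * h)  # exact first step
    for _ in range(n - 1):
        # BDF2: (3*y_new - 4*y_curr + y_prev) / (2h) = -lam * y_new
        y_new = (4.0 * y_curr - y_prev) / (3.0 + 2.0 * h * lam)
        y_prev, y_curr = y_curr, y_new
    return y_curr

exact = math.exp(-1.0)
err_h = abs(bdf2_solve(1.0, 1.0, 1.0, 100) - exact)
err_h2 = abs(bdf2_solve(1.0, 1.0, 1.0, 200) - exact)
observed_order = math.log(err_h / err_h2) / math.log(2.0)  # close to 2 for BDF2
```

Halving the step size cuts the error by roughly a factor of four, i.e. the observed order is close to 2, which is the kind of check the FY-2015 verification work performed at the system level.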

  2. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    PubMed

    Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong

    2014-01-01

Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computational models to obtain general conclusions that can provide useful guidance for constructing more effective computational models to predict ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods for predicting ADRs, by implementing and evaluating additional algorithms that have previously been used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, the final formulas of these algorithms could all be converted to linear models in form; based on this finding, we propose a new algorithm, called the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.
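As a concrete illustration of the Jaccard coefficient's role in drug-ADR association, the sketch below (drug names and ADR profiles are hypothetical, not data from the paper) scores one drug's profile against the others; a weighted-profile-style predictor would then transfer ADRs from the most similar neighbours:

```python
def jaccard(a, b):
    """Jaccard coefficient between two sets: |A & B| / |A | B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

# Hypothetical ADR profiles: set of observed adverse reactions per drug
profiles = {
    "drug_A": {"nausea", "headache", "rash"},
    "drug_B": {"nausea", "rash", "dizziness"},
    "drug_C": {"insomnia"},
}

# Similarity of drug_A to every other drug's ADR profile
sims = {d: jaccard(profiles["drug_A"], p)
        for d, p in profiles.items() if d != "drug_A"}
```

Here drug_A and drug_B share 2 of 4 distinct ADRs, giving a similarity of 0.5, while drug_C shares none.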

  3. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.
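The core idea of sharing one model between system and safety engineering can be sketched in miniature: a nominal component model is extended with an explicit fault mode, and the same model is exercised both fault-free and with the fault injected. All names below are illustrative, not taken from the report:

```python
class Sensor:
    """Nominal sensor model, extended with a 'stuck-at' fault mode."""

    def __init__(self, stuck_at=None):
        self.stuck_at = stuck_at  # fault mode: None means nominal behavior

    def read(self, true_value):
        # With the fault injected, the sensor reports the stuck value
        return self.stuck_at if self.stuck_at is not None else true_value

def controller(reading, limit=100.0):
    """Shut down when the sensed value exceeds the safety limit."""
    return "shutdown" if reading > limit else "run"

# Same model, two analyses: nominal run vs. fault-injected run
nominal = controller(Sensor().read(120.0))                 # overload detected
faulty = controller(Sensor(stuck_at=50.0).read(120.0))     # stuck sensor masks it
```

The fault-injected run shows the hazard (the controller keeps running through an overload), which is exactly the kind of behavior automated safety analysis would flag from the shared model.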

  4. Needs analysis of a flexible computerized management infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Usman, S.; Hajek, B. K.; Ali, S. F.

    2006-07-01

The United States' Energy Policy Act of 2005 is expected to facilitate construction of new commercial nuclear power plants. Meanwhile, current plants are in the process of obtaining licenses for extended operation beyond their predetermined design life. In this beneficial yet challenging situation, it seems desirable to develop a strategic plan for a smooth and seamless transition from paper-based procedure systems to computer-based procedure systems for improved performance and safety of the existing nuclear power plants. Many utilities already maintain procedures using word processing software, but it is common to print paper copies for daily use. At this time it is highly desirable to better understand the collective as well as individual document management needs of a commercial nuclear power plant as it migrates to a computer-based system. As a contributory step toward initiating a strategic plan, this paper offers a comprehensive questionnaire suitable for conducting a survey to determine the related needs of the utilities. The questionnaire covers three major areas: Formatting and User Friendly Features; Technical and Environmental Considerations; and Safety, System Integrity and Regulatory Considerations. A plan to conduct the proposed survey is also outlined in the future work section of this paper. (authors)

  5. FRAMEWORK FOR EVALUATION OF PHYSIOLOGICALLY-BASED PHARMACOKINETIC MODELS FOR USE IN SAFETY OR RISK ASSESSMENT

    EPA Science Inventory

    ABSTRACT

    Proposed applications of increasingly sophisticated biologically-based computational models, such as physiologically-based pharmacokinetic (PBPK) models, raise the issue of how to evaluate whether the models are adequate for proposed uses including safety or risk ...

  6. Coast Guard : update on Marine Information for Safety and Law Enforcement System

    DOT National Transportation Integrated Search

    2001-10-01

    The Coast Guard is developing a web-based information system to replace an aging computer system that it uses to track safety and law-enforcement actions involving commercial and recreational vessels. In 1995 the Coast Guard awarded a contract to dev...

  7. Efficacy of Web-Based Instruction to Provide Training on Federal Motor Carrier Safety Regulations

    DOT National Transportation Integrated Search

    2011-05-01

    This report presents an evaluation of the current state-of-the-art Web-based instruction (WBI), reviews the current computer platforms of potential users of WBI, reviews the current status of WBI applications for Federal Motor Carrier Safety Administ...

  8. Evaluating chemical safety: ToxCast, Tipping Points and Virtual Tissues (Tamburro Symposium)

    EPA Science Inventory

    This presentation provides an overview of high-throughput toxicology at the NCCT using high-content imaging and computational models for analyzing chemical safety. In In particular, this work outlines the derivation of toxicological "tipping points" from in vitro concentration- a...

  9. Making the Hubble Space Telescope servicing mission safe

    NASA Technical Reports Server (NTRS)

    Bahr, N. J.; Depalo, S. V.

    1992-01-01

    The implementation of the HST system safety program is detailed. Numerous safety analyses are conducted through various phases of design, test, and fabrication, and results are presented to NASA management for discussion during dedicated safety reviews. Attention is given to the system safety assessment and risk analysis methodologies used, i.e., hazard analysis, fault tree analysis, and failure modes and effects analysis, and to how they are coupled with engineering and test analysis for a 'synergistic picture' of the system. Some preliminary safety analysis results, showing the relationship between hazard identification, control or abatement, and finally control verification, are presented as examples of this safety process.
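Of the methodologies named above, fault tree analysis is the most directly computable: assuming independent basic events, an AND gate multiplies input probabilities, while an OR gate combines them as one minus the product of the complements. A minimal sketch follows; the event probabilities and tree structure are illustrative, not HST data:

```python
from functools import reduce

def and_gate(probs):
    """Output event occurs only if all independent inputs occur."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    """Output event occurs if any independent input occurs: 1 - prod(1 - p)."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Illustrative tree: hazard if (sensor fails OR wiring fault) AND backup fails
p_top = and_gate([or_gate([1e-3, 5e-4]), 1e-2])
```

With these numbers the top-event probability is about 1.5e-5; in a real analysis the basic-event probabilities would come from component reliability data, and minimal cut sets would be checked for shared causes before assuming independence.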

  10. Data Discovery with IBM Watson

    NASA Astrophysics Data System (ADS)

    Fessler, J.

    2016-12-01

IBM Watson is a cognitive computing system that uses machine learning, statistical analysis, and natural language processing to find and understand the clues in questions posed to it. Watson was made famous when it bested two champions on TV's Jeopardy! show. Since then, Watson has evolved into a platform of cognitive services that can be trained on very granular fields of study. Watson is being used to support a number of subject domains, such as cancer research, public safety, engineering, and the intelligence community. IBM will be providing a presentation and demonstration of the Watson technology and will discuss its capabilities, including natural language processing, text analytics and enterprise search, as well as cognitive computing with deep Q&A. The team will also give examples of how IBM Watson technology is being used to support real-world problems across a number of public sector agencies.

  11. NASA/Army Rotorcraft Transmission Research, a Review of Recent Significant Accomplishments

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    1994-01-01

    A joint helicopter transmission research program between NASA Lewis Research Center and the U.S. Army Research Lab has existed since 1970. Research goals are to reduce weight and noise while increasing life, reliability, and safety. These research goals are achieved by the NASA/Army Mechanical Systems Technology Branch through both in-house research and cooperative research projects with university and industry partners. Some recent significant technical accomplishments produced by this cooperative research are reviewed. The following research projects are reviewed: oil-off survivability of tapered roller bearings, design and evaluation of high contact ratio gearing, finite element analysis of spiral bevel gears, computer numerical control grinding of spiral bevel gears, gear dynamics code validation, computer program for life and reliability of helicopter transmissions, planetary gear train efficiency study, and the Advanced Rotorcraft Transmission (ART) program.

  12. Cyclic structural analyses of anisotropic turbine blades for reusable space propulsion systems. [ssme fuel turbopump

    NASA Technical Reports Server (NTRS)

    Manderscheid, J. M.; Kaufman, A.

    1985-01-01

    Turbine blades for reusable space propulsion systems are subject to severe thermomechanical loading cycles that result in large inelastic strains and very short lives. These components require the use of anisotropic high-temperature alloys to meet the safety and durability requirements of such systems. To assess the effects on blade life of material anisotropy, cyclic structural analyses are being performed for the first stage high-pressure fuel turbopump blade of the space shuttle main engine. The blade alloy is directionally solidified MAR-M 246 alloy. The analyses are based on a typical test stand engine cycle. Stress-strain histories at the airfoil critical location are computed using the MARC nonlinear finite-element computer code. The MARC solutions are compared to cyclic response predictions from a simplified structural analysis procedure developed at the NASA Lewis Research Center.

  13. Graphics enhanced computer emulation for improved timing-race and fault tolerance control system analysis. [of Centaur liquid-fuel booster

    NASA Technical Reports Server (NTRS)

    Szatkowski, G. P.

    1983-01-01

    A computer simulation system has been developed for the Space Shuttle's advanced Centaur liquid fuel booster rocket, in order to conduct systems safety verification and flight operations training. This simulation utility is designed to analyze functional system behavior by integrating control avionics with mechanical and fluid elements, and is able to emulate any system operation, from simple relay logic to complex VLSI components, with wire-by-wire detail. A novel graphics data entry system offers a pseudo-wire wrap data base that can be easily updated. Visual subsystem operations can be selected and displayed in color on a six-monitor graphics processor. System timing and fault verification analyses are conducted by injecting component fault modes and min/max timing delays, and then observing system operation through a red line monitor.

  14. Technical note: Design flood under hydrological uncertainty

    NASA Astrophysics Data System (ADS)

    Botto, Anna; Ganora, Daniele; Claps, Pierluigi; Laio, Francesco

    2017-07-01

    Planning and verification of hydraulic infrastructures require a design estimate of hydrologic variables, usually provided by frequency analysis, and neglecting hydrologic uncertainty. However, when hydrologic uncertainty is accounted for, the design flood value for a specific return period is no longer a unique value, but is represented by a distribution of values. As a consequence, the design flood is no longer univocally defined, making the design process undetermined. The Uncertainty Compliant Design Flood Estimation (UNCODE) procedure is a novel approach that, starting from a range of possible design flood estimates obtained in uncertain conditions, converges to a single design value. This is obtained through a cost-benefit criterion with additional constraints that is numerically solved in a simulation framework. This paper contributes to promoting a practical use of the UNCODE procedure without resorting to numerical computation. A modified procedure is proposed by using a correction coefficient that modifies the standard (i.e., uncertainty-free) design value on the basis of sample length and return period only. The procedure is robust and parsimonious, as it does not require additional parameters with respect to the traditional uncertainty-free analysis. Simple equations to compute the correction term are provided for a number of probability distributions commonly used to represent the flood frequency curve. The UNCODE procedure, when coupled with this simple correction factor, provides a robust way to manage the hydrologic uncertainty and to go beyond the use of traditional safety factors. With all the other parameters being equal, an increase in the sample length reduces the correction factor, and thus the construction costs, while still keeping the same safety level.
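The uncertainty-free starting point described above, a design flood from frequency analysis for a given return period, can be sketched with a Gumbel distribution fitted by the method of moments. The UNCODE-style correction coefficient appears below only as a hypothetical placeholder multiplier, since its actual value depends on the paper's tabulated equations for sample length and return period:

```python
import math
import statistics

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_design_flood(annual_maxima, return_period):
    """T-year design flood from a Gumbel fit by the method of moments."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6) / math.pi      # scale parameter
    mu = mean - EULER_GAMMA * beta           # location parameter
    p_exceed = 1.0 / return_period
    return mu - beta * math.log(-math.log(1.0 - p_exceed))

# Hypothetical annual-maximum flows (m^3/s)
flows = [120, 95, 140, 110, 160, 130, 105, 150, 125, 115]
q100 = gumbel_design_flood(flows, 100)  # uncertainty-free 100-year estimate

# Illustrative UNCODE-style adjustment: a correction coefficient k >= 1
# (a function of sample length and return period in the paper; a
# placeholder constant here) inflates the uncertainty-free value.
k = 1.10
q100_adjusted = k * q100
```

Consistent with the paper's observation, a longer record would shrink k toward 1, reducing the adjusted design value while keeping the same safety level.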

  15. Evaluating and Enhancing Driving Ability Among Teens with Autism Spectrum Disorder (ASD)

    DTIC Science & Technology

    2014-10-01

able to engage in the driving training, and none have experienced simulation adaptation syndrome. 15. SUBJECT TERMS: Autism, Driving Safety, Driving... routine driving training (RT) required by the DMV, VRDS training + RT (VRDS-T) would lead to greater improvement in driving safety and less driving... improved driving safety above and beyond RT. We hypothesized that computer-generated feedback would be more palatable than human-generated feedback to

  16. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald

Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment.

  17. 47 CFR 87.143 - Transmitter control requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Section 87.143 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO..., the control point for an automatically controlled enroute station is the computer facility which controls the transmitter. Any computer controlled transmitter must be equipped to automatically shut down...

  18. 48 CFR 1523.7002 - Waivers.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ENVIRONMENTAL, CONSERVATION, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Energy-Efficient Computer Equipment 1523.7002 Waivers. (a) There are several types of computer equipment which technically fall under the... types of equipment: (1) LAN servers, including file servers; application servers; communication servers...

  19. 48 CFR 1523.7002 - Waivers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ENVIRONMENTAL, CONSERVATION, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Energy-Efficient Computer Equipment 1523.7002 Waivers. (a) There are several types of computer equipment which technically fall under the... types of equipment: (1) LAN servers, including file servers; application servers; communication servers...

  20. 47 CFR 87.143 - Transmitter control requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Section 87.143 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO..., the control point for an automatically controlled enroute station is the computer facility which controls the transmitter. Any computer controlled transmitter must be equipped to automatically shut down...

  1. 48 CFR 1523.7002 - Waivers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ENVIRONMENTAL, CONSERVATION, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Energy-Efficient Computer Equipment 1523.7002 Waivers. (a) There are several types of computer equipment which technically fall under the... types of equipment: (1) LAN servers, including file servers; application servers; communication servers...

  2. 47 CFR 87.143 - Transmitter control requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Section 87.143 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO..., the control point for an automatically controlled enroute station is the computer facility which controls the transmitter. Any computer controlled transmitter must be equipped to automatically shut down...

  3. 47 CFR 87.143 - Transmitter control requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Section 87.143 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO..., the control point for an automatically controlled enroute station is the computer facility which controls the transmitter. Any computer controlled transmitter must be equipped to automatically shut down...

  4. 47 CFR 87.143 - Transmitter control requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Section 87.143 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO..., the control point for an automatically controlled enroute station is the computer facility which controls the transmitter. Any computer controlled transmitter must be equipped to automatically shut down...

  5. Safety in the Automated Office.

    ERIC Educational Resources Information Center

    Graves, Pat R.; Greathouse, Lillian R.

    1990-01-01

    Office automation has introduced new hazards to the workplace: electrical hazards related to computer wiring, musculoskeletal problems resulting from use of computer terminals and design of work stations, and environmental concerns related to ventilation, noise levels, and office machine chemicals. (SK)

  6. Safety analysts training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolton, P.

The purpose of this task was to support ESH-3 in providing Airborne Release Fraction and Respirable Fraction training to safety analysts at LANL who perform accident analysis, hazard analysis, safety analysis, and/or risk assessments at nuclear facilities. The task included preparation of materials for, and the conduct of, two 3-day training courses covering the following topics: the safety analysis process; the calculation model; aerosol physics concepts for safety analysis; and an overview of empirically derived airborne release fractions and respirable fractions.

  7. A psychometric evaluation of the Chinese version of the nursing home survey on patient safety culture.

    PubMed

    Lin, Shu-Yuan; Tseng, Wei Ting; Hsu, Miao-Ju; Chiang, Hui-Ying; Tseng, Hui-Chen

    2017-12-01

To test the psychometric properties of the Chinese version of the Nursing Home Survey on Patient Safety Culture scale among staff in long-term care facilities. The Nursing Home Survey on Patient Safety Culture scale is a standard tool for safety culture assessment in nursing homes. Extending its application to different types of long-term care facilities and varied ethnic populations is worth pursuing. A national random survey. A total of 306 managers and staff from 30 long-term care facilities in Taiwan completed the Chinese version of the Nursing Home Survey on Patient Safety Culture scale. Content validity and construct validity were tested by the content validity index (CVI) and principal axis factor analysis (PAF) with Promax rotation. Concurrent validity was tested through correlations between the scale and two overall rating items. Reliability was computed by the intraclass correlation coefficient and Cronbach's α coefficients. Statistical analyses, including descriptive statistics, Pearson's and Spearman's rho correlations, and PAF, were completed. Scale-level and item-level CVIs (0.91-0.98) of the Chinese version of the scale were satisfactory. The four-factor construct and merged item composition differed from the original Nursing Home Survey on Patient Safety Culture scale and accounted for 53% of the variance. Concurrent validity was evidenced by positive correlations between the scale and two overall ratings of resident safety. Cronbach's α coefficients of the subscales and the full Chinese version of the scale ranged from .76-.94. The Chinese version of the Nursing Home Survey on Patient Safety Culture scale identified essential dimensions reflecting the important features of a patient safety culture in long-term care facilities.
The researchers introduced the Chinese version of the Nursing Home Survey on Patient Safety Culture for safety culture assessment in long-term care facilities, but further testing of the reliability of the scale in a large Chinese sample and in different long-term care facilities is recommended. The Chinese version of the scale was developed to increase users' intention toward safety culture assessment. It can identify areas for improvement, track safety culture changes over time, and evaluate the effectiveness of interventions. © 2017 John Wiley & Sons Ltd.

  8. Modelling and analysis of the sugar cataract development process using stochastic hybrid systems.

    PubMed

    Riley, D; Koutsoukos, X; Riley, K

    2009-05-01

    Modelling and analysis of biochemical systems such as sugar cataract development (SCD) are critical because they can provide new insights into systems, which cannot be easily tested with experiments; however, they are challenging problems due to the highly coupled chemical reactions that are involved. The authors present a stochastic hybrid system (SHS) framework for modelling biochemical systems and demonstrate the approach for the SCD process. A novel feature of the framework is that it allows modelling the effect of drug treatment on the system dynamics. The authors validate the three sugar cataract models by comparing trajectories computed by two simulation algorithms. Further, the authors present a probabilistic verification method for computing the probability of sugar cataract formation for different chemical concentrations using safety and reachability analysis methods for SHSs. The verification method employs dynamic programming based on a discretisation of the state space and therefore suffers from the curse of dimensionality. To analyse the SCD process, a parallel dynamic programming implementation that can handle large, realistic systems was developed. Although scalability is a limiting factor, this work demonstrates that the proposed method is feasible for realistic biochemical systems.
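    The verification method described above, dynamic programming over a discretised state space, reduces to a backward recursion for the probability of reaching an unsafe set within a finite horizon. A toy sketch on a small, invented three-state chain (the states and transition probabilities are illustrative only, not the sugar cataract model):

```python
# Probability of reaching the unsafe state within T steps of a discretised
# Markov chain, computed by backward dynamic programming.
P = {  # hypothetical transition kernel (each row sums to 1)
    "low":    {"low": 0.5, "high": 0.3, "unsafe": 0.2},
    "high":   {"high": 0.6, "unsafe": 0.4},
    "unsafe": {"unsafe": 1.0},  # absorbing unsafe set
}

def reach_prob(P, unsafe, T):
    # p[s] = probability of hitting `unsafe` within the remaining horizon
    p = {s: 1.0 if s == unsafe else 0.0 for s in P}
    for _ in range(T):
        p = {s: sum(pr * p[t] for t, pr in P[s].items()) for s in P}
    return p

print(reach_prob(P, "unsafe", 2))  # e.g. the "low" entry is about 0.42
```

    The curse of dimensionality mentioned in the abstract is visible here: the table `p` grows with the number of grid cells, which motivates the parallel implementation.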

  9. Video-processing-based system for automated pedestrian data collection and analysis when crossing the street

    NASA Astrophysics Data System (ADS)

    Mansouri, Nabila; Watelain, Eric; Ben Jemaa, Yousra; Motamed, Cina

    2018-03-01

Computer-vision techniques for pedestrian detection and tracking have progressed considerably and become widely used in several applications. However, a quick glance at the literature shows minimal use of these techniques in pedestrian behavior and safety analysis, which might be due to the technical complexities of processing pedestrian videos. To extract pedestrian trajectories from a video automatically, all road users must be detected and tracked during sequences, which is a challenging task, especially in a congested open-outdoor urban space. A multipedestrian tracker based on an interframe-detection-association process was proposed and evaluated. The tracker results are used to implement an automatic, video-processing-based tool for collecting data on pedestrians crossing the street. The variations in the instantaneous speed allowed the detection of the street crossing phases (approach, waiting, and crossing). These were addressed for the first time in pedestrian road safety analysis to illustrate the causal relationship between pedestrian behaviors in the different phases. A comparison with a manual data collection method, by computing the root mean square error and the Pearson correlation coefficient, confirmed that the procedures proposed have significant potential to automate the data collection process.
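    The agreement check described above, root mean square error plus the Pearson correlation coefficient between automated and manual measurements, can be sketched in a few lines of pure Python (the paired speed samples below are invented for illustration):

```python
import math

def rmse(a, b):
    # Root mean square error between paired samples.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def pearson(a, b):
    # Pearson correlation coefficient between paired samples.
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Hypothetical crossing speeds (m/s): manual annotation vs. tracker output.
manual = [1.0, 2.0, 3.0, 4.0]
auto = [1.1, 1.9, 3.2, 3.8]
print(rmse(manual, auto), pearson(manual, auto))
```

    A low RMSE together with a correlation near 1 is the pattern that supports replacing manual annotation with the automated tool.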

  10. Improved processes for meeting the data requirements for implementing the Highway Safety Manual (HSM) and Safety Analyst in Florida.

    DOT National Transportation Integrated Search

    2014-03-01

Recent research in highway safety has focused on the more advanced and statistically proven techniques of highway safety analysis. This project focuses on the two most recent safety analysis tools, the Highway Safety Manual (HSM) and SafetyAnalys...

  11. 20 CFR 725.520 - Computation of benefits.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

... FEDERAL MINE SAFETY AND HEALTH ACT, AS AMENDED Payment of Benefits Benefit Rates § 725.520 Computation of benefits. (a) Basic rate. The amount of benefits payable to a beneficiary for a month is determined, in the first instance, by computing the “basic rate.” The basic rate is equal to 37 1/2 percent of the monthly...
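    As far as the excerpt goes, the first computation step is simple arithmetic: the basic rate is 37 1/2 percent of the applicable monthly figure (the excerpt truncates before naming that figure; the amount below is purely illustrative):

```python
def basic_rate(monthly_pay):
    # "The basic rate is equal to 37 1/2 percent of the monthly..." per the
    # excerpt of 20 CFR 725.520(a); the input here is a placeholder figure.
    return 0.375 * monthly_pay

print(basic_rate(2000.0))  # illustrative monthly figure -> 750.0
```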

  12. 20 CFR 725.520 - Computation of benefits.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

... MINE SAFETY AND HEALTH ACT, AS AMENDED Payment of Benefits Benefit Rates § 725.520 Computation of benefits. (a) Basic rate. The amount of benefits payable to a beneficiary for a month is determined, in the first instance, by computing the “basic rate.” The basic rate is equal to 37 1/2 percent of the monthly...

  13. Reduce, Reuse, Recycle: Good Earth and the Electronics Dilemma

    ERIC Educational Resources Information Center

    Descy, Don E.

    2007-01-01

    According to the National Safety Council, 63 million computers became obsolete in 2005 alone, and it is estimated that the total number in storage in 2007 numbers upwards of 500 million computers (Earth 911, 2007). This article describes the steps that one should take before disposing of an obsolete computer. First and foremost, all personal…

  14. Can the Analytical Hierarchy Process Model Be Effectively Applied in the Prioritization of Information Assurance Defense In-Depth Measures? --A Quantitative Study

    ERIC Educational Resources Information Center

    Alexander, Rodney T.

    2017-01-01

    Organizational computing devices are increasingly becoming targets of cyber-attacks, and organizations have become dependent on the safety and security of their computer networks and their organizational computing devices. Business and government often use defense in-depth information assurance measures such as firewalls, intrusion detection…

  15. Computer Programs.

    ERIC Educational Resources Information Center

    Anderson, Tiffoni

    This module provides information on development and use of a Material Safety Data Sheet (MSDS) software program that seeks to link literacy skills education, safety training, and human-centered design. Section 1 discusses the development of the software program that helps workers understand the MSDSs that accompany the chemicals with which they…

  16. KENO-VI Primer: A Primer for Criticality Calculations with SCALE/KENO-VI Using GeeWiz

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowman, Stephen M

    2008-09-01

The SCALE (Standardized Computer Analyses for Licensing Evaluation) computer software system developed at Oak Ridge National Laboratory is widely used and accepted around the world for criticality safety analyses. The well-known KENO-VI three-dimensional Monte Carlo criticality computer code is one of the primary criticality safety analysis tools in SCALE. The KENO-VI primer is designed to help a new user understand and use the SCALE/KENO-VI Monte Carlo code for nuclear criticality safety analyses. It assumes that the user has a college education in a technical field. There is no assumption of familiarity with Monte Carlo codes in general or with SCALE/KENO-VI in particular. The primer is designed to teach by example, with each example illustrating two or three features of SCALE/KENO-VI that are useful in criticality analyses. The primer is based on SCALE 6, which includes the Graphically Enhanced Editing Wizard (GeeWiz) Windows user interface. Each example uses GeeWiz to provide the framework for preparing input data and viewing output results. Starting with a Quickstart section, the primer gives an overview of the basic requirements for SCALE/KENO-VI input and allows the user to quickly run a simple criticality problem with SCALE/KENO-VI. The sections that follow Quickstart include a list of basic objectives at the beginning that identifies the goal of the section and the individual SCALE/KENO-VI features that are covered in detail in the sample problems in that section. Upon completion of the primer, a new user should be comfortable using GeeWiz to set up criticality problems in SCALE/KENO-VI. The primer provides a starting point for the criticality safety analyst who uses SCALE/KENO-VI. Complete descriptions are provided in the SCALE/KENO-VI manual. Although the primer is self-contained, it is intended as a companion volume to the SCALE/KENO-VI documentation. (The SCALE manual is provided on the SCALE installation DVD.) The primer provides specific examples of using SCALE/KENO-VI for criticality analyses; the SCALE/KENO-VI manual provides information on the use of SCALE/KENO-VI and all its modules. The primer also contains an appendix with sample input files.

  17. Relating crash frequency and severity: evaluating the effectiveness of shoulder rumble strips on reducing fatal and major injury crashes.

    PubMed

    Wu, Kun-Feng; Donnell, Eric T; Aguero-Valverde, Jonathan

    2014-06-01

To approach the goal of "Toward Zero Deaths," there is a need to develop an analysis paradigm to better understand the effects of a countermeasure on reducing the number of severe crashes. One of the goals in traffic safety research is to search for an effective treatment to reduce fatal and major injury crashes, referred to as severe crashes. To achieve this goal, the selection of promising countermeasures is of utmost importance, and relies on the effectiveness of candidate countermeasures in reducing severe crashes. Although it is important to precisely evaluate the effectiveness of candidate countermeasures in reducing the number of severe crashes at a site, the current state-of-the-practice often leads to biased estimates. While there have been a few advanced statistical models developed to mitigate the problem in practice, these models are computationally difficult to estimate because severe crashes are dispersed spatially and temporally, and cannot be integrated into the Highway Safety Manual framework, which develops a series of safety performance functions and crash modification factors to predict the number of crashes. Crash severity outcomes are generally integrated into the Highway Safety Manual using deterministic distributions rather than statistical models. Accounting for the variability in crash severity as a function of geometric design, traffic flow, and other roadway and roadside features is afforded by estimating statistical models. Therefore, there is a need to develop a new analysis paradigm to resolve the limitations in the current Highway Safety Manual methods. We propose an approach which decomposes the severe crash frequency into a function of the change in the total number of crashes and the probability of a crash becoming a severe crash before and after a countermeasure is implemented. We tested this approach by evaluating the effectiveness of shoulder rumble strips on reducing the number of severe crashes.
A total of 310 segments that have had shoulder rumble strips installed during 2002-2009 are included in the analysis. It was found that shoulder rumble strips reduce the total number of crashes, but have no statistically significant effect on reducing the probability of a severe crash outcome. Copyright © 2014 Elsevier Ltd. All rights reserved.
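    The decomposition proposed above treats expected severe crashes as (total crash frequency) × (probability a crash is severe), so a countermeasure's effect can act through either factor. A numeric sketch with invented before/after values, consistent with the finding that rumble strips reduced totals but not the severity probability:

```python
def expected_severe(total_crashes, p_severe):
    # Severe-crash frequency decomposed as total crash frequency times the
    # probability that a crash is severe.
    return total_crashes * p_severe

# Hypothetical segment: the countermeasure cuts total crashes by 20%
# while leaving the severity probability unchanged.
before = expected_severe(100, 0.10)
after = expected_severe(80, 0.10)
print(before - after)  # severe crashes avoided per period
```

    The same two-factor form lets an analyst attribute a reduction in severe crashes to fewer crashes overall, to less severe outcomes per crash, or to both.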

  18. 3D visualization of membrane failures in fuel cells

    NASA Astrophysics Data System (ADS)

    Singh, Yadvinder; Orfino, Francesco P.; Dutta, Monica; Kjeang, Erik

    2017-03-01

    Durability issues in fuel cells, due to chemical and mechanical degradation, are potential impediments in their commercialization. Hydrogen leak development across degraded fuel cell membranes is deemed a lifetime-limiting failure mode and potential safety issue that requires thorough characterization for devising effective mitigation strategies. The scope and depth of failure analysis has, however, been limited by the 2D nature of conventional imaging. In the present work, X-ray computed tomography is introduced as a novel, non-destructive technique for 3D failure analysis. Its capability to acquire true 3D images of membrane damage is demonstrated for the very first time. This approach has enabled unique and in-depth analysis resulting in novel findings regarding the membrane degradation mechanism; these are: significant, exclusive membrane fracture development independent of catalyst layers, localized thinning at crack sites, and demonstration of the critical impact of cracks on fuel cell durability. Evidence of crack initiation within the membrane is demonstrated, and a possible new failure mode different from typical mechanical crack development is identified. X-ray computed tomography is hereby established as a breakthrough approach for comprehensive 3D characterization and reliable failure analysis of fuel cell membranes, and could readily be extended to electrolyzers and flow batteries having similar structure.

  19. Clinical Pilot Study and Computational Modeling of Bitemporal Transcranial Direct Current Stimulation, and Safety of Repeated Courses of Treatment, in Major Depression.

    PubMed

    Ho, Kerrie-Anne; Bai, Siwei; Martin, Donel; Alonzo, Angelo; Dokos, Socrates; Loo, Colleen K

    2015-12-01

This study aimed to examine a bitemporal (BT) transcranial direct current stimulation (tDCS) electrode montage for the treatment of depression through a clinical pilot study and computational modeling. The safety of repeated courses of stimulation was also examined. Four participants with depression who had previously received multiple courses of tDCS received a 4-week course of BT tDCS. Mood and neuropsychological function were assessed. The results were compared with previous courses of tDCS given to the same participants using different electrode montages. Computational modeling examined the electric field maps produced by the different montages. Three participants showed clinical improvement with BT tDCS (mean [SD] improvement, 49.6% [33.7%]). There were no adverse neuropsychological effects. Computational modeling showed that the BT montage activates the anterior cingulate cortices and brainstem, which are deep brain regions that are important for depression. However, a fronto-extracephalic montage stimulated these areas more effectively. No adverse effects were found in participants receiving up to 6 courses of tDCS. Bitemporal tDCS was safe and led to clinically meaningful efficacy in 3 of 4 participants. However, computational modeling suggests that the BT montage may not activate key brain regions in depression more effectively than another novel montage, fronto-extracephalic tDCS. There is also preliminary evidence to support the safety of up to 6 repeated courses of tDCS.

  20. Using argument notation to engineer biological simulations with increased confidence

    PubMed Central

    Alden, Kieran; Andrews, Paul S.; Polack, Fiona A. C.; Veiga-Fernandes, Henrique; Coles, Mark C.; Timmis, Jon

    2015-01-01

    The application of computational and mathematical modelling to explore the mechanics of biological systems is becoming prevalent. To significantly impact biological research, notably in developing novel therapeutics, it is critical that the model adequately represents the captured system. Confidence in adopting in silico approaches can be improved by applying a structured argumentation approach, alongside model development and results analysis. We propose an approach based on argumentation from safety-critical systems engineering, where a system is subjected to a stringent analysis of compliance against identified criteria. We show its use in examining the biological information upon which a model is based, identifying model strengths, highlighting areas requiring additional biological experimentation and providing documentation to support model publication. We demonstrate our use of structured argumentation in the development of a model of lymphoid tissue formation, specifically Peyer's Patches. The argumentation structure is captured using Artoo (www.york.ac.uk/ycil/software/artoo), our Web-based tool for constructing fitness-for-purpose arguments, using a notation based on the safety-critical goal structuring notation. We show how argumentation helps in making the design and structured analysis of a model transparent, capturing the reasoning behind the inclusion or exclusion of each biological feature and recording assumptions, as well as pointing to evidence supporting model-derived conclusions. PMID:25589574

  1. Using argument notation to engineer biological simulations with increased confidence.

    PubMed

    Alden, Kieran; Andrews, Paul S; Polack, Fiona A C; Veiga-Fernandes, Henrique; Coles, Mark C; Timmis, Jon

    2015-03-06

    The application of computational and mathematical modelling to explore the mechanics of biological systems is becoming prevalent. To significantly impact biological research, notably in developing novel therapeutics, it is critical that the model adequately represents the captured system. Confidence in adopting in silico approaches can be improved by applying a structured argumentation approach, alongside model development and results analysis. We propose an approach based on argumentation from safety-critical systems engineering, where a system is subjected to a stringent analysis of compliance against identified criteria. We show its use in examining the biological information upon which a model is based, identifying model strengths, highlighting areas requiring additional biological experimentation and providing documentation to support model publication. We demonstrate our use of structured argumentation in the development of a model of lymphoid tissue formation, specifically Peyer's Patches. The argumentation structure is captured using Artoo (www.york.ac.uk/ycil/software/artoo), our Web-based tool for constructing fitness-for-purpose arguments, using a notation based on the safety-critical goal structuring notation. We show how argumentation helps in making the design and structured analysis of a model transparent, capturing the reasoning behind the inclusion or exclusion of each biological feature and recording assumptions, as well as pointing to evidence supporting model-derived conclusions.

  2. Causes and prevention of splitting/bursting failure of concrete crossties: a computational study

    DOT National Transportation Integrated Search

    2017-09-17

    Concrete splitting/bursting is a well-known failure mode of concrete crossties that can compromise the crosstie integrity and raise railroad maintenance and track safety concerns. This paper presents a computational study aimed at better understandin...

  3. The design and realisation of the IXV Mission Analysis and Flight Mechanics

    NASA Astrophysics Data System (ADS)

    Haya-Ramos, Rodrigo; Blanco, Gonzalo; Pontijas, Irene; Bonetti, Davide; Freixa, Jordi; Parigini, Cristina; Bassano, Edmondo; Carducci, Riccardo; Sudars, Martins; Denaro, Angelo; Angelini, Roberto; Mancuso, Salvatore

    2016-07-01

    The Intermediate eXperimental Vehicle (IXV) is a suborbital re-entry demonstrator successfully launched in February 2015 focusing on the in-flight demonstration of a lifting body system with active aerodynamic control surfaces. This paper presents an overview of the Mission Analysis and Flight Mechanics of the IXV vehicle, which comprises computation of the End-to-End (launch to splashdown) design trajectories, characterisation of the Entry Corridor, assessment of the Mission Performances through Monte Carlo campaigns, contribution to the aerodynamic database, analysis of the Visibility and link budget from Ground Stations and GPS, support to safety analyses (off nominal footprints), specification of the Centre of Gravity box, selection of the Angle of Attack trim line to be flown and characterisation of the Flying Qualities performances. An initial analysis and comparison with the raw flight data obtained during the flight will be discussed and first lessons learned derived.

  4. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework, MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  5. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have occurred in the process industries. Owing to the computational complexity, very few software packages, such as SAFETI, can make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions and no concentration fluctuations, which is quite different from the real situation in a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can overcome those limitations; however, it cannot resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but can also be extended to aerial, submarine, or space risk analyses in the near future.

  6. Nursing benefits of using an automated injection system for ictal brain single photon emission computed tomography.

    PubMed

    Vonhofen, Geraldine; Evangelista, Tonya; Lordeon, Patricia

    2012-04-01

    The traditional method of administering radioactive isotopes to pediatric patients undergoing ictal brain single photon emission computed tomography testing has been by manual injections. This method presents certain challenges for nursing, including time requirements and safety risks. This quality improvement project discusses the implementation of an automated injection system for isotope administration and its impact on staffing, safety, and nursing satisfaction. It was conducted in an epilepsy monitoring unit at a large urban pediatric facility. Results of this project showed a decrease in the number of nurses exposed to radiation and improved nursing satisfaction with the use of the automated injection system. In addition, there was a decrease in the number of nursing hours required during ictal brain single photon emission computed tomography testing.

  7. Design of a Fatigue Detection System for High-Speed Trains Based on Driver Vigilance Using a Wireless Wearable EEG.

    PubMed

    Zhang, Xiaoliang; Li, Jiali; Liu, Yugang; Zhang, Zutao; Wang, Zhuojun; Luo, Dianyuan; Zhou, Xiang; Zhu, Miankuan; Salman, Waleed; Hu, Guangdi; Wang, Chunbai

    2017-03-01

The vigilance of the driver is important for railway safety, despite not being included in the safety management system (SMS) for high-speed train safety. In this paper, a novel fatigue detection system for high-speed train safety based on monitoring train driver vigilance using a wireless wearable electroencephalograph (EEG) is presented. This system is designed to detect whether the driver is drowsy. The proposed system consists of three main parts: (1) wireless wearable EEG collection; (2) train driver vigilance detection; and (3) an early warning device for the train driver. In the first part, an 8-channel wireless wearable brain-computer interface (BCI) device acquires the locomotive driver's brain EEG signal comfortably under high-speed train-driving conditions. The recorded data are transmitted to a personal computer (PC) via Bluetooth. In the second part, a support vector machine (SVM) classification algorithm is implemented to determine the vigilance level, using the fast Fourier transform (FFT) to extract the EEG power spectral density (PSD). In addition, an early warning device begins to work if fatigue is detected. The simulation and test results demonstrate the feasibility of the proposed fatigue detection system for high-speed train safety.
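    The detection chain in the second part (FFT-derived power spectral density features fed to an SVM) can be sketched minimally. Here a naive DFT extracts alpha- and beta-band power from a synthetic one-channel trace, and a simple band-power ratio threshold stands in for the trained SVM; the sampling rate, signal, and threshold are all invented for illustration:

```python
import math

FS = 128  # sampling rate (Hz); with N = FS samples, DFT bins are 1 Hz apart
N = 128

def band_power(x, lo_hz, hi_hz):
    """Summed power over integer-frequency DFT bins lo_hz..hi_hz (naive DFT)."""
    total = 0.0
    for k in range(lo_hz, hi_hz + 1):
        re = sum(x[n] * math.cos(2 * math.pi * k * n / N) for n in range(N))
        im = sum(x[n] * math.sin(2 * math.pi * k * n / N) for n in range(N))
        total += (re * re + im * im) / N
    return total

# Synthetic one-second trace dominated by a 10 Hz (alpha-band) rhythm.
x = [math.sin(2 * math.pi * 10 * n / FS) for n in range(N)]

alpha = band_power(x, 8, 13)   # alpha band, 8-13 Hz
beta = band_power(x, 14, 30)   # beta band, 14-30 Hz
drowsy = alpha / (beta + 1e-12) > 2.0  # threshold in place of the trained SVM
print(alpha, beta, drowsy)
```

    A production system would use a windowed FFT (e.g. Welch's method) across all 8 channels and feed the band powers to a trained classifier rather than a fixed ratio threshold.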

  8. National Dam Safety Program. Nelson Dam (Inventory Number VA 12501), James River Basin, Nelson County, Virginia. Phase I Inspection Report.

    DTIC Science & Technology

    1981-06-01

during tropical storm Camille. 5.4 Flood Potential: The 100-Year Flood, 1/2 PMF, and PMF were developed by use of the HEC-1 computer program (Reference 2...Appendix V) and routed through the reservoir using the NWS-Dambreak computer program (Reference 3, Appendix V). Clark's Tc and R coefficients for...

  9. Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The Virtual National Airspace Simulation (VNAS) will improve the safety of Air Transportation. In 2001, using simulation and information management software running over a distributed network of super-computers, researchers at NASA Ames, Glenn, and Langley Research Centers developed a working prototype of a virtual airspace. This VNAS prototype modeled daily operations of the Atlanta airport by integrating measured operational data and simulation data on up to 2,000 flights a day. The concepts and architecture developed by NASA for this prototype are integral to the National Airspace Simulation to support the development of strategies improving aviation safety, identifying precursors to component failure.

  10. Finite Element Analysis of a NASA National Transonic Facility Wind Tunnel Balance

    NASA Technical Reports Server (NTRS)

    Lindell, Michael C.

    1996-01-01

    This paper presents the results of finite element analyses and correlation studies performed on a NASA National Transonic Facility (NTF) Wind Tunnel balance. In the past NASA has relied primarily on classical hand analyses, coupled with relatively large safety factors, for predicting maximum stresses in wind tunnel balances. Now, with the significant advancements in computer technology and sophistication of general purpose analysis codes, it is more reasonable to pursue finite element analyses of these balances. The correlation studies of the present analyses show very good agreement between the analyses and data measured with strain gages and therefore the studies give higher confidence for using finite element analyses to analyze and optimize balance designs in the future.

  11. Finite Element Analysis of a NASA National Transonic Facility Wind Tunnel Balance

    NASA Technical Reports Server (NTRS)

    Lindell, Michael C. (Editor)

    1999-01-01

    This paper presents the results of finite element analyses and correlation studies performed on a NASA National Transonic Facility (NTF) Wind Tunnel balance. In the past NASA has relied primarily on classical hand analyses, coupled with relatively large safety factors, for predicting maximum stresses in wind tunnel balances. Now, with the significant advancements in computer technology and sophistication of general purpose analysis codes, it is more reasonable to pursue finite element analyses of these balances. The correlation studies of the present analyses show very good agreement between the analyses and data measured with strain gages and therefore the studies give higher confidence for using finite element analyses to analyze and optimize balance designs in the future.

  12. An introduction to metabolomics and its potential application in veterinary science.

    PubMed

    Jones, Oliver A H; Cheung, Victoria L

    2007-10-01

    Metabolomics has been found to be applicable to a wide range of fields, including the study of gene function, toxicology, plant sciences, environmental analysis, clinical diagnostics, nutrition, and the discrimination of organism genotypes. This approach combines high-throughput sample analysis with computer-assisted multivariate pattern-recognition techniques. It is increasingly being deployed in toxico- and pharmacokinetic studies in the pharmaceutical industry, especially during the safety assessment of candidate drugs in human medicine. However, despite the potential of this technique to reduce both costs and the numbers of animals used for research, examples of the application of metabolomics in veterinary research are, thus far, rare. Here we give an introduction to metabolomics and discuss its potential in the field of veterinary science.

  13. Efficient runner safety assessment during early design phase and root cause analysis

    NASA Astrophysics Data System (ADS)

    Liang, Q. W.; Lais, S.; Gentner, C.; Braun, O.

    2012-11-01

Fatigue related problems in Francis turbines, especially high head Francis turbines, have been reported several times in recent years. During operation the runner is exposed to various steady and unsteady hydraulic loads. Therefore the analysis of forced response of the runner structure requires a combined approach of fluid dynamics and structural dynamics. Due to the high complexity of the phenomena and the limitation of computer power, numerical prediction was in the past too expensive and not feasible for use as a standard design tool. However, due to continuous improvement of the knowledge and the simulation tools, such complex analysis has become part of the design procedure in ANDRITZ HYDRO. This article describes the application of the most advanced analysis techniques in the runner safety check (RSC), including steady state CFD analysis, transient CFD analysis considering rotor stator interaction (RSI), static FE analysis and modal analysis in water considering the added mass effect, in the early design phase. This procedure allows a very efficient interaction between the hydraulic designer and the mechanical designer during the design phase, such that a risk of failure can be detected and avoided in an early design stage. The RSC procedure can also be applied to a root cause analysis (RCA), both to find the cause of failure and to quickly define a technical solution to meet the safety criteria. An efficient application to an RCA of cracks in a Francis runner is cited in this article as an example. The results of the RCA are presented together with an efficient and inexpensive solution whose effectiveness could be proven again by applying the described RSC techniques. It is shown that, with the RSC procedure developed and applied as a standard procedure in ANDRITZ HYDRO, such a failure is excluded in an early design phase.
Moreover, the RSC procedure is compatible with different commercial and open source codes and can be easily adapted to apply for other types of turbines, such as pump turbines and Pelton runners.

  14. Prospective Analysis of the Safety and Efficacy of Percutaneous Cryoablation for pT1NxMx Biopsy-Proven Renal Cell Carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, Ronald; Cizman, Ziga; Hong, Kelvin

    2011-06-15

    Purpose: Our objective was to determine the efficacy and safety of image-guided, percutaneous cryoablation for American Joint Committee on Cancer pT1ANxMx and pT1BNxMx biopsy-proven renal cell carcinoma (RCC). Materials and Methods: Computed tomography (CT)-guided, percutaneous cryoablation was used to treat 117 renal lesions in 113 consecutive patients with pT1NxMx RCC. All 117 ablations were included in the safety analysis, and complications were categorized according to Common Terminology Criteria for Adverse Events version 3.0 (CTCAE v3.0). Eighty-one lesions were biopsy-proven RCC and were included in the efficacy analysis. Technical success was defined as the 'ice-ball' covering the entire lesion plus a minimum 5-mm margin. Efficacy was defined as complete lack of enhancement and continuous decrease in size on subsequent follow-up imaging studies. Results: Technical success was 100%, with 15% of ablations requiring air or saline injection to prevent nontarget ablation. We recorded a 7% rate of clinically significant complications (CTCAE category ≥2) and 0% mortality. Renal function was not adversely affected. Seventy percent of patients were discharged to home on the same day. Efficacy was 98.7% for a median follow-up of 67 weeks (range 7-172). For the subgroup of patients that reached a median follow-up of 2 (n = 59) and 3 years (n = 13), efficacy was 98.3 and 92.3%, respectively. Cancer-specific survival was 100%. Conclusions: CT-guided, percutaneous cryoablation has an excellent safety and efficacy profile for stage T1A and T1B RCC; however, longer follow-up is needed to compare it with other nephron-sparing surgical treatments. It is a good option for nonsurgical patients, those in whom renal function cannot be further sacrificed, and those at risk for metachronous lesions.

  15. The application of CFD to the modelling of fires in complex geometries

    NASA Astrophysics Data System (ADS)

    Burns, A. D.; Clarke, D. S.; Guilbert, P.; Jones, I. P.; Simcox, S.; Wilkes, N. S.

    The application of Computational Fluid Dynamics (CFD) to industrial safety is a challenging activity. In particular it involves the interaction of several different physical processes, including turbulence, combustion, radiation, buoyancy, compressible flow and shock waves in complex three-dimensional geometries. In addition, there may be multi-phase effects arising, for example, from sprinkler systems for extinguishing fires. The FLOW3D software (1-3) from Computational Fluid Dynamics Services (CFDS) is in widespread use in industrial safety problems, both within AEA Technology, and also by CFDS's commercial customers, for example references (4-13). This paper discusses some other applications of FLOW3D to safety problems. These applications illustrate the coupling of the gas flows with radiation models and combustion models, particularly for complex geometries where simpler radiation models are not applicable.

  16. Software development for safety-critical medical applications

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1992-01-01

    There are many computer-based medical applications in which safety and not reliability is the overriding concern. Reduced, altered, or no functionality of such systems is acceptable as long as no harm is done. A precise, formal definition of what software safety means is essential, however, before any attempt can be made to achieve it. Without this definition, it is not possible to determine whether a specific software entity is safe. A set of definitions pertaining to software safety will be presented and a case study involving an experimental medical device will be described. Some new techniques aimed at improving software safety will also be discussed.

  17. Airline Safety and Economy

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This video documents efforts at NASA Langley Research Center to improve safety and economy in aircraft. Featured are the cockpit weather information needs computer system, which relays real time weather information to the pilot, and efforts to improve techniques to detect structural flaws and corrosion, such as the thermal bond inspection system.

  18. 29 CFR 5.32 - Overtime payments.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... THE CONTRACT WORK HOURS AND SAFETY STANDARDS ACT) Interpretation of the Fringe Benefits Provisions of... Contract Work Hours and Safety Standards Act, and the Walsh-Healey Public Contracts Act whenever the... computed on a regular or basic rate of $3.00 an hour. However, in some cases a question of fact may be...

  19. 29 CFR 5.32 - Overtime payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... THE CONTRACT WORK HOURS AND SAFETY STANDARDS ACT) Interpretation of the Fringe Benefits Provisions of... Contract Work Hours and Safety Standards Act, and the Walsh-Healey Public Contracts Act whenever the... computed on a regular or basic rate of $3.00 an hour. However, in some cases a question of fact may be...

  20. A tiered approach to incorporate exposure and pharmacokinetics considerations in in vitro based safety assessment

    EPA Science Inventory

    Application of in vitro based safety assessment requires reconciling chemical concentrations sufficient to produce bioactivity in vitro with those that trigger a molecular initiating event at the relevant in vivo target site. To address such need, computational tools such as phy...

  1. ACCE Submission to Public Consultation to "Enhancing Online Safety for Children"

    ERIC Educational Resources Information Center

    Henderson, Michael; de Zwart, Melissa

    2014-01-01

    This article represents the submission of the Australian Council for Computers in Education's (ACCE) response to the Australian Government's Department of Communications' initiative for "Enhancing Online Safety for Children." Henderson and de Zwart agree that children and their educators and caregivers are in serious need…

  2. Computation Directorate Annual Report 2003

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, D L; McGraw, J R; Ashby, S F

    Big computers are icons: symbols of the culture, and of the larger computing infrastructure that exists at Lawrence Livermore. Through the collective effort of Laboratory personnel, they enable scientific discovery and engineering development on an unprecedented scale. For more than three decades, the Computation Directorate has supplied the big computers that enable the science necessary for Laboratory missions and programs. Livermore supercomputing is uniquely mission driven. The high-fidelity weapon simulation capabilities essential to the Stockpile Stewardship Program compel major advances in weapons codes and science, compute power, and computational infrastructure. Computation's activities align with this vital mission of the Department of Energy. Increasingly, non-weapons Laboratory programs also rely on computer simulation. World-class achievements have been accomplished by LLNL specialists working in multi-disciplinary research and development teams. In these teams, Computation personnel employ a wide array of skills, from desktop support expertise, to complex applications development, to advanced research. Computation's skilled professionals make the Directorate the success that it has become. These individuals know the importance of the work they do and the many ways it contributes to Laboratory missions. They make appropriate and timely decisions that move the entire organization forward. They make Computation a leader in helping LLNL achieve its programmatic milestones. I dedicate this inaugural Annual Report to the people of Computation in recognition of their continuing contributions. I am proud that we perform our work securely and safely. Despite increased cyber attacks on our computing infrastructure from the Internet, advanced cyber security practices ensure that our computing environment remains secure. Through Integrated Safety Management (ISM) and diligent oversight, we address safety issues promptly and aggressively.
The safety of our employees, whether at work or at home, is a paramount concern. Even as the Directorate meets today's supercomputing requirements, we are preparing for the future. We are investigating open-source cluster technology, the basis of our highly successful Multiprogrammatic Capability Resource (MCR). Several breakthrough discoveries have resulted from MCR calculations coupled with theory and experiment, prompting Laboratory scientists to demand ever-greater capacity and capability. This demand is being met by a new 23-TF system, Thunder, with architecture modeled on MCR. In preparation for the "after-next" computer, we are researching technology even farther out on the horizon: cell-based computers. Assuming that the funding and the technology hold, we will acquire the cell-based machine BlueGene/L within the next 12 months.

  3. Feasibility and safety of augmented-reality glass for computed tomography-assisted percutaneous revascularization of coronary chronic total occlusion: A single center prospective pilot study.

    PubMed

    Opolski, Maksymilian P; Debski, Artur; Borucki, Bartosz A; Staruch, Adam D; Kepka, Cezary; Rokicki, Jakub K; Sieradzki, Bartosz; Witkowski, Adam

    2017-11-01

    Percutaneous coronary intervention (PCI) of chronic total occlusion (CTO) may be facilitated by projection of coronary computed tomography angiography (CTA) datasets in the catheterization laboratory. There is no data on the feasibility and safety outcomes of CTA-assisted CTO PCI using a wearable augmented-reality glass. A total of 15 patients scheduled for elective antegrade CTO intervention were prospectively enrolled and underwent preprocedural coronary CTA. Three-dimensional and curved multiplanar CT reconstructions were transmitted to a head-mounted hands-free computer worn by interventional cardiologists during CTO PCI to provide additional information on CTO tortuosity and calcification. The results of CTO PCI using a wearable computer were compared with a time-matched prospective angiographic registry of 59 patients undergoing antegrade CTO PCI without a wearable computer. Operators' satisfaction was assessed by a 5-point Likert scale. Mean age was 64 ± 8 years and the mean J-CTO score was 2.1 ± 0.9 in the CTA-assisted group. The voice-activated co-registration and review of CTA images in a wearable computer during CTO PCI were feasible and highly rated by PCI operators (4.7/5 points). There were no major adverse cardiovascular events. Compared with standard CTO PCI, CTA-assisted recanalization of CTO using a wearable computer showed more frequent selection of the first-choice stiff wire (0% vs 40%, p < 0.001) and lower contrast exposure (166 ± 52 vs 134 ± 43 ml, p = 0.03). Overall CTO success rates and safety outcomes remained similar between both groups. CTA-assisted CTO PCI using an augmented-reality glass is feasible and safe, and might reduce the resources required for the interventional treatment of CTO. Copyright © 2017 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  4. Medication safety and knowledge-based functions: a stepwise approach against information overload.

    PubMed

    Patapovas, Andrius; Dormann, Harald; Sedlmayr, Brita; Kirchner, Melanie; Sonst, Anja; Müller, Fabian; Pfistermeister, Barbara; Plank-Kiegele, Bettina; Vogler, Renate; Maas, Renke; Criegee-Rieck, Manfred; Prokosch, Hans-Ulrich; Bürkle, Thomas

    2013-09-01

    The aim was to improve medication safety in an emergency department (ED) by enhancing the integration and presentation of safety information for drug therapy. Based on an evaluation of drug therapy safety issues in the ED and a review of computer-assisted intervention technologies, we redesigned an electronic case sheet and implemented computer-assisted interventions into the routine work flow. We devised a four-step system of alerts and facilitated access to different levels of drug information. System use was analyzed over a period of 6 months. In addition, physicians answered a survey based on the technology acceptance model TAM2. The new application was implemented in an informal manner to avoid work-flow disruption. Log files demonstrated that step I, 'valid indication', was utilized for 3% of the recorded drugs and step II, 'tooltip for well-known drug risks', for 48% of the drugs. In the questionnaire, the computer-assisted interventions were rated better than previous paper-based measures (checklists, posters) with regard to usefulness, support of work, and information quality. A stepwise assisting intervention received positive user acceptance. Some intervention steps were seldom used, others quite often. We believe we were able to avoid over-alerting and work-flow intrusion in a critical ED environment. © 2013 The Authors. British Journal of Clinical Pharmacology © 2013 The British Pharmacological Society.
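The four-step escalation described above can be illustrated with a small sketch. This is a hypothetical reconstruction, not the authors' implementation: the step names follow the abstract ('valid indication' lookup, risk tooltip), while the severity scale and cut-offs are invented for illustration.

```python
from enum import IntEnum

class AlertStep(IntEnum):
    """Tiered alert levels: passive information first, interruptive last."""
    INDICATION_INFO = 1   # step I: valid-indication lookup on demand
    RISK_TOOLTIP = 2      # step II: tooltip for well-known drug risks
    PASSIVE_WARNING = 3   # step III: highlighted, non-blocking warning
    INTERRUPTIVE = 4      # step IV: modal alert requiring acknowledgement

def choose_step(severity: int) -> AlertStep:
    """Map a drug-risk severity score (1-10, invented scale) to a step.

    Only the most severe findings interrupt the work flow, which is the
    point of a stepwise design: it limits over-alerting.
    """
    if severity >= 9:
        return AlertStep.INTERRUPTIVE
    if severity >= 6:
        return AlertStep.PASSIVE_WARNING
    if severity >= 3:
        return AlertStep.RISK_TOOLTIP
    return AlertStep.INDICATION_INFO

assert choose_step(10) is AlertStep.INTERRUPTIVE
assert choose_step(4) is AlertStep.RISK_TOOLTIP
```

The monotone thresholds make the escalation policy auditable: raising a cut-off strictly reduces the number of interruptive alerts.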

  5. FDA toxicity databases and real-time data entry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arvidson, Kirk B.

    Structure-searchable electronic databases are valuable new tools that are assisting the FDA in its mission to promptly and efficiently review incoming submissions for regulatory approval of new food additives and food contact substances. The Center for Food Safety and Applied Nutrition's Office of Food Additive Safety (CFSAN/OFAS), in collaboration with Leadscope, Inc., is consolidating genetic toxicity data submitted in food additive petitions from the 1960s to the present day. The Center for Drug Evaluation and Research, Office of Pharmaceutical Science's Informatics and Computational Safety Analysis Staff (CDER/OPS/ICSAS) is separately gathering similar information from their submissions. Presently, these data are distributedmore » in various locations such as paper files, microfiche, and non-standardized toxicology memoranda. The organization of the data into a consistent, searchable format will reduce paperwork, expedite the toxicology review process, and provide valuable information to industry that is currently available only to the FDA. Furthermore, by combining chemical structures with genetic toxicity information, biologically active moieties can be identified and used to develop quantitative structure-activity relationship (QSAR) modeling and testing guidelines. Additionally, chemicals devoid of toxicity data can be compared to known structures, allowing for improved safety review through the identification and analysis of structural analogs. Four database frameworks have been created: bacterial mutagenesis, in vitro chromosome aberration, in vitro mammalian mutagenesis, and in vivo micronucleus. Controlled vocabularies for these databases have been established. The four separate genetic toxicity databases are compiled into a single, structurally-searchable database for easy accessibility of the toxicity information. 
Beyond the genetic toxicity databases described here, additional databases for subchronic, chronic, and teratogenicity studies have been prepared.« less

  6. Application of failure mode and effects analysis (FMEA) to pretreatment phases in tomotherapy

    PubMed Central

    Broggi, Sara; Cantone, Marie Claire; Chiara, Anna; Muzio, Nadia Di; Longobardi, Barbara; Mangili, Paola

    2013-01-01

    The aim of this paper was the application of the failure mode and effects analysis (FMEA) approach to assess the risks for patients undergoing radiotherapy treatments performed by means of a helical tomotherapy unit. FMEA was applied to the preplanning imaging, volume determination, and treatment planning stages of the tomotherapy process and consisted of three steps: 1) identification of the involved subprocesses; 2) identification and ranking of the potential failure modes, together with their causes and effects, using the risk probability number (RPN) scoring system; and 3) identification of additional safety measures to be proposed for process quality and safety improvement. RPN upper threshold for little concern of risk was set at 125. A total of 74 failure modes were identified: 38 in the stage of preplanning imaging and volume determination, and 36 in the stage of planning. The threshold of 125 for RPN was exceeded in four cases: one case only in the phase of preplanning imaging and volume determination, and three cases in the stage of planning. The most critical failures appeared related to (i) the wrong or missing definition and contouring of the overlapping regions, (ii) the wrong assignment of the overlap priority to each anatomical structure, (iii) the wrong choice of the computed tomography calibration curve for dose calculation, and (iv) the wrong (or not performed) choice of the number of fractions in the planning station. On the basis of these findings, in addition to the safety strategies already adopted in the clinical practice, novel solutions have been proposed for mitigating the risk of these failures and to increase patient safety. PACS number: 87.55.Qr PMID:24036868
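The RPN scoring used in this FMEA study multiplies three rankings per failure mode and flags modes whose product exceeds the paper's threshold of 125. A minimal sketch follows; the two example failure modes echo findings from the abstract, but their numeric rankings are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    description: str
    severity: int       # 1-10 ranking of effect severity
    occurrence: int     # 1-10 ranking of how often the cause occurs
    detectability: int  # 1-10 ranking (10 = hardest to detect)

    @property
    def rpn(self) -> int:
        # Risk Priority Number: product of the three rankings (max 1000)
        return self.severity * self.occurrence * self.detectability

RPN_THRESHOLD = 125  # upper threshold for "little concern of risk"

modes = [
    FailureMode("wrong CT calibration curve for dose calculation", 7, 4, 5),
    FailureMode("missing contouring of overlapping regions", 6, 3, 4),
]

# Modes above threshold are candidates for additional safety measures.
flagged = [m for m in modes if m.rpn > RPN_THRESHOLD]
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN={m.rpn:4d}  {m.description}")
```

Here the first mode (RPN = 140) exceeds the threshold and would be flagged; the second (RPN = 72) would not.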

  7. Crew interface analysis: Selected articles on space human factors research, 1987 - 1991

    NASA Technical Reports Server (NTRS)

    Bagian, Tandi (Compiler)

    1993-01-01

    As part of the Flight Crew Support Division at NASA, the Crew Interface Analysis Section is dedicated to the study of human factors in the manned space program. It assumes a specialized role that focuses on answering operational questions pertaining to NASA's Space Shuttle and Space Station Freedom Programs. One of the section's key contributions is to provide knowledge and information about human capabilities and limitations that promote optimal spacecraft and habitat design and use to enhance crew safety and productivity. The section provides human factors engineering for the ongoing missions as well as proposed missions that aim to put human settlements on the Moon and Mars. Research providing solutions to operational issues is the primary objective of the Crew Interface Analysis Section. The studies represent such subdisciplines as ergonomics, space habitability, man-computer interaction, and remote operator interaction.

  8. Aviation safety research and transportation/hazard avoidance and elimination

    NASA Technical Reports Server (NTRS)

    Sonnenschein, C. M.; Dimarzio, C.; Clippinger, D.; Toomey, D.

    1976-01-01

    Data collected by the Scanning Laser Doppler Velocimeter System (SLDVS) were analyzed to determine the feasibility of the SLDVS for monitoring aircraft wake vortices in an airport environment. Data were collected on atmospheric vortices and analyzed. Over 1600 landings were monitored at Kennedy International Airport, and by the end of the test period 95 percent of the runs with large aircraft were producing usable results in real time. Vortex transport was determined in real time and in post-analysis using algorithms that computed centroids on the highest amplitude in the thresholded spectrum. Making use of other parameters of the spectrum, vortex flow fields were studied along with the time histories of peak velocities and amplitudes. The post-analysis of the data was accomplished with a CDC-6700 computer using several programs developed for LDV data analysis.

  9. Advancing Usability Evaluation through Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; David I. Gertman

    2005-07-01

    This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety-critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis for heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.
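The multiplicative structure the abstract describes can be sketched briefly: a nominal error probability is scaled by one multiplier per heuristic treated as a performance shaping factor, SPAR-H style. All heuristic names and numbers below are invented for illustration; the paper does not publish these values.

```python
NOMINAL_HEP = 0.001  # hypothetical nominal human error probability

# Hypothetical multipliers: violated heuristics penalize (> 1.0),
# satisfied heuristics are neutral (1.0).
psf_multipliers = {
    "visibility of system status": 2.0,
    "error prevention": 5.0,
    "consistency and standards": 1.0,
}

def usability_error_probability(nominal: float, psfs: dict) -> float:
    """UEP = nominal probability times the product of PSF multipliers."""
    uep = nominal
    for multiplier in psfs.values():
        uep *= multiplier
    return min(uep, 1.0)  # a probability cannot exceed 1

uep = usability_error_probability(NOMINAL_HEP, psf_multipliers)
print(f"UEP = {uep:.4f}")  # 0.001 * 2 * 5 * 1 = 0.01
```

As the abstract notes, the result is not a literal error probability; its value lies in ranking usability issues consistently once a consequence matrix is applied.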

  10. A hydrogen-oxygen rocket engine coolant passage design program (RECOP) for fluid-cooled thrust chambers and nozzles

    NASA Technical Reports Server (NTRS)

    Tomsik, Thomas M.

    1994-01-01

    The design of coolant passages in regeneratively cooled thrust chambers is critical to the operation and safety of a rocket engine system. Designing a coolant passage is a complex thermal and hydraulic problem requiring an accurate understanding of the heat transfer between the combustion gas and the coolant. Every major rocket engine company has invested in the development of thrust chamber computer design and analysis tools; two examples are Rocketdyne's REGEN code and Aerojet's ELES program. In an effort to augment current design capabilities for government and industry, the NASA Lewis Research Center is developing a computer model to design coolant passages for advanced regeneratively cooled thrust chambers. The RECOP code incorporates state-of-the-art correlations, numerical techniques and design methods, certainly minimum requirements for generating optimum designs of future space chemical engines. A preliminary version of the RECOP model was recently completed and code validation work is in progress. This paper introduces major features of RECOP and compares the analysis to design points for the first test case engine; the Pratt & Whitney RL10A-3-3A thrust chamber.

  11. 14 CFR 417.231 - Collision avoidance analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Collision avoidance analysis. 417.231..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.231 Collision avoidance analysis. (a) General. A flight safety analysis must include a collision avoidance analysis that...

  12. The ELF in Your Library.

    ERIC Educational Resources Information Center

    McKimmie, Tim; Smith, Jeanette

    1994-01-01

    Presents an overview of the issues related to extremely low frequency (ELF) radiation from computer video display terminals. Highlights include electromagnetic fields; measuring ELF; computer use in libraries; possible health effects; electromagnetic radiation; litigation and legislation; standards and safety; and what libraries can do. (Contains…

  13. 10 CFR 2.4 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and safety or the common defense and security; security measures for the physical protection and... computer that contains the participant's name, e-mail address, and participant's digital signature, proves... inspection. It is also the place where NRC makes computer terminals available to access the Publicly...

  14. 10 CFR 2.4 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and safety or the common defense and security; security measures for the physical protection and... computer that contains the participant's name, e-mail address, and participant's digital signature, proves... inspection. It is also the place where NRC makes computer terminals available to access the Publicly...

  15. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Keller, J.; Wallen, R.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  16. Light aircraft crash safety program

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.; Hayduk, R. J.

    1974-01-01

    NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.

  17. NASA HPCC Technology for Aerospace Analysis and Design

    NASA Technical Reports Server (NTRS)

    Schulbach, Catherine H.

    1999-01-01

    The Computational Aerosciences (CAS) Project is part of NASA's High Performance Computing and Communications Program. Its primary goal is to accelerate the availability of high-performance computing technology to the US aerospace community-thus providing the US aerospace community with key tools necessary to reduce design cycle times and increase fidelity in order to improve safety, efficiency and capability of future aerospace vehicles. A complementary goal is to hasten the emergence of a viable commercial market within the aerospace community for the advantage of the domestic computer hardware and software industry. The CAS Project selects representative aerospace problems (especially design) and uses them to focus efforts on advancing aerospace algorithms and applications, systems software, and computing machinery to demonstrate vast improvements in system performance and capability over the life of the program. Recent demonstrations have served to assess the benefits of possible performance improvements while reducing the risk of adopting high-performance computing technology. This talk will discuss past accomplishments in providing technology to the aerospace community, present efforts, and future goals. For example, the times to do full combustor and compressor simulations (of aircraft engines) have been reduced by factors of 320:1 and 400:1 respectively. While this has enabled new capabilities in engine simulation, the goal of an overnight, dynamic, multi-disciplinary, 3-dimensional simulation of an aircraft engine is still years away and will require new generations of high-end technology.

  18. Analysis of crash proportion by vehicle type at traffic analysis zone level: A mixed fractional split multinomial logit modeling approach with spatial effects.

    PubMed

    Lee, Jaeyoung; Yasmin, Shamsunnahar; Eluru, Naveen; Abdel-Aty, Mohamed; Cai, Qing

    2018-02-01

    In the traffic safety literature, crash frequency variables are analyzed using univariate or multivariate count models. In this study, we propose an alternative approach to modeling multiple crash frequency dependent variables. Instead of modeling the frequency of crashes, we propose to analyze the proportion of crashes by vehicle type. A flexible mixed multinomial logit fractional split model is employed for analyzing the proportions of crashes by vehicle type at the macro-level. In this model, the proportion allocated to an alternative is probabilistically determined based on the alternative's propensity as well as the propensity of all other alternatives. Thus, exogenous variables directly affect all alternatives. The approach is well suited to accommodating a large number of alternatives without a sizable increase in computational burden. The model was estimated using crash data at the Traffic Analysis Zone (TAZ) level from Florida. The modeling results clearly illustrate the applicability of the proposed framework for crash proportion analysis. Further, the Excess Predicted Proportion (EPP), a screening performance measure analogous to the Highway Safety Manual (HSM) Excess Predicted Average Crash Frequency, is proposed for hot zone identification. Using EPP, a statewide screening exercise for the various vehicle types considered in our analysis was undertaken. The screening results revealed that the spatial pattern of hot zones differs substantially across the vehicle types considered. Copyright © 2017 Elsevier Ltd. All rights reserved.
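The core of the fractional split idea is that predicted proportions come from a logit (softmax) of linear propensity functions, so every exogenous variable shifts all alternatives and the proportions sum to one by construction. A minimal sketch follows; the vehicle types, attributes, and coefficients are invented for illustration, not the paper's estimates.

```python
import math

# Hypothetical coefficients for three vehicle types; "vmt" stands in
# for a zone-level exposure attribute (vehicle miles traveled).
coefficients = {
    "car":        {"intercept": 1.0, "vmt": 0.02},
    "truck":      {"intercept": 0.2, "vmt": 0.05},
    "motorcycle": {"intercept": 0.0, "vmt": 0.00},  # base alternative
}

def crash_proportions(zone: dict) -> dict:
    """Expected crash proportions by type: softmax over linear propensities."""
    utilities = {
        k: c["intercept"] + c["vmt"] * zone["vmt"]
        for k, c in coefficients.items()
    }
    denom = sum(math.exp(u) for u in utilities.values())
    return {k: math.exp(u) / denom for k, u in utilities.items()}

props = crash_proportions({"vmt": 10.0})
assert abs(sum(props.values()) - 1.0) < 1e-12  # proportions sum to one
```

Because normalization is built into the softmax, adding a fourth or fortieth vehicle type only lengthens the dictionary; this is the property the abstract credits for keeping computational burden low as alternatives grow.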

  19. RAVEN: a GUI and an Artificial Intelligence Engine in a Dynamic PRA Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. Rabiti; D. Mandelli; A. Alfonsi

    Increases in computational power and pressure for more accurate simulations and estimations of accident scenario consequences are driving the need for Dynamic Probabilistic Risk Assessment (PRA) [1] of very complex models. While more sophisticated algorithms and computational power address the back end of this challenge, the front end is still handled by engineers who need to extract meaningful information from the large amount of data and build these complex models. Compounding this problem is the difficulty of knowledge transfer and retention, and the increasing speed of software development. The above-described issues would have negatively impacted deployment of the new high-fidelity plant simulator RELAP-7 (Reactor Excursion and Leak Analysis Program) at Idaho National Laboratory. Therefore RAVEN, which was initially focused on being the plant controller for RELAP-7, will help mitigate future RELAP-7 software engineering risks. In order to accomplish this task, the Reactor Analysis and Virtual Control Environment (RAVEN) has been designed to provide an easy-to-use Graphical User Interface (GUI) for building plant models and to leverage artificial intelligence algorithms in order to reduce computational time, improve results, and help the user identify the behavioral patterns of Nuclear Power Plants (NPPs). In this paper we present the GUI implementation and its current capability status. We also introduce the support vector machine algorithms and show our evaluation of their potential for increasing the accuracy and reducing the computational costs of PRA analysis. In this evaluation we refer to preliminary studies performed under the Risk Informed Safety Margins Characterization (RISMC) project of the Light Water Reactors Sustainability (LWRS) campaign [3]. RISMC simulation needs and algorithm testing are currently used as guidance to prioritize RAVEN developments relevant to PRA.

  20. Integrating Data Sources for Process Sustainability ...

    EPA Pesticide Factsheets

    To perform a chemical process sustainability assessment requires significant data about chemicals, process design specifications, and operating conditions. The required information includes the identity of the chemicals used, the quantities of the chemicals within the context of the sustainability assessment, physical properties of these chemicals, equipment inventory, as well as health, environmental, and safety properties of the chemicals. Much of these data are currently available to the process engineer either from the process design in the chemical process simulation software or online through chemical property and environmental, health, and safety databases. Examples of these databases include the U.S. Environmental Protection Agency's (USEPA's) Aggregated Computational Toxicology Resource (ACToR), the National Institute for Occupational Safety and Health's (NIOSH's) Hazardous Substance Database (HSDB), and the National Institute of Standards and Technology's (NIST's) Chemistry Webbook. This presentation will provide methods and procedures for extracting chemical identity and flow information from process design tools (such as chemical process simulators) and chemical property information from the online databases. The presentation will also demonstrate acquisition and compilation of the data for use in the EPA's GREENSCOPE process sustainability analysis tool. This presentation discusses acquisition of data for use in rapid LCI development.

  1. Use of Groundwater Lifetime Expectancy for the Performance Assessment of Deep Geologic Radioactive Waste Repositories.

    NASA Astrophysics Data System (ADS)

    Cornaton, F.; Park, Y.; Normani, S.; Sudicky, E.; Sykes, J.

    2005-12-01

    Long-term solutions for the disposal of toxic wastes usually involve isolation of the wastes in a deep subsurface geologic environment. In the case of spent nuclear fuel, the safety of the host repository depends on two main barriers: the engineered barrier and the natural geological barrier. If radionuclide leakage occurs from the engineered barrier, the geological medium represents the ultimate barrier that is relied upon to ensure safety. Consequently, an evaluation of radionuclide travel times from the repository to the biosphere is critically important in a performance assessment analysis. In this study, we develop a travel time framework based on the concept of groundwater lifetime expectancy as a safety indicator. Lifetime expectancy characterizes the time radionuclides will spend in the subsurface after their release from the repository and prior to discharging into the biosphere. The probability density function of lifetime expectancy is computed throughout the host rock by solving the backward-in-time solute transport equation subject to a properly posed set of boundary conditions. It can then be used to define optimal repository locations. In a second step, the risk associated with selected sites can be evaluated by simulating an appropriate contaminant release history. The proposed methodology is applied in the context of a typical Canadian Shield environment. Based on a statistically generated three-dimensional network of fracture zones embedded in the granitic host rock, the sensitivity and uncertainty of lifetime expectancy with respect to the hydraulic and dispersive properties of the fracture network, including the impact of conditioning via their surface expressions, are computed in order to demonstrate the utility of the methodology.
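    The lifetime-expectancy concept can be illustrated numerically. The study solves the backward-in-time transport equation; the sketch below instead uses forward random-walk particle tracking in a 1-D advective-dispersive column, which estimates the same travel-time distribution for this simple geometry. All parameter values are hypothetical.

```python
import numpy as np

# Hypothetical 1-D aquifer: release point at x = 0, discharge boundary at x = L.
v, D, L, dt = 1.0, 0.05, 10.0, 0.01   # velocity, dispersion, distance, time step
rng = np.random.default_rng(0)

def travel_times(n_particles=2000):
    """Track particles (advection + random-walk dispersion) until they
    cross the discharge boundary; return their travel times."""
    x = np.zeros(n_particles)
    t = np.zeros(n_particles)
    active = np.ones(n_particles, bool)
    while active.any():
        n = active.sum()
        x[active] += v * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(n)
        t[active] += dt
        active &= x < L                # freeze particles that have discharged
    return t

times = travel_times()
# The empirical distribution of `times` approximates the lifetime-expectancy
# PDF at the release point; its mean ≈ L/v for advection-dominated transport.
print(times.mean())
```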

  2. A meta-model for computer executable dynamic clinical safety checklists.

    PubMed

    Nan, Shan; Van Gorp, Pieter; Lu, Xudong; Kaymak, Uzay; Korsten, Hendrikus; Vdovjak, Richard; Duan, Huilong

    2017-12-12

    A safety checklist is a type of cognitive tool that reinforces the short-term memory of medical workers, with the purpose of reducing medical errors caused by oversight and neglect. To facilitate the daily use of safety checklists, computerized systems embedded in the clinical workflow and adapted to patient context are increasingly developed. However, the current hard-coded approach to implementing checklists in these systems increases the cognitive effort of clinical experts and the coding effort of informaticists. This is due to the lack of a formal representation format that is both understandable by clinical experts and executable by computer programs. We developed a dynamic checklist meta-model with a three-step approach. Dynamic checklist modeling requirements were extracted by performing a domain analysis. Then, existing modeling approaches and tools were investigated with the purpose of reusing these languages. Finally, the meta-model was developed by eliciting domain concepts and their hierarchies. The feasibility of using the meta-model was validated by two case studies. The meta-model was mapped to specific modeling languages according to the requirements of hospitals. Using the proposed meta-model, a comprehensive coronary artery bypass graft peri-operative checklist set and a percutaneous coronary intervention peri-operative checklist set have been developed in a Dutch hospital and a Chinese hospital, respectively. The results show that it is feasible to use the meta-model to facilitate the modeling and execution of dynamic checklists. We propose a novel meta-model for dynamic checklists with the purpose of facilitating their creation. The meta-model is a framework for reusing existing modeling languages and tools to model dynamic checklists. The feasibility of using the meta-model was validated by implementing a use case in the system.

  3. A Bayesian procedure for evaluating the frequency of calibration factor updates in highway safety manual (HSM) applications.

    PubMed

    Saha, Dibakar; Alluri, Priyanka; Gan, Albert

    2017-01-01

    The Highway Safety Manual (HSM) presents statistical models to quantitatively estimate an agency's safety performance. The models were developed using data from only a few U.S. states. To account for the effects of local attributes and temporal factors on crash occurrence, agencies are required to calibrate the HSM-default models for crash predictions. The manual suggests updating calibration factors every two to three years, or preferably on an annual basis. Given that the calibration process involves substantial time, effort, and resources, a comprehensive analysis of the required calibration factor update frequency is valuable to agencies. Accordingly, the objective of this study is to evaluate the HSM's recommendation and determine the required frequency of calibration factor updates. A robust Bayesian estimation procedure is used to assess the variation between calibration factors computed annually, biennially, and triennially using data collected from over 2400 miles of segments and over 700 intersections on urban and suburban facilities in Florida. The Bayesian model yields a posterior distribution of the model parameters that gives credible information for inferring whether the difference between calibration factors computed at specified intervals is credibly different from the null value, which represents unaltered calibration factors between the comparison years, or, in other words, a zero difference. The concept of the null value is extended to include the range of values that are practically equivalent to zero. Bayesian inference shows that calibration factors based on total crash frequency need to be updated every two years in cases where the variations between calibration factors are not greater than 0.01. When the variations are between 0.01 and 0.05, calibration factors based on total crash frequency could be updated every three years. Copyright © 2016 Elsevier Ltd. All rights reserved.
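    For reference, the calibration factor discussed here is, per the HSM, the ratio of total observed crashes to total model-predicted crashes over the calibration sites. A minimal sketch with hypothetical counts (not the study's data):

```python
def calibration_factor(observed, predicted):
    """HSM calibration factor: sum of observed crash counts divided by
    the sum of HSM-default model predictions over calibration sites."""
    return sum(observed) / sum(predicted)

# Hypothetical per-site crash data for one calibration year.
obs_year1 = [12, 7, 30, 5]            # observed crash counts
pred_year1 = [10.2, 8.1, 24.6, 6.3]   # HSM-default model predictions

C = calibration_factor(obs_year1, pred_year1)
print(round(C, 3))                    # 54 / 49.2 ≈ 1.098
```

Comparing such factors across update intervals (annual, biennial, triennial), as the study does with a Bayesian posterior, is what determines whether the recalibration effort is warranted.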

  4. Public safety answering point readiness for wireless E-911 in New York State.

    PubMed

    Bailey, Bob W; Scott, Jay M; Brown, Lawrence H

    2003-01-01

    To determine the level of wireless enhanced 911 readiness among New York's primary public safety answering points. This descriptive study utilized a simple, single-page survey that was distributed in August 2001, with telephone follow-up concluding in January 2002. Surveys were distributed to directors of the primary public safety answering points in each of New York's 62 counties. Information was requested regarding current readiness for providing wireless enhanced 911 service, hardware and software needs for implementing the service, and the estimated costs of obtaining the necessary hardware and software. Two directors did not respond and could not be contacted by telephone; three declined participation; one did not operate an answering point; and seven provided incomplete responses, resulting in usable data from 49 (79%) of the state's public safety answering points. Only 27% of the responding public safety answering points were currently wireless enhanced 911 ready. Specific needs included obtaining or upgrading computer systems (16%), computer-aided dispatch systems (53%), mapping software (71%), telephone systems (27%), and local exchange carrier trunk lines (42%). The total estimated hardware and software costs for achieving wireless enhanced 911 readiness were between 16 million and 20 million dollars. New York's primary public safety answering points are not currently ready to provide wireless enhanced 911 service, and the cost of achieving readiness could be as high as 20 million dollars.

  5. RELAP-7 Code Assessment Plan and Requirement Traceability Matrix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Junsoo; Choi, Yong-joon; Smith, Curtis L.

    2016-10-01

    RELAP-7, a safety analysis code for nuclear reactor systems, is under development at Idaho National Laboratory (INL). Overall, the code development is directed towards leveraging the advancements in computer science technology, numerical solution methods, and physical models made over the last decades. Recently, INL has also put effort into establishing a code assessment plan, which aims to ensure an improved final product quality through the RELAP-7 development process. The ultimate goal of this plan is to propose a suitable way to systematically assess the wide range of software requirements for RELAP-7, including the software design, user interface, and technical requirements. To this end, we first survey the literature (i.e., international/domestic reports, research articles) addressing the desirable features generally required for advanced nuclear system safety analysis codes. In addition, the V&V (verification and validation) efforts as well as the legacy issues of several recently developed codes (e.g., RELAP5-3D, TRACE V5.0) are investigated. Lastly, this paper outlines the Requirement Traceability Matrix (RTM) for RELAP-7, which can be used to systematically evaluate and identify the code development process and its present capability.

  6. Methodology, status and plans for development and assessment of TUF and CATHENA codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luxat, J.C.; Liu, W.S.; Leung, R.K.

    1997-07-01

    An overview is presented of the Canadian two-fluid computer codes TUF and CATHENA, with specific focus on the constraints imposed during development of these codes and the areas of application for which they are intended. Additionally, a process for systematic assessment of these codes is described, which is part of a broader, industry-based initiative for validation of computer codes used in all major disciplines of safety analysis. This is intended to provide both the licensee and the regulator in Canada with an objective basis for assessing the adequacy of codes for use in specific applications. Although focused specifically on CANDU reactors, Canadian experience in developing advanced two-fluid codes to meet wide-ranging application needs while maintaining past investment in plant modelling provides a useful contribution to international efforts in this area.

  7. Structural Evaluation of Exo-Skeletal Engine Fan Blades

    NASA Technical Reports Server (NTRS)

    Kuguoglu, Latife; Abumeri, Galib; Chamis, Christos C.

    2003-01-01

    The available computational simulation capability is used to demonstrate the structural viability of composite fan blades of the innovative Exo-Skeletal Engine (ESE), developed at NASA Glenn Research Center for a subsonic mission. Full structural analysis and progressive damage evaluation of the ESE composite fan blade are conducted with the NASA in-house computational simulation software system EST/BEST. The results of the structural assessment indicate that the longitudinal stresses acting on the blade are compressive. At the design speed of 2000 rpm, the pressure- and suction-surface outermost-ply stresses in the longitudinal, transverse, and shear directions are much lower than the corresponding composite ply strengths. Damage is initiated at 4870 rpm, and blade fracture takes place at a rotor speed of 7735 rpm. The damage volume is 51 percent. The progressive damage, buckling, stress, and strength results indicate that the design at hand is very sound because of its factor of safety, damage tolerance, and buckling load of 6811 rpm.

  8. The role of mobile computed tomography in mass fatality incidents.

    PubMed

    Rutty, Guy N; Robinson, Claire E; BouHaidar, Ralph; Jeffery, Amanda J; Morgan, Bruno

    2007-11-01

    Mobile multi-detector computed tomography (MDCT) scanners are potentially available to temporary mortuaries and can be operational within 20 min of arrival. We describe, to our knowledge, the first use of mobile MDCT for a mass fatality incident. A mobile MDCT scanner attended the disaster mortuary after a five vehicle road traffic incident. Five out of six bodies were successfully imaged by MDCT in c. 15 min per body. Subsequent full radiological analysis took c. 1 h per case. The results were compared to the autopsy examinations. We discuss the advantages and disadvantages of imaging with mobile MDCT in relation to mass fatality work, illustrating the body pathway process, and its role in the identification of the pathology, personal effects, and health and safety hazards. We propose that the adoption of a single modality of mobile MDCT could replace the current use of multiple radiological sources within a mass fatality mortuary.

  9. Development of a Microsoft Excel tool for one-parameter Rasch model of continuous items: an application to a safety attitude survey.

    PubMed

    Chien, Tsair-Wei; Shao, Yang; Kuo, Shu-Chun

    2017-01-10

    Many continuous item responses (CIRs) are encountered in healthcare settings, but item response theory's (IRT) probabilistic modeling has not been used to provide graphical presentations for interpreting CIR results. A computer module programmed to deal with CIRs is required. The aims were to present such a computer module, validate it, verify its usefulness in dealing with CIR data, and then apply the model to real healthcare data in order to show how CIRs can be applied in healthcare settings, with an example regarding a safety attitude survey. Using Microsoft Excel VBA (Visual Basic for Applications), we designed a computer module that minimizes the residuals and calculates the model's expected scores according to person responses across items. Rasch models based on a Wright map and on KIDMAP were demonstrated to interpret the results of the safety attitude survey. The author-made CIR module yielded OUTFIT mean square (MNSQ) and person measures equivalent to those yielded by the professional Rasch Winsteps software. The probabilistic modeling of the CIR module provides messages that are much more valuable to users and shows the CIR advantage over classical test theory. Because of advances in computer technology, healthcare users who are familiar with MS Excel can easily apply the study's CIR module to deal with continuous variables, to the benefit of comparisons of data with a logistic distribution and model fit statistics.
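    The module's core computation, as described (minimizing residuals against Rasch-type expected scores), can be sketched as follows. The item difficulties and responses are hypothetical, and a simple grid search stands in for whatever solver the Excel module actually uses:

```python
import math

def expected(theta, b):
    """Rasch-type expected score (in [0, 1]) for a person with ability
    theta on an item of difficulty b."""
    return 1.0 / (1.0 + math.exp(b - theta))

def estimate_theta(responses, difficulties):
    """Grid-search the person measure theta that minimizes the sum of
    squared residuals between observed and expected scores."""
    grid = [i / 100.0 for i in range(-400, 401)]   # theta in [-4, 4]
    def sse(theta):
        return sum((r - expected(theta, b)) ** 2
                   for r, b in zip(responses, difficulties))
    return min(grid, key=sse)

# Hypothetical items and a simulated person with true theta = 0.7.
difficulties = [-1.0, 0.0, 1.0]
responses = [expected(0.7, b) for b in difficulties]
print(estimate_theta(responses, difficulties))     # recovers ≈ 0.7
```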

  10. A Systematic Investigation of Computation Models for Predicting Adverse Drug Reactions (ADRs)

    PubMed Central

    Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong

    2014-01-01

    Background Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computational models to obtain general conclusions that can provide useful guidance for constructing more effective computational models to predict ADRs. Principal Findings In the current study, the main work is to compare and analyze the performance of existing computational methods for predicting ADRs, by implementing and evaluating additional algorithms that have previously been used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, we found that the final formulas of these algorithms could all be converted to linear models in form; based on this finding, we propose a new algorithm, called the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Conclusion Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms. PMID:25180585
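    The role of the Jaccard coefficient in drug-ADR prediction can be illustrated with a toy neighborhood scorer. The drugs and ADR profiles below are hypothetical, and this simplified scorer is not the authors' general weighted profile method:

```python
def jaccard(a, b):
    """Jaccard coefficient between two sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical drug -> known-ADR profiles.
known_adrs = {
    "drugA": {"nausea", "rash", "headache"},
    "drugB": {"nausea", "rash"},
    "drugC": {"dizziness"},
}

def score(query_drug, adr):
    """Score a candidate drug-ADR pair by summing the Jaccard similarity
    between the query drug and every other drug known to cause that ADR."""
    return sum(jaccard(known_adrs[query_drug], profile)
               for drug, profile in known_adrs.items()
               if drug != query_drug and adr in profile)

# drugA resembles drugB (which causes nausea) far more than drugC
# (which causes dizziness), so the nausea association scores higher.
print(score("drugA", "nausea") > score("drugA", "dizziness"))  # True
```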

  11. Disrupting Aviation: An Exploratory Study of the Opportunities and Risks of Tablet Computers in Commercial Flight Operations

    ERIC Educational Resources Information Center

    Boyne, Matthew

    2013-01-01

    Commercial flight operational safety has dramatically improved in the last 30 years because of enhanced crew coordination, communication, leadership and team development. Technology insertion into cockpit operations, however, has been shown to create crew distractions, resulting in flight safety risks, limited use given policy limitations and…

  12. 16 CFR 1211.5 - General testing parameters.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... § 1211.4(c) for compliance with the Standard for Safety for Tests for Safety-Related Controls Employing... vibration level of 5g is to be used for the Vibration Test. (6) When a Computational Investigation is... tested. (8) The Endurance test is to be conducted concurrently with the Operational test. The control...

  13. 16 CFR § 1211.5 - General testing parameters.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... covered by § 1211.4(c) for compliance with the Standard for Safety for Tests for Safety-Related Controls... vibration level of 5g is to be used for the Vibration Test. (6) When a Computational Investigation is... tested. (8) The Endurance test is to be conducted concurrently with the Operational test. The control...

  14. 16 CFR 1211.5 - General testing parameters.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... § 1211.4(c) for compliance with the Standard for Safety for Tests for Safety-Related Controls Employing... vibration level of 5g is to be used for the Vibration Test. (6) When a Computational Investigation is... tested. (8) The Endurance test is to be conducted concurrently with the Operational test. The control...

  15. 16 CFR 1211.5 - General testing parameters.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... § 1211.4(c) for compliance with the Standard for Safety for Tests for Safety-Related Controls Employing... vibration level of 5g is to be used for the Vibration Test. (6) When a Computational Investigation is... tested. (8) The Endurance test is to be conducted concurrently with the Operational test. The control...

  16. Games that ''Work'': Using Computer Games to Teach Alcohol-Affected Children about Fire and Street Safety

    ERIC Educational Resources Information Center

    Coles, Claire D.; Strickland, Dorothy C.; Padgett, Lynne; Bellmoff, Lynnae

    2007-01-01

    Unintentional injuries are a leading cause of death and disability for children. Those with developmental disabilities, including children affected by prenatal alcohol exposure, are at highest risk for injuries. Although teaching safety skills is recommended to prevent injury, cognitive limitations and behavioral problems characteristic of…

  17. Food safety education using an interactive multimedia kiosk in a WIC setting: correlates of client satisfaction and practical issues.

    PubMed

    Trepka, Mary Jo; Newman, Frederick L; Huffman, Fatma G; Dixon, Zisca

    2010-01-01

    To assess the acceptability of food safety education delivered by interactive multimedia (IMM) in a Supplemental Nutrition Program for Women, Infants and Children (WIC) clinic. Female clients or caregivers (n=176) completed a food-handling survey and then an IMM food safety education program on a computer kiosk. Satisfaction with the program, participant demographics, and change in food-handling behavior were assessed by univariate analyses. Over 90% of the participants enjoyed the kiosk, and most (87.5%) reported using computers a lot. Compared with participants with education beyond high school, participants with less education were more likely to report enjoying the kiosk (98.2% vs 88.1%, P = .007), to prefer learning with the kiosk (91.7% vs 79.1%, P = .02), and to want to learn about other topics using IMM (95.4% vs 86.6%, P = .04). Food safety education delivered by IMM was well accepted by inner-city WIC clinic clients, including those with less education. Copyright 2010 Society for Nutrition Education. Published by Elsevier Inc. All rights reserved.

  18. Computing Q-D Relationships for Storage of Rocket Fuels

    NASA Technical Reports Server (NTRS)

    Jester, Keith

    2005-01-01

    The Quantity Distance Measurement Tool is a GIS-based computer program that aids safety engineers by calculating quantity-distance (Q-D) relationships for vessels that contain explosive chemicals used in testing rocket engines. (Q-D relationships are standard relationships between specified quantities of specified explosive materials and the minimum distances by which they must be separated from persons, objects, and other explosives to obtain specified types and degrees of protection.) The program uses customized geographic-information-system (GIS) software and calculates Q-D relationships in accordance with NASA's Safety Standard for Explosives, Propellants, and Pyrotechnics. Displays generated by the program enable the identification of hazards, showing the relationships of propellant-storage-vessel safety buffers to inhabited facilities and public roads. Current Q-D information is calculated and maintained in graphical form for all vessels that contain propellants or other chemicals whose explosiveness is expressed in TNT equivalents [amounts of trinitrotoluene (TNT) having equivalent explosive effects]. The program is useful in the acquisition, siting, construction, and/or modification of storage vessels and other facilities in the development of an improved test-facility safety program.
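    Although the NASA standard's exact tables are not reproduced here, quantity-distance calculations of this kind commonly take a cube-root scaled-distance form, D = K · W^(1/3), where W is the TNT-equivalent mass and K a protection-level factor. A hedged sketch with placeholder values (the K factor and mass are hypothetical, not values from the NASA standard):

```python
def qd_distance(tnt_equivalent_lb, k_factor):
    """Cube-root scaled-distance form common to explosive-siting criteria:
    separation distance D = K * W**(1/3). K depends on the protection
    level required (inhabited building, public road, etc.)."""
    return k_factor * tnt_equivalent_lb ** (1.0 / 3.0)

# Hypothetical vessel: 1000 lb TNT equivalent, placeholder K = 40.
print(round(qd_distance(1000.0, 40.0), 1))   # 40 * 10 = 400.0
```

The GIS layer of such a tool then draws each vessel's computed distance as a buffer ring and flags inhabited facilities or roads that fall inside it.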

  19. Advanced Software V&V for Civil Aviation and Autonomy

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.

    2017-01-01

    With the advances in high-performance computing platforms (e.g., advanced graphics processing units and multi-core processors), computationally intensive software techniques such as those used in artificial intelligence or formal methods have given us an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building in safety at design time, as in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus also help reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.

  20. Comprehensive Safety Analysis 2010 Safety Measurement System (SMS) Methodology, Version 2.1 Revised December 2010

    DOT National Transportation Integrated Search

    2010-12-01

    This report documents the Safety Measurement System (SMS) methodology developed to support the Comprehensive Safety Analysis 2010 (CSA 2010) Initiative for the Federal Motor Carrier Safety Administration (FMCSA). The SMS is one of the major tools for...

  1. Overview of Energy Systems' safety analysis report programs. Safety Analysis Report Update Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-03-01

    The primary purpose of a Safety Analysis Report (SAR) is to provide a basis for judging the adequacy of a facility's safety. The SAR documents the safety analyses that systematically identify the hazards posed by the facility, analyze the consequences and risk of potential accidents, and describe hazard control measures that protect the health and safety of the public and employees. In addition, some SARs document, as Technical Safety Requirements (TSRs, which include Technical Specifications and Operational Safety Requirements), technical and administrative requirements that ensure the facility is operated within prescribed safety limits. SARs also provide conveniently summarized information that may be used to support procedure development, training, inspections, and other activities necessary to facility operation. This "Overview of Energy Systems Safety Analysis Report Programs" provides an introduction to the programs and processes used in the development and maintenance of the SARs. It also summarizes some of the uses of the SARs within Energy Systems and DOE.

  2. Software Safety Progress in NASA

    NASA Technical Reports Server (NTRS)

    Radley, Charles F.

    1995-01-01

    NASA has developed guidelines for the development and analysis of safety-critical software. These guidelines are documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how-to' approach to assist software developers and safety analysts with cost-effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13, which was released as an 'Interim' version in June 1994 and scheduled for formal adoption in late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety-critical software development and analysis.

  3. C-Band Airport Surface Communications System Engineering-Initial High-Level Safety Risk Assessment and Mitigation

    NASA Technical Reports Server (NTRS)

    Zelkin, Natalie; Henriksen, Stephen

    2011-01-01

    This document is being provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract: "New ATM Requirements--Future Communications, C-Band and L-Band Communications Standard Development." ITT has completed a safety hazard analysis providing a preliminary safety assessment for the proposed C-band (5091- to 5150-MHz) airport surface communication system. The assessment was performed following the guidelines outlined in the Federal Aviation Administration Safety Risk Management Guidance for System Acquisitions document. The safety analysis did not identify any hazards with an unacceptable risk, though a number of hazards with a medium risk were documented. This effort represents an initial high-level safety hazard analysis and notes the triggers for risk reassessment. A detailed safety hazards analysis is recommended as a follow-on activity to assess particular components of the C-band communication system after the profile is finalized and system rollout timing is determined. A security risk assessment has been performed by NASA as a parallel activity. While safety analysis is concerned with a prevention of accidental errors and failures, the security threat analysis focuses on deliberate attacks. Both processes identify the events that affect operation of the system; and from a safety perspective the security threats may present safety risks.

  4. Applying the School Health Index to a nationally representative sample of schools: update for 2006.

    PubMed

    Brener, Nancy D; Pejavara, Anu; McManus, Tim

    2011-02-01

    The School Health Index (SHI) is a tool designed to help schools assess the extent to which they are implementing practices included in the research-based guidelines and strategies for school health and safety programs developed by the Centers for Disease Control and Prevention (CDC). CDC previously analyzed data from the 2000 School Health Policies and Programs Study (SHPPS) to determine the percentage of US schools meeting the recommendations in the SHI. A new edition of the SHI (2005) and the availability of 2006 SHPPS data made it necessary to update and repeat the analysis. SHPPS 2006 data were collected through computer-assisted personal interviews with faculty and staff in a nationally representative sample of schools. The data were then matched to SHI items to calculate the percentage of schools meeting the recommendations in 4 areas: school health and safety policies and environment, health education, physical education and other physical activity programs, and nutrition services. In accordance with the earlier findings, the present analysis indicated that schools nationwide were focusing their efforts on a few policies and programs rather than addressing the entire set of recommendations in the SHI. The percentage of items related to nutrition that schools met remained high, and an increase occurred in the percentage of items that schools met related to school health and safety policies and environment. More work needs to be done to assist schools in implementing school health policies and practices; this analysis helps identify specific areas where improvement is needed. © Published 2011. This article is a US Government work and is in the public domain in the USA.

  5. DUKSUP: A Computer Program for High Thrust Launch Vehicle Trajectory Design and Optimization

    NASA Technical Reports Server (NTRS)

    Williams, C. H.; Spurlock, O. F.

    2014-01-01

    From the late 1960's through 1997, the leadership of NASA's Intermediate and Large class unmanned expendable launch vehicle projects resided at the NASA Lewis (now Glenn) Research Center (LeRC). One of LeRC's primary responsibilities --- trajectory design and performance analysis --- was accomplished by an internally-developed analytic three dimensional computer program called DUKSUP. Because of its Calculus of Variations-based optimization routine, this code was generally more capable of finding optimal solutions than its contemporaries. A derivation of optimal control using the Calculus of Variations is summarized including transversality, intermediate, and final conditions. The two point boundary value problem is explained. A brief summary of the code's operation is provided, including iteration via the Newton-Raphson scheme and integration of variational and motion equations via a 4th order Runge-Kutta scheme. Main subroutines are discussed. The history of the LeRC trajectory design efforts in the early 1960's is explained within the context of supporting the Centaur upper stage program. How the code was constructed based on the operation of the Atlas/Centaur launch vehicle, the limits of the computers of that era, the limits of the computer programming languages, and the missions it supported are discussed. The vehicles DUKSUP supported (Atlas/Centaur, Titan/Centaur, and Shuttle/Centaur) are briefly described. The types of missions, including Earth orbital and interplanetary, are described. The roles of flight constraints and their impact on launch operations are detailed (such as jettisoning hardware on heating, Range Safety, ground station tracking, and elliptical parking orbits). The computer main frames on which the code was hosted are described. The applications of the code are detailed, including independent check of contractor analysis, benchmarking, leading edge analysis, and vehicle performance improvement assessments. 
Several of DUKSUP's many major impacts on launches are discussed including Intelsat, Voyager, Pioneer Venus, HEAO, Galileo, and Cassini.
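    The classical 4th-order Runge-Kutta scheme mentioned above is generic enough to sketch. The toy state equations below (constant-gravity vertical ascent) are illustrative stand-ins, not DUKSUP's motion or variational equations:

```python
def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for y' = f(t, y),
    where y is a list of state variables."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h,     [yi + h * ki     for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

g = 9.80665

def motion(t, y):
    """Toy motion equations: y = [altitude, vertical speed], constant gravity."""
    return [y[1], -g]

# Integrate 10 s of coasting flight from 100 m/s initial vertical speed.
y, t, h = [0.0, 100.0], 0.0, 0.1
for _ in range(100):
    y = rk4_step(motion, t, y, h)
    t += h

print(y[0])   # ≈ analytic altitude 100*10 - g*10**2/2
```

In a trajectory code, the same stepper would also integrate the variational equations alongside the motion equations, feeding the Newton-Raphson iteration that solves the two-point boundary value problem.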

  6. DUKSUP: A Computer Program for High Thrust Launch Vehicle Trajectory Design and Optimization

    NASA Technical Reports Server (NTRS)

    Spurlock, O. Frank; Williams, Craig H.

    2015-01-01

From the late 1960s through 1997, the leadership of NASA's Intermediate and Large class unmanned expendable launch vehicle projects resided at the NASA Lewis (now Glenn) Research Center (LeRC). One of LeRC's primary responsibilities --- trajectory design and performance analysis --- was accomplished by an internally-developed analytic three dimensional computer program called DUKSUP. Because of its Calculus of Variations-based optimization routine, this code was generally more capable of finding optimal solutions than its contemporaries. A derivation of optimal control using the Calculus of Variations is summarized including transversality, intermediate, and final conditions. The two point boundary value problem is explained. A brief summary of the code's operation is provided, including iteration via the Newton-Raphson scheme and integration of variational and motion equations via a 4th order Runge-Kutta scheme. Main subroutines are discussed. The history of the LeRC trajectory design efforts in the early 1960s is explained within the context of supporting the Centaur upper stage program. How the code was constructed based on the operation of the Atlas/Centaur launch vehicle, the limits of the computers of that era, the limits of the computer programming languages, and the missions it supported are discussed. The vehicles DUKSUP supported (Atlas/Centaur, Titan/Centaur, and Shuttle/Centaur) are briefly described. The types of missions, including Earth orbital and interplanetary, are described. The roles of flight constraints and their impact on launch operations are detailed (such as jettisoning hardware on heating, Range Safety, ground station tracking, and elliptical parking orbits). The computer main frames on which the code was hosted are described. The applications of the code are detailed, including independent check of contractor analysis, benchmarking, leading edge analysis, and vehicle performance improvement assessments.
Several of DUKSUP's many major impacts on launches are discussed including Intelsat, Voyager, Pioneer Venus, HEAO, Galileo, and Cassini.
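The abstract names two numerical workhorses: Newton-Raphson iteration on the boundary conditions and 4th-order Runge-Kutta integration of the motion and variational equations. DUKSUP's own implementation is not public in this record; the following is only a minimal sketch of the classical RK4 step applied to an illustrative constant-thrust vertical-ascent model (the `motion` function and all numbers are assumptions for demonstration, not DUKSUP's equations):

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def motion(t, y):
    """Toy equations of motion: state y = [altitude, velocity],
    constant thrust acceleration minus gravity (illustrative numbers)."""
    g = 9.80665          # m/s^2
    thrust_accel = 30.0  # assumed constant thrust acceleration, m/s^2
    return np.array([y[1], thrust_accel - g])

y = np.array([0.0, 0.0])  # start at rest on the pad
t, h = 0.0, 0.1
for _ in range(100):      # integrate 10 s of flight
    y = rk4_step(motion, t, y, h)
    t += h
```

For this linear toy model RK4 reproduces the exact solution to round-off (about 201.93 m/s and 1009.67 m after 10 s); the production code integrated the full three-dimensional variational and motion equations.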

  7. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology

    PubMed Central

    Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J.; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M. E. (Bette); Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M.; Whelan, Maurice

    2017-01-01

Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24–25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as a guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. PMID:27994170

  8. Safety System Design for Technology Education. A Safety Guide for Technology Education Courses K-12.

    ERIC Educational Resources Information Center

    North Carolina State Dept. of Public Instruction, Raleigh. Div. of Vocational Education.

    This manual is designed to involve both teachers and students in planning and controlling a safety system for technology education classrooms. The safety program involves students in the design and maintenance of the system by including them in the analysis of the classroom environment, job safety analysis, safety inspection, and machine safety…

  9. 14 CFR Appendix B of Part 415 - Safety Review Document Outline

    Code of Federal Regulations, 2010 CFR

    2010-01-01

... Performed by Certified Personnel 4.0 Flight Safety (§ 415.115) 4.1 Initial Flight Safety Analysis 4.1.1 Flight Safety Sub-Analyses, Methods, and Assumptions 4.1.2 Sample Calculation and Products 4.1.3 Launch Specific Updates and Final Flight Safety Analysis Data 4.2 Radionuclide Data (where applicable) 4.3 Flight Safety...

  10. 14 CFR Appendix B of Part 415 - Safety Review Document Outline

    Code of Federal Regulations, 2012 CFR

    2012-01-01

... Performed by Certified Personnel 4.0 Flight Safety (§ 415.115) 4.1 Initial Flight Safety Analysis 4.1.1 Flight Safety Sub-Analyses, Methods, and Assumptions 4.1.2 Sample Calculation and Products 4.1.3 Launch Specific Updates and Final Flight Safety Analysis Data 4.2 Radionuclide Data (where applicable) 4.3 Flight Safety...

  11. 14 CFR Appendix B of Part 415 - Safety Review Document Outline

    Code of Federal Regulations, 2011 CFR

    2011-01-01

... Performed by Certified Personnel 4.0 Flight Safety (§ 415.115) 4.1 Initial Flight Safety Analysis 4.1.1 Flight Safety Sub-Analyses, Methods, and Assumptions 4.1.2 Sample Calculation and Products 4.1.3 Launch Specific Updates and Final Flight Safety Analysis Data 4.2 Radionuclide Data (where applicable) 4.3 Flight Safety...

  12. The FITS model office ergonomics program: a model for best practice.

    PubMed

    Chim, Justine M Y

    2014-01-01

An effective office ergonomics program can predict positive results in reducing musculoskeletal injury rates, enhancing productivity, and improving staff well-being and job satisfaction. Its objective is to provide a systematic solution to manage the potential risk of musculoskeletal disorders among computer users in an office setting. The FITS Model Office Ergonomics Program has been developed, drawing on the legislative requirements for promoting the health and safety of workers using computers for extended periods as well as previous research findings. The Model is developed according to practical industrial knowledge in ergonomics, occupational health and safety management, and human resources management in Hong Kong and overseas. This paper proposes a comprehensive office ergonomics program, the FITS Model, which considers (1) Furniture Evaluation and Selection; (2) Individual Workstation Assessment; (3) Training and Education; and (4) Stretching Exercises and Rest Breaks as elements of an effective program. An experienced ergonomics practitioner should be included in the program design and implementation. Through the FITS Model Office Ergonomics Program, the risk of musculoskeletal disorders among computer users can be eliminated or minimized, and workplace health, safety, and employee wellness enhanced.

  13. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from any...

  14. 41 CFR 102-80.130 - Who must perform the equivalent level of safety analysis?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire Prevention Equivalent Level of Safety Analysis... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Who must perform the equivalent level of safety analysis? 102-80.130 Section 102-80.130 Public Contracts and Property Management...

  15. A Synthetic Vision Preliminary Integrated Safety Analysis

    NASA Technical Reports Server (NTRS)

    Hemm, Robert; Houser, Scott

    2001-01-01

    This report documents efforts to analyze a sample of aviation safety programs, using the LMI-developed integrated safety analysis tool to determine the change in system risk resulting from Aviation Safety Program (AvSP) technology implementation. Specifically, we have worked to modify existing system safety tools to address the safety impact of synthetic vision (SV) technology. Safety metrics include reliability, availability, and resultant hazard. This analysis of SV technology is intended to be part of a larger effort to develop a model that is capable of "providing further support to the product design and development team as additional information becomes available". The reliability analysis portion of the effort is complete and is fully documented in this report. The simulation analysis is still underway; it will be documented in a subsequent report. The specific goal of this effort is to apply the integrated safety analysis to SV technology. This report also contains a brief discussion of data necessary to expand the human performance capability of the model, as well as a discussion of human behavior and its implications for system risk assessment in this modeling environment.
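The safety metrics named above (reliability, availability, and resultant hazard) can be illustrated with standard textbook relations. This sketch shows steady-state availability and constant-failure-rate mission reliability; it is not the LMI tool's actual model, and all numbers are illustrative assumptions:

```python
import math

def steady_state_availability(mtbf_hours, mttr_hours):
    """A = MTBF / (MTBF + MTTR): long-run fraction of time the system is operable."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def mission_reliability(failure_rate_per_hour, mission_hours):
    """R(t) = exp(-lambda * t): survival probability under an exponential
    (constant failure rate) model."""
    return math.exp(-failure_rate_per_hour * mission_hours)

# Illustrative numbers only (not taken from the report):
a = steady_state_availability(mtbf_hours=5000.0, mttr_hours=2.0)
r = mission_reliability(failure_rate_per_hour=1e-4, mission_hours=3.0)
```

An integrated analysis like the one described would combine such component-level figures with hazard severity to estimate the change in system risk from introducing the new technology.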

  16. The Garden Banks 388 horizontal tree design and development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granhaug, O.; Soul, J.

    1995-12-31

This paper describes the Horizontal Subsea Production Tree System, later referred to as a SpoolTree™, developed for the Enserch Garden Banks 388 field in the Gulf of Mexico. The paper starts with a project overview followed by a comparison between the SpoolTree and the Conventional Tree design. A brief discussion explains why Enserch elected to use the SpoolTree for this field development, including available technology, workover frequency, cost, etc. The rigorous safety analysis carried out for the subsea production equipment is then explained in depth. The paper continues with a technical discussion of the main features specific to the SpoolTree design and the Garden Banks 388 field development. Issues discussed include the SpoolTree itself, BOP Adapter Plate (for control during installation, workover and production), Tubing Hanger and pressure barrier design, debris cap design, downhole communication (SCSSV, chemical injection, pressure and temperature), ROV intervention, template wellbay insert design and other relevant issues. The use of a computer-based 3-D modelling tool is also briefly described. The experience and results described in this paper have direct application to numerous subsea development prospects worldwide, particularly in deep water. In addition, the "system development" aspect of the project is relevant to most marine equipment development projects. This includes the use of safety analysis techniques, 3-D computer modelling tools and clearly defined engineering procedures. A full account of the final design configuration of the SpoolTree system is given in the paper. A summary of the experience gained during the extensive testing at the factory and during the template integration tests is also provided.

17. SAGES TAVAC safety and effectiveness analysis: da Vinci® Surgical System (Intuitive Surgical, Sunnyvale, CA).

    PubMed

    Tsuda, Shawn; Oleynikov, Dmitry; Gould, Jon; Azagury, Dan; Sandler, Bryan; Hutter, Matthew; Ross, Sharona; Haas, Eric; Brody, Fred; Satava, Richard

    2015-10-01

The da Vinci® Surgical System (Intuitive Surgical, Sunnyvale, CA, USA) is a computer-assisted (robotic) surgical system designed to enable and enhance minimally invasive surgery. The Food and Drug Administration (FDA) has cleared computer-assisted surgical systems for use by trained physicians in an operating room environment for laparoscopic surgical procedures in general, cardiac, colorectal, gynecologic, head and neck, thoracic and urologic surgical procedures. There are substantial numbers of peer-reviewed papers regarding the da Vinci® Surgical System, and a thoughtful assessment of evidence framed by clinical opinion is warranted. The SAGES da Vinci® TAVAC sub-committee performed a literature review of the da Vinci® Surgical System regarding gastrointestinal surgery. Conclusions by the sub-committee were vetted by the SAGES TAVAC Committee and SAGES Executive Board. Following revisions, the document was evaluated by the TAVAC Committee and Executive Board again for final approval. Several conclusions were drawn based on expert opinion organized by safety, efficacy, and cost for robotic foregut, bariatric, hepatobiliary/pancreatic, colorectal surgery, and single-incision cholecystectomy. Gastrointestinal surgery with the da Vinci® Surgical System is safe and comparable, but not superior to standard laparoscopic approaches. Although clinically acceptable, its use may be costly for select gastrointestinal procedures. Current data are limited to the da Vinci® Surgical System; further analyses are needed.

  18. A Multi-Methods Approach to HRA and Human Performance Modeling: A Field Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques Hugo; David I Gertman

    2012-06-01

The Advanced Test Reactor (ATR), a research reactor at the Idaho National Laboratory, is primarily designed and used to test materials to be used in other, larger-scale and prototype reactors. The reactor offers various specialized systems and allows certain experiments to be run at their own temperature and pressure. The ATR Canal temporarily stores completed experiments and used fuel. It also has facilities to conduct underwater operations such as experiment examination or removal. In reviewing the ATR safety basis, a number of concerns were identified involving the ATR canal. A brief study identified ergonomic issues involving the manual handling of fuel elements in the canal that may increase the probability of human error and possible unwanted acute physical outcomes to the operator. In response to this concern, an analysis was conducted that refined the previous HRA scoping analysis by determining the probability of the inadvertent exposure of a fuel element to the air during fuel movement and inspection. The HRA analysis employed the SPAR-H method and was supplemented by information gained from a detailed analysis of the fuel inspection and transfer tasks. This latter analysis included ergonomics, work cycles, task duration, and workload imposed by tool and workplace characteristics, personal protective clothing, and operational practices that have the potential to increase physical and mental workload. Part of this analysis consisted of NASA-TLX analyses, combined with operational sequence analysis, computational human performance analysis (CHPA), and 3D graphical modeling to determine task failures and precursors to such failures that have safety implications. Experience in applying multiple analysis techniques in support of HRA methods is discussed.

  19. Design of a Fatigue Detection System for High-Speed Trains Based on Driver Vigilance Using a Wireless Wearable EEG

    PubMed Central

    Zhang, Xiaoliang; Li, Jiali; Liu, Yugang; Zhang, Zutao; Wang, Zhuojun; Luo, Dianyuan; Zhou, Xiang; Zhu, Miankuan; Salman, Waleed; Hu, Guangdi; Wang, Chunbai

    2017-01-01

The vigilance of the driver is important for railway safety, despite not being included in the safety management system (SMS) for high-speed train safety. In this paper, a novel fatigue detection system for high-speed train safety based on monitoring train driver vigilance using a wireless wearable electroencephalograph (EEG) is presented. This system is designed to detect whether the driver is drowsy. The proposed system consists of three main parts: (1) a wireless wearable EEG collection; (2) train driver vigilance detection; and (3) an early warning device for the train driver. In the first part, an 8-channel wireless wearable brain-computer interface (BCI) device acquires the locomotive driver’s brain EEG signal comfortably under high-speed train-driving conditions. The recorded data are transmitted to a personal computer (PC) via Bluetooth. In the second step, a support vector machine (SVM) classification algorithm is implemented to determine the vigilance level using the Fast Fourier transform (FFT) to extract the EEG power spectral density (PSD). In addition, an early warning device begins to work if fatigue is detected. The simulation and test results demonstrate the feasibility of the proposed fatigue detection system for high-speed train safety. PMID:28257073
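The detection pipeline described above (FFT to estimate the EEG power spectral density, then classification) can be sketched in simplified form. The paper uses a trained SVM; as a self-contained stand-in, this sketch computes band powers from a periodogram and a (theta+alpha)/beta drowsiness ratio, a typical feature such a classifier could consume. The sampling rate, band edges, and synthetic test signals are all assumptions, not the paper's parameters:

```python
import numpy as np

FS = 256  # assumed sampling rate in Hz

def band_power(signal, fs, lo, hi):
    """Power in the [lo, hi) Hz band from a periodogram (|FFT|^2 / N) PSD estimate."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum()

def fatigue_index(signal, fs=FS):
    """(theta + alpha) / beta ratio: rises as slow-wave activity grows,
    a common drowsiness indicator (one feature an SVM could use)."""
    theta = band_power(signal, fs, 4, 8)
    alpha = band_power(signal, fs, 8, 13)
    beta = band_power(signal, fs, 13, 30)
    return (theta + alpha) / beta

# Synthetic 4-second "alert" vs "drowsy" epochs:
# the drowsy epoch has stronger theta (6 Hz) relative to beta (20 Hz).
t = np.arange(4 * FS) / FS
alert = np.sin(2 * np.pi * 20 * t) + 0.3 * np.sin(2 * np.pi * 10 * t)
drowsy = 0.3 * np.sin(2 * np.pi * 20 * t) + np.sin(2 * np.pi * 6 * t)
assert fatigue_index(drowsy) > fatigue_index(alert)
```

A real system would window the streaming signal into epochs, extract such band-power features per channel, and feed them to the trained SVM to decide whether to trigger the early warning device.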

  20. The 12th International Conference on Computer Safety, Reliability and Security

    DTIC Science & Technology

    1993-10-29

then used [10]. The adequacy of the proposed methodology is shown through the design and the validation of a simple control system: a train set example...satisfying the safety condition. 4 Conclusions In this paper we have presented a methodology which can be used for the design of safety-critical systems...has a Burner but no Detector (or the Detector is permanently non-active). The PA: G1 for this design is shown in Fig 3a. The probability matrices are
