Sample records for accident analysis software

  1. Waste management facility accident analysis (WASTE_ACC) system: software for analysis of waste management alternatives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kohout, E.F.; Folga, S.; Mueller, C.

    1996-03-01

    This paper describes the Waste Management Facility Accident Analysis (WASTE_ACC) software, which was developed at Argonne National Laboratory (ANL) to support the US Department of Energy's (DOE's) Waste Management (WM) Programmatic Environmental Impact Statement (PEIS). WASTE_ACC is a decision support and database system that is compatible with Microsoft® Windows™. It assesses potential atmospheric releases from accidents at waste management facilities. The software provides the user with an easy-to-use tool to determine the risk-dominant accident sequences for the many possible combinations of process technologies, waste and facility types, and alternative cases described in the WM PEIS. In addition, its structure will allow additional alternative cases and assumptions to be tested as part of the future DOE programmatic decision-making process. The WASTE_ACC system demonstrates one approach to performing a generic, systemwide evaluation of accident risks at waste management facilities. The advantages of WASTE_ACC are threefold. First, the software gets waste volume and radiological profile data that were used to perform other WM PEIS-related analyses directly from the WASTE_MGMT system. Second, the system allows for a consistent analysis across all sites and waste streams, which enables decision makers to understand more fully the trade-offs among various policy options and scenarios. Third, the system is easy to operate; even complex scenario runs are completed within minutes.

  2. Benchmarking MARS (accident management software) with the Browns Ferry fire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, S.M.; Liu, L.Y.; Raines, J.C.

    1992-01-01

    The MAAP Accident Response System (MARS) is user-friendly computer software developed to provide management and engineering staff with the most needed insights, during actual or simulated accidents, into the current and future conditions of the plant based on current plant data and its trends. To demonstrate the reliability of the MARS code in simulating a plant transient, MARS is being benchmarked with the available reactor pressure vessel (RPV) pressure and level data from the Browns Ferry fire. The MARS software uses the Modular Accident Analysis Program (MAAP) code as its basis to calculate plant response under accident conditions. MARS uses a limited set of plant data to initialize and track the accident progression. To perform this benchmark, a simulated set of plant data was constructed based on actual report data containing the information necessary to initialize MARS and keep track of plant system status throughout the accident progression. The initial Browns Ferry fire data were produced by performing a MAAP run to simulate the accident. The remaining accident simulation used actual plant data.

  3. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.
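
    The following sketch illustrates, in simplified form, the kind of simulation-time linear circuit analysis of fluid flows described above: pipe segments are treated as conductances and node pressures are solved from a linear system, from which segment flows follow. It is an illustrative analogue only, not CONFIG itself; the network topology, conductances, and boundary pressures are invented for the example.

```python
import numpy as np

# Minimal sketch (not CONFIG itself): steady-state flow in a small pipe network
# treated as a linear "circuit": pipes are conductances, node pressures are the
# unknowns, and flow = conductance * pressure difference (analogue of Ohm's law).

# Hypothetical network: node 0 is a fixed-pressure source, node 3 a fixed-pressure
# sink; nodes 1 and 2 are interior junctions.
edges = [(0, 1, 2.0), (1, 2, 1.0), (1, 3, 0.5), (2, 3, 1.5)]  # (i, j, conductance)
fixed = {0: 200.0, 3: 100.0}          # boundary pressures (arbitrary units)
n_nodes = 4

# Assemble the nodal conductance matrix G and right-hand side b for G @ p = b.
G = np.zeros((n_nodes, n_nodes))
b = np.zeros(n_nodes)
for i, j, g in edges:
    G[i, i] += g; G[j, j] += g
    G[i, j] -= g; G[j, i] -= g

# Impose the fixed-pressure (Dirichlet) nodes by overwriting their equations.
for node, p in fixed.items():
    G[node, :] = 0.0
    G[node, node] = 1.0
    b[node] = p

pressures = np.linalg.solve(G, b)
flows = {(i, j): g * (pressures[i] - pressures[j]) for i, j, g in edges}

print("node pressures:", pressures.round(2))
print("pipe flows:", {k: round(v, 2) for k, v in flows.items()})
```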

  4. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
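
    As a concrete illustration of the quantitative structural information a complexity analysis can produce, the sketch below computes an approximate McCabe cyclomatic complexity for each function in a Python source file. The handbook does not name its specific metrics or tools, so the metric choice and counting rules here are assumptions made for illustration only.

```python
import ast

# Illustrative sketch only: approximate McCabe's cyclomatic complexity
# (1 + number of decision points) for each function in a Python source file.

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(func_node):
    """Count decision points inside a function body and add one."""
    decisions = sum(isinstance(node, DECISION_NODES)
                    for node in ast.walk(func_node))
    return 1 + decisions

def report(source):
    """Map each function name in `source` to its approximate complexity."""
    tree = ast.parse(source)
    return {node.name: cyclomatic_complexity(node)
            for node in ast.walk(tree)
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef))}

example = """
def classify(x):
    if x < 0:
        return "negative"
    for i in range(3):
        if i == x:
            return "small"
    return "other"
"""
print(report(example))   # e.g. {'classify': 4}
```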

  5. Visualization of Traffic Accidents

    NASA Technical Reports Server (NTRS)

    Wang, Jie; Shen, Yuzhong; Khattak, Asad

    2010-01-01

    Traffic accidents have tremendous impact on society. Annually, approximately 6.4 million vehicle accidents are reported by police in the US and nearly half of them result in catastrophic injuries. Visualizations of traffic accidents using geographic information systems (GIS) greatly facilitate handling and analysis of traffic accidents in many aspects. Environmental Systems Research Institute (ESRI), Inc. is the world leader in GIS research and development. ArcGIS, a software package developed by ESRI, has the capabilities to display events associated with a road network, such as accident locations and pavement quality. But when event locations related to a road network are processed, the existing algorithm used by ArcGIS does not utilize all the information related to the routes of the road network and produces erroneous visualization results of event locations. This software bug causes serious problems for applications in which accurate location information is critical for emergency responses, such as traffic accidents. This paper aims to address this problem and proposes an improved method that utilizes all relevant information of traffic accidents, namely, route number, direction, and mile post, and extracts correct event locations for accurate traffic accident visualization and analysis. The proposed method generates a new shape file for traffic accidents and displays them on top of the existing road network in ArcGIS. Visualization of traffic accidents along Hampton Roads Bridge Tunnel is included to demonstrate the effectiveness of the proposed method.
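
    The core of the proposed fix is linear referencing: placing each accident on the road network from its route number, direction, and milepost. The sketch below shows the basic interpolation step, locating a point at a given milepost along a route polyline. It is a generic illustration, not the paper's algorithm or ArcGIS internals, and the route geometry, route length, and milepost value are hypothetical.

```python
import math

# Minimal linear-referencing sketch (not ESRI's algorithm): given a route
# polyline and a milepost measure, interpolate the (x, y) location of an event.
# The route geometry and milepost below are hypothetical.

def locate_event(route_xy, milepost, route_length_miles):
    """Interpolate a point at `milepost` miles along the polyline `route_xy`."""
    # cumulative geometric length of each segment
    seg_lengths = [math.dist(a, b) for a, b in zip(route_xy, route_xy[1:])]
    total = sum(seg_lengths)
    # convert the milepost measure to a geometric distance along the line
    target = (milepost / route_length_miles) * total
    run = 0.0
    for (x1, y1), (x2, y2), seg in zip(route_xy, route_xy[1:], seg_lengths):
        if run + seg >= target:
            t = (target - run) / seg
            return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
        run += seg
    return route_xy[-1]  # milepost beyond the end of the route

route = [(0.0, 0.0), (3.0, 4.0), (6.0, 4.0)]   # simple 2-segment route
print(locate_event(route, milepost=1.5, route_length_miles=2.0))
```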

  6. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  7. Insights Gained from Forensic Analysis with MELCOR of the Fukushima-Daiichi Accidents.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Nathan C.; Gauntt, Randall O.

    Since the accidents at Fukushima-Daiichi, Sandia National Laboratories has been modeling these accident scenarios using the severe accident analysis code, MELCOR. MELCOR is a widely used computer code developed at Sandia National Laboratories since ~1982 for the U.S. Nuclear Regulatory Commission. Insights from the modeling of these accidents are being used to better inform future code development and potentially improve accident management. To date, the need to better capture in-vessel thermal-hydraulics and ex-vessel melt coolability and concrete interactions has led to the implementation of new models. The most recent analyses, presented in this paper, have been in support of the Organization for Economic Cooperation and Development Nuclear Energy Agency's (OECD/NEA) Benchmark Study of the Accident at the Fukushima Daiichi Nuclear Power Station (BSAF) Project. The goal of this project is to accurately capture the source term from all three releases and then model the atmospheric dispersion. In order to do this, a forensic approach is being used in which available plant data and release timings are used to inform the modeled MELCOR accident scenario. For example, containment failures, core slumping events and lower head failure timings are all enforced parameters in these analyses. This approach is fundamentally different from a blind code assessment analysis often used in standard problem exercises. The timings of these events are informed by representative spikes or decreases in plant data. The combination of improvements to the MELCOR source code resulting from previous accident analyses and this forensic approach has allowed Sandia to generate representative and plausible source terms for all three accidents at Fukushima Daiichi out to three weeks after the accident, capturing both early and late releases. In particular, using the source terms developed by MELCOR, the MACCS software code, which models atmospheric dispersion

  8. Single pilot IFR accident data analysis

    NASA Technical Reports Server (NTRS)

    Harris, D. F.

    1983-01-01

    The aircraft accident data recorded by the National Transportation Safety Board (NTSB) for 1964-1979 were analyzed to determine what problems exist in the general aviation (GA) single pilot instrument flight rules (SPIFR) environment. A previous study conducted in 1978 for the years 1964-1975 provided a basis for comparison. This effort was generally limited to SPIFR pilot-error landing phase accidents but includes some SPIFR takeoff and enroute accident analysis as well as some dual pilot IFR accident analysis for comparison. Analysis was performed for 554 accidents, of which 39% (216) occurred during the years 1976-1979.

  9. A study on industrial accident rate forecasting and program development of estimated zero accident time in Korea.

    PubMed

    Kim, Tae-gu; Kang, Young-sig; Lee, Hyung-won

    2011-01-01

    To begin a zero accident campaign for industry, the first step is to estimate the industrial accident rate and the zero accident time systematically. This paper considers the social and technical changes of the business environment after the beginning of the zero accident campaign through quantitative time series analysis methods. These methods include the sum of squared errors (SSE), regression analysis method (RAM), exponential smoothing method (ESM), double exponential smoothing method (DESM), auto-regressive integrated moving average (ARIMA) model, and the proposed analytic function method (AFM). A program was developed to estimate the accident rate, zero accident time and achievement probability of an efficient industrial environment. In this paper, the MFC (Microsoft Foundation Class) framework of Visual Studio 2008 was used to develop the zero accident program. The results of this paper will provide major information for industrial accident prevention and be an important part of stimulating the zero accident campaign within all industrial environments.
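
    Of the time series methods listed above, double exponential smoothing is the simplest to illustrate. The sketch below applies Holt's linear-trend form of double exponential smoothing to an annual accident-rate series and reads off the first forecast step at which the rate reaches zero, one plausible reading of "estimated zero accident time". The rates and smoothing constants are invented; the paper's actual data, parameters, and estimation procedure are not reproduced here.

```python
# Sketch of one method named in the abstract: double exponential smoothing
# (Holt's linear trend) applied to an annual accident-rate series, with the
# "zero accident time" read off as the first forecast step whose value <= 0.
# The rates and smoothing constants below are hypothetical illustrations.

def holt_forecast(series, alpha=0.5, beta=0.3, horizon=20):
    """Return `horizon` out-of-sample forecasts from Holt's linear trend method."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]

rates = [0.92, 0.87, 0.85, 0.79, 0.76, 0.71, 0.68]  # accidents per 100 workers
forecast = holt_forecast(rates)

zero_step = next((h + 1 for h, value in enumerate(forecast) if value <= 0), None)
print("first forecast step with rate <= 0:", zero_step)
```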

  10. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1931-01-01

    The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. It also includes a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, and statistical tables giving a comparison of the types of accidents and causes of accidents in the military services on the one hand and in civil aviation on the other, together with explanations of some of the important differences noted in these tables.

  11. Analysis of Occupational Accidents in Underground and Surface Mining in Spain Using Data-Mining Techniques

    PubMed Central

    Sanmiquel, Lluís; Bascompta, Marc; Rossell, Josep M.; Anticoi, Hernán Francisco; Guash, Eduard

    2018-01-01

    An analysis of occupational accidents in the mining sector was conducted using the data from the Spanish Ministry of Employment and Social Safety between 2005 and 2015, and data-mining techniques were applied. Data was processed with the software Weka. Two scenarios were chosen from the accidents database: surface and underground mining. The most important variables involved in occupational accidents and their association rules were determined. These rules are composed of several predictor variables that cause accidents, defining its characteristics and context. This study exposes the 20 most important association rules in the sector—either surface or underground mining—based on the statistical confidence levels of each rule as obtained by Weka. The outcomes display the most typical immediate causes, along with the percentage of accidents with a basis in each association rule. The most important immediate cause is body movement with physical effort or overexertion, and the type of accident is physical effort or overexertion. On the other hand, the second most important immediate cause and type of accident are different between the two scenarios. Data-mining techniques were chosen as a useful tool to find out the root cause of the accidents. PMID:29518921
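
    To make the association-rule terminology above concrete, the sketch below scores candidate rules by support and confidence over a handful of invented mining-accident records, the same quantities Weka's Apriori implementation reports. It is a toy illustration of the rule-mining idea, not the study's data or Weka's actual algorithm.

```python
from itertools import combinations

# Toy sketch of the association-rule idea behind Weka's Apriori: each accident
# record is a set of categorical attributes, and a rule A -> B is scored by its
# support and confidence. The records below are invented for illustration only.

accidents = [
    {"underground", "overexertion", "body_movement"},
    {"underground", "overexertion", "body_movement"},
    {"surface", "fall", "loss_of_control"},
    {"underground", "overexertion", "manual_handling"},
    {"surface", "overexertion", "body_movement"},
]

def support(itemset):
    """Fraction of accident records containing every item in `itemset`."""
    return sum(itemset <= record for record in accidents) / len(accidents)

def rules(min_support=0.4, min_confidence=0.6):
    """Enumerate single-item rules A -> B meeting the support/confidence thresholds."""
    items = sorted(set().union(*accidents))
    found = []
    for a, b in combinations(items, 2):
        for antecedent, consequent in (({a}, {b}), ({b}, {a})):
            joint = support(antecedent | consequent)
            if joint >= min_support and support(antecedent) > 0:
                confidence = joint / support(antecedent)
                if confidence >= min_confidence:
                    found.append((antecedent, consequent, joint, confidence))
    return found

for antecedent, consequent, sup, conf in rules():
    print(f"{antecedent} -> {consequent}: support={sup:.2f}, confidence={conf:.2f}")
```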

  12. Analysis of Occupational Accidents in Underground and Surface Mining in Spain Using Data-Mining Techniques.

    PubMed

    Sanmiquel, Lluís; Bascompta, Marc; Rossell, Josep M; Anticoi, Hernán Francisco; Guash, Eduard

    2018-03-07

    An analysis of occupational accidents in the mining sector was conducted using the data from the Spanish Ministry of Employment and Social Safety between 2005 and 2015, and data-mining techniques were applied. Data was processed with the software Weka. Two scenarios were chosen from the accidents database: surface and underground mining. The most important variables involved in occupational accidents and their association rules were determined. These rules are composed of several predictor variables that cause accidents, defining its characteristics and context. This study exposes the 20 most important association rules in the sector-either surface or underground mining-based on the statistical confidence levels of each rule as obtained by Weka. The outcomes display the most typical immediate causes, along with the percentage of accidents with a basis in each association rule. The most important immediate cause is body movement with physical effort or overexertion, and the type of accident is physical effort or overexertion. On the other hand, the second most important immediate cause and type of accident are different between the two scenarios. Data-mining techniques were chosen as a useful tool to find out the root cause of the accidents.

  13. Developing techniques for cause-responsibility analysis of occupational accidents.

    PubMed

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents, determine social responsibility and the role of groups involved in work-related accidents. This study develops an occupational accidents causes tree, an occupational accidents responsibility tree, and an occupational accidents component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting that were randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study involves two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel for determination of responsible groups and responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for the determination of a detailed list of tasks, responsibilities, and their rates. Therefore, they are useful for preventing work-related accidents by focusing on the responsible groups' duties. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. An analysis of aircraft accidents involving fires

    NASA Technical Reports Server (NTRS)

    Lucha, G. V.; Robertson, M. A.; Schooley, F. A.

    1975-01-01

    All U. S. Air Carrier accidents between 1963 and 1974 were studied to assess the extent of total personnel and aircraft damage which occurred in accidents and in accidents involving fire. Published accident reports and NTSB investigators' factual backup files were the primary sources of data. Although it was frequently not possible to assess the relative extent of fire-caused damage versus impact damage using the available data, the study established upper and lower bounds for deaths and damage due specifically to fire. In 12 years there were 122 accidents which involved airframe fires. Eighty-seven percent of the fires occurred after impact, and fuel leakage from ruptured tanks or severed lines was the most frequently cited cause. A cost analysis was performed for 300 serious accidents, including 92 serious accidents which involved fire. Personal injury costs were outside the scope of the cost analysis, but data on personnel injury judgements as well as settlements received from the CAB are included for reference.

  15. Single pilot IFR accident data analysis

    NASA Technical Reports Server (NTRS)

    Harris, D. F.; Morrisete, J. A.

    1982-01-01

    The aircraft accident data recorded and maintained by the National Transportation Safety Board for 1964 to 1979 were analyzed to determine what problems exist in the general aviation single pilot instrument flight rules environment. A previous study conducted in 1978 for the years 1964 to 1975 provided a basis for comparison. The purpose was to determine what changes, if any, have occurred in trends and cause-effect relationships reported in the earlier study. Accident counts were tied to measures of flight activity to produce accident rates, which in turn were analyzed in terms of change. Where anomalies or unusually high accident rates were encountered, further analysis was conducted to isolate pertinent patterns of cause factors and/or experience levels of involved pilots. The bulk of the effort addresses accidents in the landing phase of operations. A detailed analysis was performed on controlled/uncontrolled collisions and their unique attributes delineated. Estimates of day vs. night general aviation activity and accident rates were obtained.

  16. Reactor Safety Gap Evaluation of Accident Tolerant Components and Severe Accident Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farmer, Mitchell T.; Bunt, R.; Corradini, M.

    The overall objective of this study was to conduct a technology gap evaluation on accident tolerant components and severe accident analysis methodologies with the goal of identifying any data and/or knowledge gaps that may exist, given the current state of light water reactor (LWR) severe accident research, and additionally augmented by insights obtained from the Fukushima accident. The ultimate benefit of this activity is that the results can be used to refine the Department of Energy's (DOE) Reactor Safety Technology (RST) research and development (R&D) program plan to address key knowledge gaps in severe accident phenomena and analyses that affect reactor safety and that are not currently being addressed by the industry or the Nuclear Regulatory Commission (NRC).

  17. Aircraft Loss-of-Control Accident Analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Foster, John V.

    2010-01-01

    Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents. To gain a better understanding into aircraft loss-of-control events and possible intervention strategies, this paper presents a detailed analysis of loss-of-control accident data (predominantly from Part 121), including worst case combinations of causal and contributing factors and their sequencing. Future potential risks are also considered.

  18. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios, and in particular interoperability, are severely limited. We describe a distributed and collaborative software analysis platform that allows for a seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible in our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-)automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  19. [The characteristics of computer simulation of traffic accidents].

    PubMed

    Zou, Dong-Hua; Liu, Ning-Guo; Chen, Jian-Guo; Jin, Xian-Long; Zhang, Xiao-Yun; Zhang, Jian-Hua; Chen, Yi-Jiu

    2008-12-01

    To reconstruct the collision process of traffic accidents and the injury mode of the victim by computer simulation technology in forensic assessment of traffic accidents. Forty actual accidents were reconstructed with simulation software and a high performance computer based on analysis of the trace evidence at the scene, damage to the vehicles and injuries of the victims, with 2 cases discussed in detail. The reconstruction correlated very well with the above parameters in 28 cases, well in 9 cases, and suboptimally in 3 cases. Accurate reconstruction of the accident would be helpful for assessment of the injury mechanism of the victims. Reconstruction of the collision process of a traffic accident and the injury mechanism of the victim by computer simulation is useful in traffic accident assessment.

  20. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  1. Application of forensic image analysis in accident investigations.

    PubMed

    Verolme, Ellen; Mieremet, Arjan

    2017-09-01

    Forensic investigations are primarily meant to obtain objective answers that can be used for criminal prosecution. Accident analyses are usually performed to learn from incidents and to prevent similar events from occurring in the future. Although the primary goal may be different, the steps in which information is gathered, interpreted and weighed are similar in both types of investigations, implying that forensic techniques can be of use in accident investigations as well. Use in accident investigations usually means that more information can be extracted from the available material than in criminal investigations, since the latter require a higher evidence level. In this paper, we demonstrate the applicability of forensic techniques to accident investigations by presenting a number of cases from one specific field of expertise: image analysis. With the rapid spread of digital devices and new media, a wealth of image material and other digital information has become available to accident investigators. We show that much information can be distilled from footage by using forensic image analysis techniques. These applications show that image analysis provides information that is crucial for establishing the sequence of events and the two- and three-dimensional geometry of an accident. Since accident investigation focuses primarily on learning from accidents and prevention of future accidents, and less on the blame that is crucial for criminal investigations, the field of application of these forensic tools may be broader than would be the case in a purely legal sense. This is an important notion for future accident investigations. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: Critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable. Cluster analysis, which was an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses. Pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

  3. Development of Database for Accident Analysis in Indian Mines

    NASA Astrophysics Data System (ADS)

    Tripathy, Debi Prasad; Guru Raghavendra Reddy, K.

    2016-10-01

    Mining is a hazardous industry, and the high accident rates associated with underground mining are a cause of deep concern. Technological developments notwithstanding, rates of fatal accidents and reportable incidents have not shown corresponding levels of decline. This paper argues that adoption of appropriate safety standards by both mine management and the government may result in appreciable reduction in accident frequency. This can be achieved by using technology to improve working conditions and by sensitising workers and managers to the causes and prevention of accidents. Inputs required for a detailed analysis of an accident include information on location, time, type, cost of accident, victim, nature of injury, personal and environmental factors, etc. Such information can be generated from data available in the standard coded accident report form. This paper presents a web based application for accident analysis in Indian mines during 2001-2013. An accident database prototype (SafeStat), developed by the authors on an intranet using the TCP/IP protocol, is also discussed.

  4. Traffic accident in Cuiabá-MT: an analysis through the data mining technology.

    PubMed

    Galvão, Noemi Dreyer; de Fátima Marin, Heimar

    2010-01-01

    Road traffic accidents (ATT) are non-intentional events of great magnitude worldwide, mainly in urban centers. This article aims to analyze data related to victims of ATT recorded by the Justice Secretariat and Public Security (SEJUSP) and in hospital morbidity and mortality records in the city of Cuiabá-MT during 2006, using data mining technology. An observational, retrospective and exploratory study of the secondary databases was carried out. The three selected databases were linked using a probabilistic method, through the free software RecLink. One hundred and thirty-nine (139) real pairs of victims of ATT were obtained. Data mining was then applied to this linked database with the software WEKA using the Apriori algorithm. The result generated the 10 best rules, six of which were considered, according to the established parameters, to indicate useful and comprehensible knowledge to characterize the victims of accidents in Cuiabá. Finally, the findings of the associative rules showed peculiarities of the road traffic accident victims in Cuiabá and highlight the need for prevention measures targeting collision accidents involving males.

  5. Categorizing accident sequences in the external radiotherapy for risk analysis

    PubMed Central

    2013-01-01

    Purpose This study identifies accident sequences from past accidents in order to help the application of risk analysis to external radiotherapy. Materials and Methods This study reviews 59 accident cases from two retrospective safety analyses that have collected incidents in external radiotherapy extensively. The two accident analysis reports that accumulated past incidents are investigated to identify accident sequences, including initiating events, failure of safety measures, and consequences. This study classifies the accidents by treatment stage and source of error for initiating events, types of failures in the safety measures, and types of undesirable consequences and the number of affected patients. Then, the accident sequences are grouped into several categories on the basis of similarity of progression. As a result, these cases can be categorized into 14 groups of accident sequences. Results The result indicates that risk analysis needs to pay attention not only to the planning stage, but also to the calibration stage that is performed prior to the main treatment process. It also shows that human error is the largest contributor to initiating events as well as to the failure of safety measures. This study also illustrates an event tree analysis for an accident sequence initiated in the calibration. Conclusion This study is expected to provide insights into the accident sequences for prospective risk analysis through the review of experiences. PMID:23865005
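
    The event tree analysis mentioned above can be illustrated with a minimal quantification: an initiating event followed by a chain of safety measures, where each accident sequence frequency is the initiator frequency multiplied by the success or failure probability of each barrier along the path. The sketch below uses invented barrier names and probabilities; it is not the study's event tree.

```python
from itertools import product

# Minimal event-tree sketch (illustrative numbers only): an initiating event in
# the calibration stage followed by two safety measures. Each accident sequence
# frequency = initiator frequency x product of branch probabilities on the path.

initiator_per_year = 0.05            # hypothetical calibration-error frequency
barriers = {
    "independent dose check": 0.10,  # probability the check fails to catch it
    "in-vivo dosimetry":      0.20,  # probability the measurement misses it
}

for outcomes in product([False, True], repeat=len(barriers)):
    freq = initiator_per_year
    label = []
    for (name, p_fail), failed in zip(barriers.items(), outcomes):
        freq *= p_fail if failed else (1 - p_fail)
        label.append(f"{name} {'fails' if failed else 'succeeds'}")
    consequence = "mistreatment reaches patient" if all(outcomes) else "error caught"
    print(f"{' / '.join(label)}: {freq:.4f} per year -> {consequence}")
```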

  6. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CROWE, R.D.; PIEPHO, M.G.

    2000-03-23

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report''. All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  7. A Qualitative Study on Organizational Factors Affecting Occupational Accidents

    PubMed Central

    ESKANDARI, Davood; JAFARI, Mohammad Javad; MEHRABI, Yadollah; KIAN, Mostafa Pouya; CHARKHAND, Hossein; MIRGHOTBI, Mostafa

    2017-01-01

    Background: Technical, human, operational and organizational factors have been influencing the sequence of occupational accidents. Among them, organizational factors play a major role in causing occupational accidents. The aim of this research was to understand the Iranian safety experts’ experiences and perception of organizational factors. Methods: This qualitative study was conducted in 2015 by using the content analysis technique. Data were collected through semi-structured interviews with 17 safety experts working in Iranian universities and industries and analyzed with a conventional qualitative content analysis method using the MAXQDA software. Results: Eleven organizational factors’ sub-themes were identified: management commitment, management participation, employee involvement, communication, blame culture, education and training, job satisfaction, interpersonal relationship, supervision, continuous improvement, and reward system. The participants considered these factors as effective on occupational accidents. Conclusion: The mentioned 11 organizational factors are probably involved in occupational accidents in Iran. Naturally, improving organizational factors can increase the safety performance and reduce occupational accidents. PMID:28435824

  8. A Qualitative Study on Organizational Factors Affecting Occupational Accidents.

    PubMed

    Eskandari, Davood; Jafari, Mohammad Javad; Mehrabi, Yadollah; Kian, Mostafa Pouya; Charkhand, Hossein; Mirghotbi, Mostafa

    2017-03-01

    Technical, human, operational and organizational factors have been influencing the sequence of occupational accidents. Among them, organizational factors play a major role in causing occupational accidents. The aim of this research was to understand the Iranian safety experts' experiences and perception of organizational factors. This qualitative study was conducted in 2015 by using the content analysis technique. Data were collected through semi-structured interviews with 17 safety experts working in Iranian universities and industries and analyzed with a conventional qualitative content analysis method using the MAXQDA software. Eleven organizational factors' sub-themes were identified: management commitment, management participation, employee involvement, communication, blame culture, education and training, job satisfaction, interpersonal relationship, supervision, continuous improvement, and reward system. The participants considered these factors as effective on occupational accidents. The mentioned 11 organizational factors are probably involved in occupational accidents in Iran. Naturally, improving organizational factors can increase the safety performance and reduce occupational accidents.

  9. The potential risk of toxoplasmosis for traffic accidents: A systematic review and meta-analysis.

    PubMed

    Gohardehi, Shaban; Sharif, Mehdi; Sarvi, Shahabeddin; Moosazadeh, Mahmood; Alizadeh-Navaei, Reza; Hosseini, Seyed Abdollah; Amouei, Afsaneh; Pagheh, Abdolsattar; Sadeghi, Mitra; Daryani, Ahmad

    2018-06-12

    Toxoplasmosis is a prevalent infectious disease. Although most people infected by Toxoplasma gondii are asymptomatic, evidence has suggested that this disease might affect some aspects of a host's behavior and associate with schizophrenia, suicide attempt, changes in various aspects of personality, and poor neurocognitive performance. These associations may play roles in increasing the risk of a number of incidents, such as traffic accidents, among infected people. In this regard, this study aimed to provide summary estimates for the available data on the potential risk of toxoplasmosis for traffic accidents. To this end, using a number of search terms, i.e. toxoplasmosis, Toxoplasma gondii, traffic accident, road accident, car accident, crash, and prevalence, literature searches (up to October 1, 2017) were carried out via 6 databases. The meta-analysis was conducted using the StatsDirect statistical software and a P-value less than 0.05 was regarded as significant in all statistical analyses. Out of 1841 identified studies, 9 studies were finally considered eligible for carrying out this systematic review. Reviewing results of these studies indicated that 5 out of 9 studies reported a significant relationship between Toxoplasma gondii and traffic accidents. Additionally, data related to gender showed significant differences between infected and control men and women. Considering age, reviewing the results of these studies revealed a significant difference between the infected people and the Toxoplasma-negative subjects under 45 years of age. However, no significant difference was found between the two groups aged 45 or older. Given these results, it can be concluded that Toxoplasma gondii significantly increases the risk of having traffic accidents. Copyright © 2018 Elsevier Inc. All rights reserved.
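
    The summary estimate in a meta-analysis of this kind is typically an inverse-variance pooled odds ratio. The sketch below shows the fixed-effect version of that calculation on invented 2x2 study counts; the review's actual data, software (StatsDirect), and model choice (for example, random effects) are not reproduced here.

```python
import math

# Sketch of inverse-variance pooling of study odds ratios (fixed-effect model),
# the generic calculation behind a meta-analysis like the one described. The
# per-study 2x2 counts below are invented for illustration, not the review's data.

studies = [  # (exposed cases, exposed controls, unexposed cases, unexposed controls)
    (45, 60, 30, 70),
    (30, 25, 40, 55),
    (22, 18, 28, 40),
]

log_ors, weights = [], []
for a, b, c, d in studies:
    log_or = math.log((a * d) / (b * c))          # log odds ratio
    var = 1 / a + 1 / b + 1 / c + 1 / d           # its approximate variance
    log_ors.append(log_or)
    weights.append(1 / var)

pooled_log_or = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
se = math.sqrt(1 / sum(weights))
ci = (math.exp(pooled_log_or - 1.96 * se), math.exp(pooled_log_or + 1.96 * se))

print(f"pooled OR = {math.exp(pooled_log_or):.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```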

  10. A cluster analysis on road traffic accidents using genetic algorithms

    NASA Astrophysics Data System (ADS)

    Saharan, Sabariah; Baragona, Roberto

    2017-04-01

    The analysis of road traffic accidents is increasingly important because of the cost of accidents and public road safety. The availability of large data sets makes the study of factors that affect the frequency and severity of accidents viable. However, the data are often highly unbalanced and overlapped. We deal with the data set of road traffic accidents recorded in Christchurch, New Zealand, from 2000-2009, with a total of 26440 accidents. The data are binary, and there are 50 road traffic accident factors with four levels of severity. We used a genetic algorithm for the analysis because, in the presence of a large unbalanced data set, standard clustering such as the k-means algorithm may not be suitable for the task. The genetic clustering for unknown K (GCUK) algorithm has been used to identify the factors associated with accidents of different levels of severity. The results provided us with an interesting insight into the relationship between factors and accident severity level and suggest that the two main factors that contribute to fatal accidents are "Speed greater than 60 km/h" and "Did not see other people until it was too late". A comparison with the k-means algorithm and independent component analysis is performed to validate the results.

  11. Validity and consistency assessment of accident analysis methods in the petroleum industry.

    PubMed

    Ahmadi, Omran; Mortazavi, Seyed Bagher; Khavanin, Ali; Mokarami, Hamidreza

    2017-11-17

    Accident analysis is the main aspect of accident investigation. It includes the method of connecting different causes in a procedural way. Therefore, it is important to use valid and reliable methods for the investigation of different causal factors of accidents, especially the noteworthy ones. This study aimed to assess the accuracy (sensitivity index [SI]) and consistency of the six most commonly used accident analysis methods in the petroleum industry. In order to evaluate the methods of accident analysis, two real case studies (a process safety accident and a personal accident) from the petroleum industry were analyzed by 10 assessors. The accuracy and consistency of these methods were then evaluated. The assessors were trained in a workshop on accident analysis methods. The systematic cause analysis technique and bowtie methods gained the greatest SI scores for the personal and process safety accidents, respectively. The best average results for the consistency of a single method (based on 10 independent assessors) were in the region of 70%. This study confirmed that the application of methods with pre-defined causes and a logic tree could enhance the sensitivity and consistency of accident analysis.

  12. Indonesian Sea Accident Analysis (Case Study From 2003 – 2013)

    NASA Astrophysics Data System (ADS)

    Arya Dewanto, Y.; Faturachman, D.

    2018-03-01

    There are many accidents in sea transportation in Indonesia. Most of these accidents happen because of insufficient attention to the safety and security of the crew. At sea, people interact with the ship and the surrounding environment (including other ships, cruise lines, ports, and local conditions). These interactions are sometimes very complex and relate to various aspects. Given this multiplicity of aspects, improving the safety of shipping by reducing the number of accidents and the risk of death, serious injury, and loss of transported goods cannot be achieved through a mono-sector approach alone; it requires a multi-sector approach. In this paper, we describe an analysis of Indonesian sea transportation accidents over eleven years, divided into four items: total ship accidents by type, ship accident factors, total casualties, and region of ship accidents. All data were obtained from the Marine Court (Mahkamah Pelayaran). From these four items, the Indonesian sea accident analysis for 2003-2013 is derived.

  13. Civil helicopter wire strike assessment study. Volume 2: Accident analysis briefs

    NASA Technical Reports Server (NTRS)

    Tuomela, C. H.; Brennan, M. F.

    1980-01-01

    A description and analysis of each of the 208 civil helicopter wire strike accidents reported to the National Transportation Safety Board (NTSB) for the ten year period 1970-1979 is given. The accident analysis briefs were based on pilot reports, FAA investigation reports, and such accident photographs as were made available. Briefs were grouped by year and, within year, by NTSB accident report number.

  14. Implementation of numerical simulation techniques in analysis of the accidents in complex technological systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klishin, G.S.; Seleznev, V.E.; Aleoshin, V.V.

    1997-12-31

    Gas industry enterprises such as main pipelines, compressor gas transfer stations, and gas extracting complexes belong to the energy intensive industry. Accidents there can result in catastrophes and great social, environmental and economic losses. Annually, according to official data, several dozen large accidents take place on pipelines in the USA and Russia. That is why prevention of accidents, analysis of the mechanisms of their development and prediction of their possible consequences are acute and important tasks nowadays. The causes of accidents are usually of a complicated character and can be presented as a complex combination of natural, technical and human factors. Mathematical and computer simulations are safe, rather effective and comparatively inexpensive methods of accident analysis. They make it possible to analyze different mechanisms of a failure's occurrence and development, to assess its consequences and to give recommendations to prevent it. Besides investigation of failure cases, numerical simulation techniques play an important role in the treatment of diagnostics results of the objects and in further construction of mathematical prognostic simulations of the object behavior in the period of time between two inspections. While solving diagnostics tasks and in the analysis of failure cases, the techniques of theoretical mechanics, the qualitative theory of differential equations, the mechanics of a continuous medium, chemical macro-kinetics and optimization techniques are implemented in the Conversion Design Bureau #5 (DB#5). Both universal and special numerical techniques and software (SW) are being developed in DB#5 for the solution of such tasks. Almost all of them are calibrated on calculations of simulated and full-scale experiments performed at the VNIIEF and MINATOM testing sites. It is worth noting that in the long years of work there has been established a fruitful and

  15. Analysis of construction accidents in Spain, 2003-2008.

    PubMed

    López Arquillos, Antonio; Rubio Romero, Juan Carlos; Gibb, Alistair

    2012-12-01

    The research objective for this paper is to obtain a new, extended and updated insight into the likely causes of construction accidents in Spain, in order to identify suitable mitigating actions. The paper analyzes all construction sector accidents in Spain between 2003 and 2008. Ten variables were chosen and the influence of each variable is evaluated with respect to the severity of the accident. The descriptive analysis is based on a total of 1,163,178 accidents. Results showed that the severity of accidents was related to variables including age, CNAE (National Classification of Economic Activities) code, size of company, length of service, location of accident, day of the week, days of absence, deviation, injury, and climatic zone. According to the data analyzed, a large company is not necessarily safer than a small company with respect to fatal accidents, experienced workers do not have the best accident fatality rates, and accidents occurring away from the usual workplace had more severe consequences. Results obtained in this paper can be used by companies in their occupational safety strategies and in their safety training programs. Copyright © 2012 National Safety Council and Elsevier Ltd. All rights reserved.

  16. Efficacy of a Newly Designed Cephalometric Analysis Software for McNamara Analysis in Comparison with Dolphin Software.

    PubMed

    Nouri, Mahtab; Hamidiaval, Shadi; Akbarzadeh Baghban, Alireza; Basafa, Mohammad; Fahim, Mohammad

    2015-01-01

    Cephalometric norms of the McNamara analysis have been studied in various populations due to their optimal efficiency. The Dolphin cephalometric software greatly facilitates this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English would greatly help Farsi-speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. In this diagnostic, descriptive study, 150 lateral cephalograms of individuals with normal occlusion were selected in Mashhad and Qazvin, two major cities of Iran mainly populated by the Fars ethnicity, the main Iranian ethnic group. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and with the new software. The cephalometric software was designed using Microsoft Visual C++ under Windows XP. Measurements made with the new software were compared with those of the Dolphin software on both series of cephalograms. Validity and reliability were tested using the intra-class correlation coefficient. Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin. This confirms the validity and optimal efficacy of the newly designed software (ICC 0.570-1.0).

  17. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience

    PubMed Central

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac

    2017-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions. PMID:29278255
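
    One of the best-fit models named above, the S-shaped NHPP SRGM, has the mean value function m(t) = a(1 - (1 + bt)e^(-bt)), where a is the expected total number of defects and b the defect-detection rate. The sketch below fits that curve to a cumulative defect-count series by least squares; the weekly counts and starting values are invented, not the mission's data, and the authors' actual estimation procedure may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of fitting one SRGM named in the paper, the (delayed) S-shaped NHPP
# model, to cumulative defect counts: m(t) = a * (1 - (1 + b*t) * exp(-b*t)),
# where a is the expected total number of defects and b the detection rate.
# The weekly defect counts below are invented, not the mission's data.

def s_shaped_mean(t, a, b):
    return a * (1.0 - (1.0 + b * t) * np.exp(-b * t))

weeks = np.arange(1, 13)
cumulative_defects = np.array([2, 5, 11, 19, 28, 36, 42, 47, 50, 52, 53, 54])

(a_hat, b_hat), _ = curve_fit(s_shaped_mean, weeks, cumulative_defects,
                              p0=(60.0, 0.3), maxfev=10000)

# Predicted failure intensity (defects per week), the derivative of m(t),
# evaluated at the end of the observed data:
t = weeks[-1]
intensity = a_hat * b_hat**2 * t * np.exp(-b_hat * t)
print(f"estimated total defects a = {a_hat:.1f}, rate b = {b_hat:.2f}")
print(f"predicted defect-detection intensity at week {t}: {intensity:.2f}/week")
```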

  18. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience.

    PubMed

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac

    2016-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions.

  19. 'Remixing Rasmussen': The evolution of Accimaps within systemic accident analysis.

    PubMed

    Waterson, Patrick; Jenkins, Daniel P; Salmon, Paul M; Underwood, Peter

    2017-03-01

    Throughout Jens Rasmussen's career there has been a continued emphasis on the development of methods, techniques and tools for accident analysis and investigation. In this paper we focus on the evolution and development of one specific example, namely Accimaps and their use for accident analysis. We describe the origins of Accimaps, followed by a review of 27 studies which have applied and adapted Accimaps over the period 2000-2015 to a range of domains and types of accident. Aside from demonstrating the versatility and popularity of the method, part of the motivation for the review of the use of Accimaps is to address the question of what constitutes a sound, usable, valid and reliable approach to systemic accident analysis. The findings from the review demonstrate continuity with the work carried out by Rasmussen, as well as significant variation (e.g., changes to the Accimap, use of additional theoretical and practice-oriented perspectives on safety). We conclude the paper with some speculations regarding future extension and adaptation of the Accimap approach, including the possibility of using hybrid models for accident analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. On-Orbit Software Analysis

    NASA Technical Reports Server (NTRS)

    Moran, Susanne I.

    2004-01-01

    The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: discovery and verification of software program properties and dependencies, detection and isolation of software defects across different versions of software, and compilation of historical data and technical expertise for future applications.

  1. Analysis of traffic accident data in Kentucky (1986-1990)

    DOT National Transportation Integrated Search

    1991-09-01

    This report includes an analysis of traffic accident data in Kentucky for the years of 1986-1990. A primary objective of this study was to determine average statistics for Kentucky highways. Average and critical number and rates of accidents were calc...

  2. Analysis of traffic accident data in Kentucky (1994-1998)

    DOT National Transportation Integrated Search

    1999-09-01

    This report includes an analysis of traffic accident data in Kentucky for the years of 1994 through 1998. A primary objective of this study was to determine average accident statistics for Kentucky highways. Average and critical numbers and rates of ...

  3. Inertial Upper Stage (IUS) software analysis

    NASA Technical Reports Server (NTRS)

    Grayson, W. L.; Nickel, C. E.; Rose, P. L.; Singh, R. P.

    1979-01-01

    The Inertial Upper Stage (IUS) System, an extension of the Space Transportation System (STS) operating regime to include higher orbits, orbital plane changes, geosynchronous orbits, and interplanetary trajectories, is presented. The IUS software design, the IUS software interfaces with other systems, and the cost effectiveness in software verification are described. Tasks of the IUS discussed include: (1) design analysis; (2) validation requirements analysis; (3) interface analysis; and (4) requirements analysis.

  4. Corporate cost of occupational accidents: an activity-based analysis.

    PubMed

    Rikhardsson, Pall M; Impgaard, Martin

    2004-03-01

    The systematic accident cost analysis (SACA) project was carried out during 2001 by The Aarhus School of Business and PricewaterhouseCoopers Denmark with financial support from The Danish National Working Environment Authority. It focused on developing and testing a method for evaluating the costs of occupational accidents to companies, for use by occupational health and safety professionals. The method was tested in nine Danish companies within three different industry sectors and the costs of 27 selected occupational accidents in these companies were calculated. One of the main conclusions is that the SACA method could be used in all of the companies without revisions. The evaluation of accident costs showed that 2/3 of the costs of occupational accidents are visible in the Danish corporate accounting systems reviewed while 1/3 is hidden from management view. The highest cost of an occupational accident for a company with 3,600 employees was estimated at approximately US$682,000. The paper includes an introduction regarding accident cost analysis in companies, a presentation of the SACA project methodology and the SACA method itself, a short overview of some of the results of the SACA project and a conclusion. Further information about the project is available at http://www.asb.dk/saca.

  5. Analysis of traffic accidents in Romania, 2009.

    PubMed

    Călinoiu, Geovana; Minca, Dana Galieta; Furtunescu, Florentina Ligia

    2012-01-01

    This paper aimed to underline the main consequences of traffic accidents in Romania in 2009 and their associated causes or circumstances. We identified some problematic geographic areas, some critical months or moments of the day and also the most frequent causes; all of these should become targets for future planning. The current analysis provides some priority criteria for public health interventions, so the future national road safety strategy should be in line with the EU objectives, but also with the national priorities. Romania is far from the EU target for 2010 of halving the number of deaths from traffic accidents registered in 2001. To describe the circumstances and the consequences related to traffic accidents registered in Romania for the year 2009, an ecological study was conducted. The traffic accident circumstances were analyzed in terms of magnitude, geographic space, time and cause. The consequences were analyzed in terms of affected people and damaged cars. A total of 28,627 traffic accidents were registered in Romania during 2009; 2,796 people were killed, 27,968 were hospitalized and 42,443 cars were damaged. Three of four accidents were caused by violations on the part of car drivers. The most common violations among car drivers were excess speed and priority violations (52.4%). Among pedestrians, seven of ten accidents were caused by illegal crossing. A higher number of accidents occurred during the summer months and during the evening hours (from 5.00 pm till 8.00 pm). Traffic accidents represent a real public health problem in Romania and a serious burden for the health system. The gap between Romania and the other EU member states needs to be diminished in the next decade. To this end, the future national road safety strategy should be in line with the EU objectives, but also with the national priorities. Research is needed to understand the causes and the socioeconomic impact of traffic accidents and to define appropriate national

  6. Analysis of traffic accident data in Kentucky (1995-1999)

    DOT National Transportation Integrated Search

    2000-09-01

    This report includes an analysis of traffic accident data in Kentucky for the years 1995-1999. A primary objective of this study was to determine average and critical numbers and rates of accidents for various types of highways in rural and urban ar...

  7. GIS based analysis of Intercity Fatal Road Traffic Accidents in Iran

    PubMed Central

    Alizadeh, A; Zare, M; Darparesh, M; Mohseni, S; Soleimani-Ahmadi, M

    2015-01-01

    Road traffic accidents including intercity car traffic accidents (ICTAs) are among the most important causes of morbidity and mortality due to the growing number of vehicles, risky behaviors, and changes in lifestyle of the general population. A sound knowledge of the geographical distribution of car traffic accidents can be considered as an approach towards the accident causation and it can be used as an administrative tool in allocating the sources for traffic accidents prevention. This study was conducted to investigate the geographical distribution and the time trend of fatal intercity car traffic accidents in Iran. To conduct this descriptive study, all Iranian intercity road traffic mortality data were obtained from the Police reports in the Statistical Yearbook of the Governor’s Budget and Planning. The obtained data were for 17 complete Iranian calendar years from March 1997 to March 2012. The incidence rate (IR) of fatal ICTAs for each year was calculated as the total number of fatal ICTAs in every 100000 population in specified time intervals. Figures and maps indicating the trends and geographical distribution of fatal ICTAs were prepared while using Microsoft Excel and ArcGis9.2 software. The number of fatal car accidents showed a general increasing trend from 3000 in 1996 to 13500 in 2012. The incidence of fatal intercity car accidents has changed from six in 100000 population in 1996 to 18 in 100000 population in 2012. GIS based data showed that the incidence rate of ICTAs in different provinces of Iran was very divergent. The highest incidence of fatal ICTAs was in Semnan province (IR= 35.2), followed by North Khorasan (IR=22.7), and South Khorasan (IR=22). The least incidence of fatal ICTAs was in Tehran province (IR=2.4) followed by Khozestan (IR=6.5), and Eastern Azarbayejan (IR=6.6). The compensation cost of fatal ICTAs also showed an increasing trend during the studied period. Since an increasing amount of money was being paid yearly for the
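
    The incidence-rate calculation described above reduces to simple arithmetic; the short Python sketch below shows that calculation, but uses made-up fatality counts and populations rather than the police data analysed in the study.

```python
# Illustrative sketch (hypothetical numbers, not the Iranian police data):
# incidence rate (IR) of fatal intercity traffic accidents per 100,000 population.

def incidence_rate(fatal_accidents: int, population: int, per: int = 100_000) -> float:
    """IR = fatal accidents per `per` population."""
    return fatal_accidents / population * per

# Hypothetical yearly figures for a single province: (fatal accidents, population)
yearly = {2010: (310, 1_650_000), 2011: (295, 1_670_000), 2012: (330, 1_690_000)}
for year, (deaths, pop) in yearly.items():
    print(year, round(incidence_rate(deaths, pop), 1), "per 100,000")
```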

  8. GIS based analysis of Intercity Fatal Road Traffic Accidents in Iran.

    PubMed

    Alizadeh, A; Zare, M; Darparesh, M; Mohseni, S; Soleimani-Ahmadi, M

    2015-01-01

    Road traffic accidents including intercity car traffic accidents (ICTAs) are among the most important causes of morbidity and mortality due to the growing number of vehicles, risky behaviors, and changes in lifestyle of the general population. A sound knowledge of the geographical distribution of car traffic accidents can be considered as an approach towards the accident causation and it can be used as an administrative tool in allocating the sources for traffic accidents prevention. This study was conducted to investigate the geographical distribution and the time trend of fatal intercity car traffic accidents in Iran. To conduct this descriptive study, all Iranian intercity road traffic mortality data were obtained from the Police reports in the Statistical Yearbook of the Governor's Budget and Planning. The obtained data were for 17 complete Iranian calendar years from March 1997 to March 2012. The incidence rate (IR) of fatal ICTAs for each year was calculated as the total number of fatal ICTAs in every 100000 population in specified time intervals. Figures and maps indicating the trends and geographical distribution of fatal ICTAs were prepared while using Microsoft Excel and ArcGis9.2 software. The number of fatal car accidents showed a general increasing trend from 3000 in 1996 to 13500 in 2012. The incidence of fatal intercity car accidents has changed from six in 100000 population in 1996 to 18 in 100000 population in 2012. GIS based data showed that the incidence rate of ICTAs in different provinces of Iran was very divergent. The highest incidence of fatal ICTAs was in Semnan province (IR= 35.2), followed by North Khorasan (IR=22.7), and South Khorasan (IR=22). The least incidence of fatal ICTAs was in Tehran province (IR=2.4) followed by Khozestan (IR=6.5), and Eastern Azarbayejan (IR=6.6). The compensation cost of fatal ICTAs also showed an increasing trend during the studied period. Since an increasing amount of money was being paid yearly for the car

  9. Analysis of helium purification system capability during water ingress accident in RDE

    NASA Astrophysics Data System (ADS)

    Sriyono; Kusmastuti, Rahayu; Bakhri, Syaiful; Sunaryo, Geni Rina

    2018-02-01

    The water ingress accident caused by a steam generator tube rupture (SGTR) in the RDE (Experimental Power Reactor) must be anticipated. During the accident, steam from the secondary system diffuses into and mixes with the helium gas in the primary coolant. To avoid graphite corrosion in the core, the steam must be removed by the helium purification system (HPS). There are two trains in the HPS: the first for normal operation and the second for regeneration and accident conditions. The second train is responsible for cleaning the coolant during accident conditions and is equipped with additional components, i.e. a water cooler, a post-accident blower, and a water separator, to remove this gas mixture. During water ingress, the water released from the ruptured tube mixes with the helium gas. The water cooler acts as a steam condenser, and the steam is then separated from the helium gas by the water separator. This paper analyses the capability of the HPS during a water ingress accident. The goal of the research is to determine the time consumed by the HPS to remove the total amount of ingressed water. The method used is modelling and simulation of the HPS using the ChemCAD software. Both the beyond design basis accident (BDBA) and design basis accident (DBA) scenarios are simulated. In the BDBA scenario, up to 110 kg of water is assumed to infiltrate the primary coolant, while in the DBA scenario up to 35 kg is assumed. The ChemCAD simulation shows that the second train purifies the ingressed steam in at most 0.5 hours. The HPS of the RDE therefore has the capability to cope with a water ingress accident.

  10. Glider accidents: an analysis of 143 cases, 2001-2005.

    PubMed

    van Doorn, Robert R A; de Voogt, Alexander J

    2007-01-01

    The majority of aviation crashes and casualties take place in general and sport aviation. Although gliding has gained popularity in recent decades, we could find no systematic analysis of glider accidents. This study determined factors associated with both non-fatal and fatal glider accidents to document their position within sport and general aviation accidents, and to suggest preventive measures and improvements. We performed a retrospective review of glider accidents for the period 2001-2005 in the database maintained by the U.S. National Transportation Safety Board (NTSB). A total of 117 non-fatal and 26 fatal glider accidents were reported for the 5-yr period. Adverse weather was the cause in 20% of all non-fatal accidents, 60% of which occurred in the cruise phase. Logistic regression revealed that fatal accidents were predicted by pilot error, flight phase, and home-built aircraft. Factors contributing to glider crashes are specific to this type of sport aviation. Owners of home-built gliders should pay particular attention to the aircraft's specifications and design limits.

  11. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences on the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 and the undermining of 29 pipelines by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA record. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  12. [An analysis of industrial accidents in the working field with a particular emphasis on repeated accidents].

    PubMed

    Wakisaka, I; Yanagihashi, T; Tomari, T; Sato, M

    1990-03-01

    The present study is based on an analysis of routinely submitted reports of occupational accidents experienced by the workers of industrial enterprises under the jurisdiction of the Kagoshima Labor Standard Office during the 5-year period 1983 to 1987. Officially notified injuries serious enough to keep employees away from work for at least 4 days were utilized in this study. Data were classified so as to give an observed frequency distribution for workers having any specified number of accidents. Also, the accident rate, which is an indicator of the risk of accident, was compared among different occupations, between age groups and between the sexes. The results obtained are as follows: 1) For the combined total of 6,324 accident cases for 8 types of occupation (Construction, Transportation, Mining & Quarrying, Forestry, Food manufacture, Lumber & Woodcraft, Manufacturing industry and Other business), the number of those who had at least one accident was 6,098, of which 5,837 were injured only once, 208 twice, 21 three times and 2 four times. When occupation type was fixed, however, the numbers of workers having one, two, three and four accidents were 5,895, 182, 19 and 2, respectively. This suggests that some workers are likely to have experienced repeated accidents in more than one type of occupation.(ABSTRACT TRUNCATED AT 250 WORDS)

  13. Anthropotechnological analysis of industrial accidents in Brazil.

    PubMed Central

    Binder, M. C.; de Almeida, I. M.; Monteau, M.

    1999-01-01

    The Brazilian Ministry of Labour has been attempting to modify the norms used to analyse industrial accidents in the country. For this purpose, in 1994 it tried to make compulsory use of the causal tree approach to accident analysis, an approach developed in France during the 1970s, without having previously determined whether it is suitable for use under the industrial safety conditions that prevail in most Brazilian firms. In addition, opposition from Brazilian employers has blocked the proposed changes to the norms. The present study employed anthropotechnology to analyse experimental application of the causal tree method to work-related accidents in industrial firms in the region of Botucatu, São Paulo. Three work-related accidents were examined in three industrial firms representative of local, national and multinational companies. On the basis of the accidents analysed in this study, the rationale for the use of the causal tree method in Brazil can be summarized for each type of firm as follows: the method is redundant if there is a predominance of the type of risk whose elimination or neutralization requires adoption of conventional industrial safety measures (firm representative of local enterprises); the method is worth while if the company's specific technical risks have already largely been eliminated (firm representative of national enterprises); and the method is particularly appropriate if the firm has a good safety record and the causes of accidents are primarily related to industrial organization and management (multinational enterprise). PMID:10680249

  14. Prediction accident triangle in maintenance of underground mine facilities using Poisson distribution analysis

    NASA Astrophysics Data System (ADS)

    Khuluqi, M. H.; Prapdito, R. R.; Sambodo, F. P.

    2018-04-01

    In Indonesia, mining is categorized as a hazardous industry. In recent years, a dramatic increase in mining equipment and technological complexity has resulted in higher maintenance expectations, accompanied by changes in working conditions, especially with regard to safety. Ensuring safety while conducting maintenance work in underground mines is important as an integral part of accident prevention programs. The accident triangle provides support to safety practitioners in drawing a road map for preventing accidents. The Poisson distribution is appropriate for the analysis of accidents at a specific site in a given time period. Based on the analysis of accident statistics in the underground mine maintenance of PT. Freeport Indonesia from 2011 through 2016, it is found that there are 12 minor accidents for every major accident and 66 equipment damage incidents for every major accident, giving a new value of the accident triangle. The result can be used to improve future accident prevention programs.
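
    As a rough illustration of the kind of calculation described above, the sketch below derives an accident-triangle ratio from hypothetical event counts (not the PT. Freeport Indonesia statistics) and uses a Poisson model to estimate the chance of at least one major accident in a future year.

```python
# Minimal sketch under assumed numbers: accident-triangle ratio and a Poisson
# estimate of the probability of observing at least one major accident next year.
from scipy.stats import poisson

major, minor, damage = 5, 60, 330          # hypothetical 6-year totals (2011-2016)
years = 6.0

print("triangle ratio  1 :", minor // major, ":", damage // major)

lam = major / years                         # mean major accidents per year
p_at_least_one = poisson.sf(0, mu=lam)      # P(X >= 1) = 1 - P(X = 0)
print(f"P(>=1 major accident next year) = {p_at_least_one:.2f}")
```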

  15. Computer-assisted qualitative data analysis software.

    PubMed

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  16. INDUSTRIAL/MILITARY ACTIVITY-INITIATED ACCIDENT SCREENING ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.A. Kalinich

    1999-09-27

    Impacts due to nearby installations and operations were determined in the Preliminary MGDS Hazards Analysis (CRWMS M&O 1996) to be potentially applicable to the proposed repository at Yucca Mountain. This determination was conservatively based on limited knowledge of the potential activities ongoing on or off the Nevada Test Site (NTS). It is intended that the Industrial/Military Activity-Initiated Accident Screening Analysis provided herein will meet the requirements of the ''Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants'' (NRC 1987) in establishing whether this external event can be screened from further consideration or must be included as a design basis event (DBE) in the development of accident scenarios for the Monitored Geologic Repository (MGR). This analysis only considers issues related to preclosure radiological safety. Issues important to waste isolation as related to impact from nearby installations will be covered in the MGR performance assessment.

  17. Analysis Software

    NASA Technical Reports Server (NTRS)

    1994-01-01

    General Purpose Boundary Element Solution Technology (GPBEST) software employs the boundary element method of mechanical engineering analysis, as opposed to finite element. It is, according to one of its developers, 10 times faster in data preparation and more accurate than other methods. Its use results in less expensive products because the time between design and manufacturing is shortened. A commercial derivative of a NASA-developed computer code, it is marketed by Best Corporation to solve problems in stress analysis, heat transfer, fluid analysis and yielding and cracking of solids. Other applications include designing tractor and auto parts, household appliances and acoustic analysis.

  18. Scenario analysis of freight vehicle accident risks in Taiwan.

    PubMed

    Tsai, Ming-Chih; Su, Chien-Chih

    2004-07-01

    This study develops a quantitative risk model utilizing the Generalized Linear Interactive Model (GLIM) to analyze major freight vehicle accidents in Taiwan. Eight scenarios are established by interacting three categorical variables, driver age, vehicle type and road type, each of which contains two levels. A database of 2,043 major accidents occurring between 1994 and 1998 in Taiwan is utilized to fit and calibrate the model parameters. The empirical results indicate that accident rates of freight vehicles in Taiwan were high in the scenarios involving trucks and non-freeway systems, while accident consequences were severe in the scenarios involving mature drivers or non-freeway systems. Empirical evidence also shows that there is no significant relationship between accident rates and accident consequences. This stresses that safety studies which describe risk merely as accident rates, rather than as the combination of accident rates and consequences, might lead to biased risk perceptions. Finally, the study recommends using the number of vehicles as an alternative measure of traffic exposure in commercial vehicle risk analysis. The merit of this is that it is simple and thus reliable; meanwhile, the resulting risk, expressed as fatalities per vehicle, could provide clear and direct policy implications for insurance practices and safety regulations.
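
    A GLIM of this general kind can be sketched as a Poisson generalized linear model with the number of vehicles as an exposure term; the example below uses an invented data frame and main effects only, and is not the authors' calibrated model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Invented scenario-level data: accident counts by driver age, vehicle type and
# road type, with registered vehicles as the exposure measure.
df = pd.DataFrame({
    "accidents": [52, 34, 91, 47, 63, 29, 88, 41],
    "vehicles":  [12000, 9000, 15000, 8000, 11000, 7000, 14000, 9500],
    "mature_driver": [0, 0, 0, 0, 1, 1, 1, 1],
    "truck":         [1, 1, 0, 0, 1, 1, 0, 0],
    "freeway":       [1, 0, 1, 0, 1, 0, 1, 0],
})

# Poisson GLM for accident rates; log(vehicles) enters as an offset so that the
# coefficients describe rates per vehicle rather than raw counts.
model = smf.glm("accidents ~ mature_driver + truck + freeway",
                data=df,
                family=sm.families.Poisson(),
                offset=np.log(df["vehicles"]))
print(model.fit().summary())
```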

  19. Bus accident analysis of routes with/without bus priority.

    PubMed

    Goh, Kelvin Chun Keong; Currie, Graham; Sarvi, Majid; Logan, David

    2014-04-01

    This paper summarises findings on road safety performance and bus-involved accidents in Melbourne along roads where bus priority measures had been applied. Results from an empirical analysis of the accident types revealed a significant reduction in the proportion of accidents involving buses hitting stationary objects and vehicles, which suggests that bus priority helps address manoeuvrability issues for buses. Mixed-effects negative binomial (MENB) regression and back-propagation neural network (BPNN) modelling of bus accidents, considering wider influences on accident rates at a route-section level, also revealed significant safety benefits when bus priority is provided. Sensitivity analyses done on the BPNN model showed general agreement in the predicted accident frequency between both models. The slightly better performance recorded by the MENB model suggests merit in adopting a mixed-effects modelling approach for accident count prediction in practice, given its capability to account for unobserved location- and time-specific factors. A major implication of this research is that bus priority in Melbourne's context acts to improve road safety and should be a major consideration for road management agencies when implementing bus priority and road schemes. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Content analysis of 100 consecutive media reports of amusement ride accidents.

    PubMed

    Woodcock, Kathryn

    2008-01-01

    Accident investigations influence public perceptions and safety management strategies by determining the amount and type of information learned about the accident. To examine the factors considered in investigations, this study used a content analysis of 100 consecutive media reports of amusement ride accidents from an online media archive. Fatalities were overrepresented in the media dataset compared with U.S. national estimates. For analysis of reports, a modified "Haddon matrix" was developed using human-factors categories. This approach was useful to show differences between the proportions and types of factors considered in the different accident stages and between employee and rider accidents. Employee injury accounts primarily referred to the employee's task and to the employee. Rider injury reports were primarily related to the ride device itself and rarely referred to the rider's "task", social influences, or the rider's own actions, and only some reference to their characteristics. Qualitatively, it was evident that more human factors analysis is required to augment scant pre-failure information about the task, social environment, and the person, to make that information available for prevention of amusement ride accidents. By design, this study reflected information reported by the media. Future work will use the same techniques with official reports.

  1. NASA's Accident Precursor Analysis Process and the International Space Station

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Lutomski, Michael

    2010-01-01

    This viewgraph presentation reviews the implementation of Accident Precursor Analysis (APA), as well as the evaluation of In-Flight Investigations (IFI) and Problem Reporting and Corrective Action (PRACA) data for the identification of unrecognized accident potentials on the International Space Station.

  2. Major Accidents (Gray Swans) Likelihood Modeling Using Accident Precursors and Approximate Reasoning.

    PubMed

    Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-07-01

    Compared to the remarkable progress in risk analysis of normal accidents, the risk analysis of major accidents has not been so well-established, partly due to the complexity of such accidents and partly due to the low probabilities involved. The issue of low probabilities normally arises from the scarcity of major accidents' relevant data since such accidents are few and far between. In this work, knowing that major accidents are frequently preceded by accident precursors, a novel precursor-based methodology has been developed for likelihood modeling of major accidents in critical infrastructures based on a unique combination of accident precursor data, information theory, and approximate reasoning. For this purpose, we have introduced an innovative application of information analysis to identify the most informative near accident of a major accident. The observed data of the near accident were then used to establish predictive scenarios to foresee the occurrence of the major accident. We verified the methodology using offshore blowouts in the Gulf of Mexico, and then demonstrated its application to dam breaches in the United States. © 2015 Society for Risk Analysis.
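
    The abstract does not spell out how the information-theoretic step is performed; as a loose, assumption-laden illustration of one plausible reading, the sketch below ranks hypothetical precursor indicators by the mutual information they share with a binary major-accident outcome. The indicator names and values are invented and are not taken from the paper.

```python
# Hedged sketch: ranking hypothetical precursor indicators by mutual information
# with a major-accident outcome. All records below are invented placeholders.
from sklearn.metrics import mutual_info_score

kick_detected   = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
bop_test_failed = [0, 0, 1, 0, 0, 1, 1, 0, 0, 1]
mud_loss        = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
major_event     = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]

precursors = {"kick_detected": kick_detected,
              "bop_test_failed": bop_test_failed,
              "mud_loss": mud_loss}

# Higher mutual information = more informative precursor about the outcome.
for name in sorted(precursors,
                   key=lambda k: mutual_info_score(precursors[k], major_event),
                   reverse=True):
    print(name, round(mutual_info_score(precursors[name], major_event), 3))
```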

  3. An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.

    PubMed

    Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes

    2017-10-01

    This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.

  4. Accident Analysis for the NIST Research Reactor Before and After Fuel Conversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baek J.; Diamond D.; Cuadra, A.

    Postulated accidents have been analyzed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The analysis has been carried out for the present core, which contains high enriched uranium (HEU) fuel and for a proposed equilibrium core with low enriched uranium (LEU) fuel. The analyses employ state-of-the-art calculational methods. Three-dimensional Monte Carlo neutron transport calculations were performed with the MCNPX code to determine homogenized fuel compositions in the lower and upper halves of each fuel element and to determine the resulting neutronic properties of the core. The accident analysis employed a model of the primary loop with the RELAP5 code. The model includes the primary pumps, shutdown pumps outlet valves, heat exchanger, fuel elements, and flow channels for both the six inner and twenty-four outer fuel elements. Evaluations were performed for the following accidents: (1) control rod withdrawal startup accident, (2) maximum reactivity insertion accident, (3) loss-of-flow accident resulting from loss of electrical power with an assumption of failure of shutdown cooling pumps, (4) loss-of-flow accident resulting from a primary pump seizure, and (5) loss-of-flow accident resulting from inadvertent throttling of a flow control valve. In addition, natural circulation cooling at low power operation was analyzed. The analysis shows that the conversion will not lead to significant changes in the safety analysis and the calculated minimum critical heat flux ratio and maximum clad temperature assure that there is adequate margin to fuel failure.

  5. National and regional analysis of road accidents in Spain.

    PubMed

    Tolón-Becerra, A; Lastra-Bravo, X; Flores-Parra, I

    2013-01-01

    In Spain, the absolute fatality figures decreased almost 50 percent between 1998 and 2009. Despite this great effort, road mortality is still of great concern to political authorities. Further progress requires efficient road safety policy based on an optimal set of measures and targets that consider the initial conditions and characteristics in each region. This study attempts to analyze road accidents in Spain and its provinces in time and space during 1998-2009. First, we analyzed daily, monthly, and nationwide (NUTS 0) development of road accidents, the correlation between logarithmic transformations of road accidents and territorial and socioeconomic variables, the causality by simple linear regression of road accidents and territorial and socioeconomic variables, and preliminary frequency by fast Fourier transform. Then we analyzed the annual trend in accidents in the Spanish provinces (NUTS 3) and found a correlation between the logarithmic transformations of the mortality rate, fatalities per fatal accident, and accidents resulting in injuries per inhabitant variables and population, population density, gross domestic product (GDP), length of road network, and area. Finally, causality was analyzed by simple linear regression. The most outstanding results were the negative correlation between mortality rate and population density in Spanish provinces, which has increased over time, and that road accidents in Spain have an approximate periodicity of 57 days. The fast Fourier transform analysis of road accident frequency in Spain was useful in identifying the periodic, harmonic components of accidents and casualties. The periodicity observed both for the period 1998-2009 and by year showed that the highest intensity in road accidents was bimonthly, despite the lower number of accidents and casualties in the spectra of amplitude and power and efforts to reduce the intensity and concentration during off-season travel (summer and December).
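
    The fast Fourier transform step described above can be sketched on a daily accident-count series as follows; the synthetic series below simply embeds an artificial 57-day cycle so that the dominant period recovered from the spectrum is easy to check, and is not the Spanish data.

```python
import numpy as np

# Synthetic daily accident counts with an embedded 57-day cycle plus noise.
rng = np.random.default_rng(0)
days = np.arange(4383)                       # 12 years of daily observations
counts = 280 + 25 * np.sin(2 * np.pi * days / 57) + rng.normal(0, 10, days.size)

# Amplitude spectrum via FFT; the zero-frequency (mean) component is ignored.
spectrum = np.abs(np.fft.rfft(counts - counts.mean()))
freqs = np.fft.rfftfreq(counts.size, d=1.0)  # cycles per day

dominant = freqs[1:][np.argmax(spectrum[1:])]
print(f"dominant period ~ {1 / dominant:.1f} days")
```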

  6. Analysis of Child-related Road Traffic Accidents in Vietnam

    NASA Astrophysics Data System (ADS)

    Vu, Anh Tuan; Nguyen, Dinh Vinh Man

    2018-04-01

    In recent years, the number of road traffic accidents, fatalities and injuries has been decreasing, but the figures for road traffic accidents involving children have been increasing in Ho Chi Minh City, Vietnam. This fact strongly calls for the local government to implement effective solutions to improve traffic safety for children. This paper presents the trends, patterns and causes of road traffic accidents involving children based on an analysis of road traffic accident data over the period 2010-2015 and video-based observations of road traffic law violations at 15 typical school gates and 10 typical roads. The results could be useful for the city government in formulating solutions to effectively improve traffic safety for children in Ho Chi Minh City and other cities in Vietnam.

  7. Truck accidents at freeway ramps : data analysis and high-risk site identification

    DOT National Transportation Integrated Search

    1998-01-01

    To examine the relationship of ramp design to truck accident rates, this paper presents an analysis of truck accidents in Washington State, plus a comparison to limited data from Colorado and California. The authors group freeway truck accidents by r...

  8. Study on the Accident-causing of Foundation Pit Engineering

    NASA Astrophysics Data System (ADS)

    Shuicheng, Tian; Xinyue, Zhang; Pengfei, Yang; Longgang, Chen

    2018-05-01

    With the development of high-rise buildings and underground space, a large number of foundation pit projects have been undertaken. Frequent accidents cause great losses to society, and reducing the frequency of foundation pit accidents has become one of the most urgent problems to be solved. Therefore, analysing the influencing factors of foundation pit engineering accidents and studying their causes is of great significance for improving the safety management of foundation pit engineering and reducing the incidence of accidents. Firstly, based on a literature review and questionnaires, this paper selected construction management, survey, design, construction, supervision and monitoring as research factors; the AHP method and the DEMATEL method were used to analyse the weights of the various influencing factors and to screen indicators in order to determine the final system of causes of foundation pit accidents. Secondly, SPSS 21.0 software was used to test the reliability and validity of the returned questionnaire data, and AMOS 7.0 software was used to fit, evaluate, and interpret the specified model. Finally, the paper analyses the influencing factors of foundation pit engineering accidents, and corresponding management countermeasures and suggestions are put forward.

  9. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  10. Analysis on Dangerous Source of Large Safety Accident in Storage Tank Area

    NASA Astrophysics Data System (ADS)

    Wang, Tong; Li, Ying; Xie, Tiansheng; Liu, Yu; Zhu, Xueyuan

    2018-01-01

    The difference between a large safety accident and a general accident is that the consequences of a large safety accident are particularly serious. This paper studies which factors in the tank area directly or indirectly lead to the occurrence of large safety accidents. According to the three kinds of hazard source theory and a consequence-cause analysis of large safety accidents, the paper analyses the hazard sources of large safety accidents in the tank area from four aspects: energy sources, direct causes of large safety accidents, missing management, and environmental impact. Based on the analysis of the three kinds of hazard sources and the environmental analysis, the main risk factors are derived and an AHP evaluation model is established; after rigorous and scientific calculation, the weights of the four categories of risk factors and of the factors within each category are obtained. The result of the analytic hierarchy process shows that management causes are the most important, followed by environmental factors, direct causes and energy sources. It should be noted that although the direct causes have relatively low overall importance, within that category the failure of emergency measures and the failure of prevention and control facilities carry greater weight.

  11. The accident analysis of mobile mine machinery in Indian opencast coal mines.

    PubMed

    Kumar, R; Ghosh, A K

    2014-01-01

    This paper presents the analysis of large mining machinery related accidents in Indian opencast coal mines. The trends of coal production, share of mining methods in production, machinery deployment in open cast mines, size and population of machinery, accidents due to machinery, types and causes of accidents have been analysed from the year 1995 to 2008. The scrutiny of accidents during this period reveals that most of the responsible factors are machine reversal, haul road design, human fault, operator's fault, machine fault, visibility and dump design. Considering the types of machines, namely, dumpers, excavators, dozers and loaders together the maximum number of fatal accidents has been caused by operator's faults and human faults jointly during the period from 1995 to 2008. The novel finding of this analysis is that large machines with state-of-the-art safety system did not reduce the fatal accidents in Indian opencast coal mines.

  12. Reliability, Safety and Error Recovery for Advanced Control Software

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2003-01-01

    For long-duration automated operation of regenerative life support systems in space environments, there is a need for advanced integration and control systems that are significantly more reliable and safe, and that support error recovery and minimization of operational failures. This presentation outlines some challenges of hazardous space environments and complex system interactions that can lead to system accidents. It discusses approaches to hazard analysis and error recovery for control software and challenges of supporting effective intervention by safety software and the crew.

  13. Overdose problem associated with treatment planning software for high energy photons in response of Panama's accident.

    PubMed

    Attalla, Ehab M; Lotayef, Mohamed M; Khalil, Ehab M; El-Hosiny, Hesham A; Nazmy, Mohamed S

    2007-06-01

    The purpose of this study was to quantify dose distribution errors by comparing actual dose measurements with the values calculated by the software. To evaluate the outcome of the radiation overexposure related to Panama's accident, and to ensure that the treatment planning systems (T.P.S.) are operated in accordance with the appropriate quality assurance programme, we studied the central axis and peripheral depth dose data using complex fields shaped with blocks to quantify dose distribution errors. Multidata T.P.S. software versions 2.35 and 2.40 and Helax T.P.S. software version 5.1 B were assessed. The calculated data of the treatment planning systems were verified by comparing these data with the actual dose measurements for open and blocked high energy photon fields (Co-60, 6 MV and 18 MV photons). Close agreement between calculated and measured results was obtained for the 2-D (Multidata) and 3-D (TMS Helax) treatment planning systems. These results were correct to within 1 to 2% for open fields and 0.5 to 2.5% for peripheral blocked fields. Discrepancies between calculated and measured data ranged between 13 and 36% along the central axis of complex blocked fields when the normalisation point was selected at Dmax; when the normalisation point was selected near or under the blocks, the variation between the calculated and the measured data was up to 500%. The present results emphasize the importance of the proper selection of the normalization point in the radiation field, as this facilitates detection of aberrant dose distributions (overexposure or underexposure).

  14. Systemic accident analysis: examining the gap between research and practice.

    PubMed

    Underwood, Peter; Waterson, Patrick

    2013-06-01

    The systems approach is arguably the dominant concept within accident analysis research. Viewing accidents as a result of uncontrolled system interactions, it forms the theoretical basis of various systemic accident analysis (SAA) models and methods. Despite the proposed benefits of SAA, such as an improved description of accident causation, evidence within the scientific literature suggests that these techniques are not being used in practice and that a research-practice gap exists. The aim of this study was to explore the issues stemming from research and practice which could hinder the awareness, adoption and usage of SAA. To achieve this, semi-structured interviews were conducted with 42 safety experts from ten countries and a variety of industries, including rail, aviation and maritime. This study suggests that the research-practice gap should be closed and efforts to bridge the gap should focus on ensuring that systemic methods meet the needs of practitioners and improving the communication of SAA research. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Road Traffic Accident Analysis of Ajmer City Using Remote Sensing and GIS Technology

    NASA Astrophysics Data System (ADS)

    Bhalla, P.; Tripathi, S.; Palria, S.

    2014-12-01

    With advances in technology, new and sophisticated vehicle models are available and their numbers are increasing day by day. A traffic accident has multi-facet characteristics associated with it. In India 93% of crashes occur due to human-induced factors (wholly or partly). GIS technology has become an indispensable tool for proper traffic accident analysis. The traditional accident database is a summary spreadsheet format using codes and mileposts to denote the location, type and severity of accidents. A geo-referenced accident database is location-referenced: it incorporates a GIS graphical interface with the accident information to allow query searches on various accident attributes. Ajmer city, headquarters of Ajmer district, Rajasthan, has been selected as the study area. According to police records, 1,531 accidents occurred during 2009-2013. The maximum number of accidents occurred in 2009 and the maximum number of deaths in 2013. Cars, jeeps, autos, pickups and tempos are mostly responsible for accidents, and the occurrence of accidents is mostly concentrated between 4 PM and 10 PM. GIS has proved to be a good tool for analyzing the multifaceted nature of accidents. While road safety is a critical issue, it is handled in an ad hoc manner. This study is a demonstration of the application of GIS for developing an efficient database on road accidents, taking Ajmer city as a case study. If this type of database is developed for other cities, a proper analysis of accidents can be undertaken and suitable management strategies for traffic regulation can be successfully proposed.
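
    Outside a full GIS, the kind of hotspot identification described above can be approximated with a kernel density estimate over accident coordinates; the sketch below uses randomly generated points rather than the Ajmer police records and simply reports the densest grid cell.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Invented accident locations (projected x/y coordinates in metres), with an
# artificial cluster standing in for a high-accident corridor.
rng = np.random.default_rng(1)
background = rng.uniform(0, 5000, size=(2, 400))
cluster = rng.normal(loc=[[3200.0], [1800.0]], scale=150.0, size=(2, 120))
xy = np.hstack([background, cluster])

# Kernel density estimate evaluated on a coarse grid; the peak cell marks the hotspot.
kde = gaussian_kde(xy)
gx, gy = np.meshgrid(np.linspace(0, 5000, 50), np.linspace(0, 5000, 50))
density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

i, j = np.unravel_index(np.argmax(density), density.shape)
print(f"hotspot near x={gx[i, j]:.0f} m, y={gy[i, j]:.0f} m")
```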

  16. ADAM: An Accident Diagnostic,Analysis and Management System - Applications to Severe Accident Simulation and Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zavisca, M.J.; Khatib-Rahbar, M.; Esmaili, H.

    2002-07-01

    The Accident Diagnostic, Analysis and Management (ADAM) computer code has been developed as a tool for on-line applications to accident diagnostics, simulation, management and training. ADAM's severe accident simulation capabilities incorporate a balance of mechanistic, phenomenologically based models with simple parametric approaches for elements including (but not limited to) thermal hydraulics; heat transfer; fuel heatup, meltdown, and relocation; fission product release and transport; combustible gas generation and combustion; and core-concrete interaction. The overall model is defined by a relatively coarse spatial nodalization of the reactor coolant and containment systems and is advanced explicitly in time. The result is to enable much faster than real time (i.e., 100 to 1000 times faster than real time on a personal computer) applications to on-line investigations and/or accident management training. Other features of the simulation module include provision for activation of water injection, including the Engineered Safety Features, as well as other mechanisms for the assessment of accident management and recovery strategies and the evaluation of PSA success criteria. The accident diagnostics module of ADAM uses on-line access to selected plant parameters (as measured by plant sensors) to compute the thermodynamic state of the plant, and to predict various margins to safety (e.g., times to pressure vessel saturation and steam generator dryout). Rule-based logic is employed to classify the measured data as belonging to one of a number of likely scenarios based on symptoms, and a number of 'alarms' are generated to signal the state of the reactor and containment. This paper will address the features and limitations of ADAM with particular focus on accident simulation and management. (authors)

  17. Semantic Metrics for Analysis of Software

    NASA Technical Reports Server (NTRS)

    Etzkorn, Letha H.; Cox, Glenn W.; Farrington, Phil; Utley, Dawn R.; Ghalston, Sampson; Stein, Cara

    2005-01-01

    A recently conceived suite of object-oriented software metrics focuses on semantic aspects of software, in contradistinction to traditional software metrics, which focus on syntactic aspects of software. Semantic metrics represent a more human-oriented view of software than do syntactic metrics. The semantic metrics of a given computer program are calculated by use of the output of a knowledge-based analysis of the program, and are substantially more representative of software quality and more readily comprehensible from a human perspective than are the syntactic metrics.

  18. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    NASA Astrophysics Data System (ADS)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
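
    Concentration calculations in activation analysis are commonly done by the relative (comparator) method; the sketch below shows that arithmetic under simplified assumptions (identical irradiation, with decay and geometry corrections already applied), using invented peak areas and masses. It is only an illustration of the general technique and does not reproduce the formulas in the program described above.

```python
def concentration_relative(area_sample: float, mass_sample: float,
                           area_standard: float, mass_standard: float,
                           conc_standard: float) -> float:
    """Relative-method concentration: specific count rates scaled by the
    standard's known concentration (all corrections assumed equal)."""
    specific_sample = area_sample / mass_sample
    specific_standard = area_standard / mass_standard
    return conc_standard * specific_sample / specific_standard

# Invented example: net peak areas (counts), masses (g), standard at 50 mg/kg.
print(concentration_relative(area_sample=12500, mass_sample=2.0,
                             area_standard=18000, mass_standard=1.5,
                             conc_standard=50.0))  # -> ~26 mg/kg
```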

  19. Kubios HRV--heart rate variability analysis software.

    PubMed

    Tarvainen, Mika P; Niskanen, Juha-Pekka; Lipponen, Jukka A; Ranta-Aho, Perttu O; Karjalainen, Pasi A

    2014-01-01

    Kubios HRV is an advanced and easy to use software for heart rate variability (HRV) analysis. The software supports several input data formats for electrocardiogram (ECG) data and beat-to-beat RR interval data. It includes an adaptive QRS detection algorithm and tools for artifact correction, trend removal and analysis sample selection. The software computes all the commonly used time-domain and frequency-domain HRV parameters and several nonlinear parameters. There are several adjustable analysis settings through which the analysis methods can be optimized for different data. The ECG derived respiratory frequency is also computed, which is important for reliable interpretation of the analysis results. The analysis results can be saved as an ASCII text file (easy to import into MS Excel or SPSS), Matlab MAT-file, or as a PDF report. The software is easy to use through its compact graphical user interface. The software is available free of charge for Windows and Linux operating systems at http://kubios.uef.fi. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
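
    Two of the standard time-domain HRV parameters that such software reports, SDNN and RMSSD, reduce to simple formulas over the RR-interval series; the sketch below computes them for a short invented RR sequence and is not Kubios HRV code.

```python
import numpy as np

# Invented RR intervals in milliseconds.
rr = np.array([812, 790, 805, 821, 798, 840, 815, 802, 830, 810], dtype=float)

sdnn = rr.std(ddof=1)                        # standard deviation of RR intervals
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # root mean square of successive differences
mean_hr = 60_000.0 / rr.mean()               # mean heart rate in beats per minute

print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms, mean HR = {mean_hr:.1f} bpm")
```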

  20. Who by accident? The social morphology of car accidents.

    PubMed

    Factor, Roni; Yair, Gad; Mahalel, David

    2010-09-01

    Prior studies in the sociology of accidents have shown that different social groups have different rates of accident involvement. This study extends those studies by implementing Bourdieu's relational perspective of social space to systematically explore the homology between drivers' social characteristics and their involvement in specific types of motor vehicle accident. Using a large database that merges official Israeli road-accident records with socioeconomic data from two censuses, this research maps the social order of road accidents through multiple correspondence analysis. Extending prior studies, the results show that different social groups indeed tend to be involved in motor vehicle accidents of different types and severity. For example, we find that drivers from low socioeconomic backgrounds are overinvolved in severe accidents with fatal outcomes. The new findings reported here shed light on the social regularity of road accidents and expose new facets in the social organization of death. © 2010 Society for Risk Analysis.

  1. [Development of medical emergency response system for accidents due to chemicals in Chongqing municipality].

    PubMed

    Ning, Xu; Dong, Zhao-jun; Mu, Ling; Zhai, Jian-cai

    2006-12-01

    To plan and develop a Chongqing chemical accident rescue command system. Based on the modes of leakage and diffusion of various poisonous gases and chemicals, different modes of injuries produced, and their appropriate rescue and treatments, also taking the following factors such as the condition of storage of chemicals, meteorological and geographic conditions, medical institutions and equipment, and their rescuing capacity into consideration, a plan was drafted to establish the rescue system. Real-time simulation technology, data analysis, evaluation technology and database technology were employed in the planning. Using Visual Studio 6.0 as the software development platform, this project aimed to design the software of an emergency command system for chemical accidents in Chongqing which could be operated with the Windows 2000/XP operating system. This system provided a dynamic scope of the endangered area, casualty number estimates, and recommendation of measures and a rescue plan for various chemical accidents. Furthermore, the system helped retrieve comprehensive information regarding the physical and chemical characteristics of more than 4 200 dangerous poisonous chemicals and their appropriate treatment modalities. This system is easy to operate with a friendly interface, functions rapidly and can provide real-time analysis with comparatively precise results. This system could satisfy the requirements of executing the command and the rescue of a chemical accident with good prospects of application.

  2. Wet weather highway accident analysis and skid resistance data management system (volume I).

    DOT National Transportation Integrated Search

    1992-06-01

    The objectives and scope of this research are to establish an effective methodology for wet weather accident analysis and to develop a database management system to facilitate information processing and storage for the accident analysis process, skid...

  3. Estimating the causes of traffic accidents using logistic regression and discriminant analysis.

    PubMed

    Karacasu, Murat; Ergül, Barış; Altin Yavuz, Arzu

    2014-01-01

    Factors that affect traffic accidents have been analysed in various ways. In this study, we use the methods of logistic regression and discriminant analysis to determine the damages due to injury and non-injury accidents in the Eskisehir Province. Data were obtained from the accident reports of the General Directorate of Security in Eskisehir; 2552 traffic accidents between January and December 2009 were investigated regarding whether they resulted in injury. According to the results, the effects of traffic accidents were reflected in the variables. These results provide a wealth of information that may aid future measures toward the prevention of undesired results.
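
    As a toy counterpart to the two methods named above, the sketch below fits a logistic regression and a linear discriminant analysis to a small invented accident table (injury outcome versus a few crash attributes); it is not the Eskisehir dataset or the authors' model specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Invented features per accident: [speeding (0/1), night-time (0/1), dry road (0/1)]
X = np.array([[1, 1, 0], [0, 0, 1], [1, 0, 0], [0, 1, 1],
              [1, 1, 1], [0, 0, 0], [0, 1, 0], [1, 0, 1]])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])   # 1 = injury accident, 0 = damage only

logit = LogisticRegression().fit(X, y)
lda = LinearDiscriminantAnalysis().fit(X, y)

print("logistic coefficients:", logit.coef_.round(2))
print("LDA predictions:      ", lda.predict(X))
```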

  4. LV software support for supersonic flow analysis

    NASA Technical Reports Server (NTRS)

    Bell, W. A.; Lepicovsky, J.

    1992-01-01

    The software for configuring an LV counter processor system has been developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system has been developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.

  5. LV software support for supersonic flow analysis

    NASA Technical Reports Server (NTRS)

    Bell, William A.

    1992-01-01

    The software for configuring a Laser Velocimeter (LV) counter processor system was developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system was developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.
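
    One standard approach to spectral analysis of nonuniformly sampled measurements, such as the LV counter data mentioned at the end of the abstract above, is the Lomb-Scargle periodogram; the sketch below applies SciPy's implementation to a synthetic irregularly sampled signal and is only an illustration of the general technique, not the report's software.

```python
import numpy as np
from scipy.signal import lombscargle

# Synthetic nonuniformly sampled velocity signal: random sample times (as from
# an LV counter processor) with a 500 Hz oscillation plus noise.
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 0.2, 1500))             # seconds, irregular spacing
v = 1.0 + 0.3 * np.sin(2 * np.pi * 500.0 * t) + rng.normal(0, 0.05, t.size)

# Lomb-Scargle periodogram over a band of candidate frequencies (given in rad/s).
freqs_hz = np.linspace(50.0, 1000.0, 2000)
power = lombscargle(t, v - v.mean(), 2 * np.pi * freqs_hz)

print(f"dominant frequency ~ {freqs_hz[np.argmax(power)]:.0f} Hz")
```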

  6. Structural Analysis for the American Airlines Flight 587 Accident Investigation: Global Analysis

    NASA Technical Reports Server (NTRS)

    Young, Richard D.; Lovejoy, Andrew E.; Hilburger, Mark W.; Moore, David F.

    2005-01-01

    NASA Langley Research Center (LaRC) supported the National Transportation Safety Board (NTSB) in the American Airlines Flight 587 accident investigation due to LaRC's expertise in high-fidelity structural analysis and testing of composite structures and materials. A Global Analysis Team from LaRC reviewed the manufacturer's design and certification procedures, developed finite element models and conducted structural analyses, and participated jointly with the NTSB and Airbus in subcomponent tests conducted at Airbus in Hamburg, Germany. The Global Analysis Team identified no significant or obvious deficiencies in the Airbus certification and design methods. Analysis results from the LaRC team indicated that the most likely failure scenario was failure initiation at the right rear main attachment fitting (lug), followed by an unstable progression of failure of all fin-to-fuselage attachments and separation of the vertical tail plane (VTP) from the aircraft. Additionally, analysis results indicated that failure initiated at the final observed maximum fin loading condition in the accident, when the VTP was subjected to loads that were at minimum 1.92 times the design limit load condition for certification. For certification, the VTP is only required to support loads of 1.5 times the design limit load without catastrophic failure. The maximum loading during the accident was shown to significantly exceed the certification requirement. Thus, the structure appeared to perform in a manner consistent with its design and certification, and failure is attributed to VTP loads greater than expected.

  7. [A spatially explicit analysis of traffic accidents involving pedestrians and cyclists in Berlin].

    PubMed

    Lakes, Tobia

    2017-12-01

    In many German cities and counties, sustainable mobility concepts that strengthen pedestrian and cyclist traffic are promoted. From the perspectives of urban development, traffic planning and public healthcare, a spatially differentiated analysis of traffic accident data is decisive. The objectives of this study are: 1) the identification of spatial and temporal patterns of the distribution of accidents involving cyclists and pedestrians, 2) the identification of hotspots and the exploration of possible underlying causes, and 3) the critical discussion of benefits and challenges of the results and the derivation of conclusions. Spatio-temporal distributions of data from accident statistics in Berlin involving pedestrians and cyclists from 2011 to 2015 were analysed with geographic information systems (GIS). While the total number of pedestrian and cyclist accidents remained relatively stable over this period, the spatial distribution analysis shows significant spatial clusters (hotspots) of traffic accidents, with a strong concentration in the inner city area. In a critical discussion, the benefits of geographic concepts are identified, such as spatially explicit health data (in this case traffic accident data), the importance of integrating other data sources for the evaluation of the health impact of areas (traffic accident statistics of the police), and the possibilities and limitations of spatio-temporal data analysis (spatial point-density analyses) for deriving decision-support recommendations and for evaluating policy measures of health prevention and of health-relevant urban development.

  8. Structural Analysis and Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft called ST-SIZE in the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with other private-sector finite element modeling and finite element analysis structural analysis programs. ST-SIZE was chiefly conceived as a means to improve and speed the structural design of a future aerospace plane for the Langley Hypersonic Vehicles Office. Incorporating the NASA computer code into HyperSizer has enabled the company to also apply the software to applications other than aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.

  9. Acoustic Emission Analysis Applet (AEAA) Software

    NASA Technical Reports Server (NTRS)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research Center and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure. AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will otherwise have little impact on missions. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  10. GWAMA: software for genome-wide association meta-analysis.

    PubMed

    Mägi, Reedik; Morris, Andrew P

    2010-05-28

    Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way of improving power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size beyond that of any individual study. Although statistical software analysis packages incorporate routines for meta-analysis, they are ill-equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error trapping facilities, and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
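
    The core calculation behind this kind of meta-analysis is a fixed-effects, inverse-variance-weighted combination of per-study effect estimates. The sketch below is a minimal Python illustration of that idea, not GWAMA itself; the function name and the two-study summary statistics are hypothetical.

    ```python
    import math

    def inverse_variance_meta(betas, ses):
        """Fixed-effects meta-analysis of per-study effect sizes (betas)
        and standard errors (ses) using inverse-variance weights."""
        weights = [1.0 / se ** 2 for se in ses]
        beta_meta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
        se_meta = math.sqrt(1.0 / sum(weights))
        z = beta_meta / se_meta
        p = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided p from the normal approximation
        return beta_meta, se_meta, z, p

    # Hypothetical summary statistics for one SNP reported by two studies.
    print(inverse_variance_meta(betas=[0.12, 0.09], ses=[0.04, 0.05]))
    ```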

  11. Domino effect in chemical accidents: main features and accident sequences.

    PubMed

    Darbra, R M; Palacios, Adriana; Casal, Joaquim

    2010-11-15

    The main features of domino accidents in process/storage plants and in the transportation of hazardous materials were studied through an analysis of 225 accidents involving this effect. Data on these accidents, which occurred after 1961, were taken from several sources. Aspects analyzed included the accident scenario, the type of accident, the materials involved, the causes and consequences and the most common accident sequences. The analysis showed that the most frequent causes are external events (31%) and mechanical failure (29%). Storage areas (35%) and process plants (28%) are by far the most common settings for domino accidents. Eighty-nine per cent of the accidents involved flammable materials, the most frequent of which was LPG. The domino effect sequences were analyzed using relative probability event trees. The most frequent sequences were explosion→fire (27.6%), fire→explosion (27.5%) and fire→fire (17.8%). Copyright © 2010 Elsevier B.V. All rights reserved.

  12. Risk analysis of emergent water pollution accidents based on a Bayesian Network.

    PubMed

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie

    2016-01-01

    To guarantee the security of water quality in water transfer channels, especially in open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for forewarning and protection against emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents could occur. A Bayesian Network model, which consists of six root nodes and three middle-layer nodes, was developed in this paper and employed to identify the possibility of potential pollution risk. Dianbei Bridge is reviewed as a typical bridge on an open channel of the Middle Route of the South to North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by leakage of pollutants into the water. The risk of potential traffic accidents at the Dianbei Bridge implies a risk of water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian Network model was established. The human factor in emergent accidents has been considered in this model. Additionally, the model has been employed to describe the probability of accidents and the risk level, and the factors to which pollution accidents are most sensitive have been deduced. A scenario in which these sensitive factors take the states most likely to lead to accidents has also been simulated. Copyright © 2015 Elsevier Ltd. All rights reserved.
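
    As a rough illustration of how such a Bayesian Network propagates evidence from root causes to an accident outcome, the sketch below enumerates a three-node chain (traffic accident on the bridge, pollutant leakage, water pollution). The structure and every probability value are invented for illustration and are not taken from the paper's model.

    ```python
    from itertools import product

    # Hypothetical conditional probability tables (all numbers invented).
    p_accident = {True: 0.02, False: 0.98}            # P(traffic accident on bridge)
    p_leak = {True: {True: 0.30, False: 0.70},        # P(leak | accident)
              False: {True: 0.001, False: 0.999}}     # P(leak | no accident)
    p_pollution = {True: {True: 0.90, False: 0.10},   # P(pollution | leak)
                   False: {True: 0.0, False: 1.0}}    # P(pollution | no leak)

    def joint(accident, leak, pollution):
        """Joint probability of one complete assignment of the three nodes."""
        return p_accident[accident] * p_leak[accident][leak] * p_pollution[leak][pollution]

    # Marginal probability of a water pollution event (sum over hidden states).
    p_poll = sum(joint(a, l, True) for a, l in product([True, False], repeat=2))
    print(f"P(water pollution) = {p_poll:.5f}")

    # Diagnostic query: probability that a traffic accident occurred given pollution.
    p_acc = sum(joint(True, l, True) for l in [True, False]) / p_poll
    print(f"P(accident | pollution) = {p_acc:.3f}")
    ```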

  13. Challenging the immediate causes: A work accident investigation in an oil refinery using organizational analysis.

    PubMed

    Beltran, Sandra Lorena; Vilela, Rodolfo Andrade de Gouveia; de Almeida, Ildeberto Muniz

    2018-01-01

    In many companies, investigations of accidents still blame the victims without exploring deeper causes. Those investigations are reactive and have no learning potential. This paper aims to debate the historical organizational aspects of a company whose policy was incubating an accident. The empirical data are analyzed as part of a qualitative study of an accident that occurred in an oil refinery in Brazil in 2014. To investigate and analyse this case we used one-to-one and group interviews, participant observation, Collective Analyses of Work and a documentary review. The analysis was conducted on the basis of concepts of the Organizational Analysis of the event and the Model for Analysis and Prevention of Work Accidents. The accident had its origin in the interaction of social and organizational factors, among them being: excessively standardized culture, management tools and outcome indicators that give a false sense of safety, the decision to speed up the project, the change of operator to facilitate this outcome and performance management that encourages getting around the usual barriers. The superficial accident analysis conducted by the company that ignored human and organizational factors reinforces the traditional safety culture and favors the occurrence of new accidents.

  14. Software development for teleroentgenogram analysis

    NASA Astrophysics Data System (ADS)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method developed in this department, and also allows users to design their own methods for calculating teleroentgenograms. It is planned to apply machine learning (neural networks) in the software; this will simplify the calculation of teleroentgenograms because the methodological points will be placed automatically.

  15. Aircraft accidents: method of analysis

    NASA Technical Reports Server (NTRS)

    1937-01-01

    This report is a revision of NACA-TR-357. It was prepared by the Committee on Aircraft Accidents. The purpose of this report is to provide a basis for the classification and comparison of aircraft accidents, both civil and military.

  16. Preliminary analysis of the National Crash Severity Study : factors in fatal accidents

    DOT National Transportation Integrated Search

    1979-06-01

    This study investigates the fatalities on the National Crash Severity Study (NCSS) of towaway, passenger car accidents. The analysis is in three stages. First, NCSS fatalities are compared to the fatally-injured occupants reported on the Fatal Accide...

  17. Debugging and Performance Analysis Software Tools for Peregrine System |

    Science.gov Websites

    Learn about debugging and performance analysis software tools available to use with the Peregrine high-performance computing system at NREL, such as the Allinea tools.

  18. Introduction of Bayesian network in risk analysis of maritime accidents in Bangladesh

    NASA Astrophysics Data System (ADS)

    Rahman, Sohanur

    2017-12-01

    Due to its unique geographic location, complex navigation environment and intense vessel traffic, a considerable number of maritime accidents have occurred in Bangladesh, causing serious loss of life and property and environmental contamination. Based on historical data on maritime accidents from 1981 to 2015, collected from the Department of Shipping (DOS) and the Bangladesh Inland Water Transport Authority (BIWTA), this paper conducted a risk analysis of maritime accidents by applying a Bayesian network. A Bayesian network model was developed, based on the accident investigation reports of Bangladesh, to find the relations among the parameters that affect accidents and their probabilities. Furthermore, the number of accidents in different categories has also been investigated. Finally, some viable recommendations are proposed in order to ensure greater safety of inland vessels in Bangladesh.

  19. A human factors analysis of fatal and serious injury accidents in Alaska, 2004-2009.

    DOT National Transportation Integrated Search

    2011-12-01

    "This report summarizes the analysis of 97 general aviation accidents in Alaska that resulted in a fatality or serious : injury to one or more aircraft occupants for the years 2004-2009. The accidents were analyzed using the Human : Factors Analysis ...

  20. Work-related accidents among the Iranian population: a time series analysis, 2000-2011.

    PubMed

    Karimlou, Masoud; Salehi, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Work-related accidents result in human suffering and economic losses and are considered a major health problem worldwide, especially in the economically developing world. The objective was to introduce seasonal autoregressive integrated moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. Box-Jenkins modeling was used to develop a time series model of the total number of accidents. There was an average of 1476 accidents per month (1476.05±458.77, mean±SD). The final seasonal ARIMA(p,d,q)(P,D,Q)s model fitted to the data was ARIMA(1,1,1)×(0,1,1)₁₂ (i.e., with a 12-month seasonal period), consisting of first-order autoregressive, moving average and seasonal moving average parameters, with a mean absolute percentage error (MAPE) of 20.942. The final model showed that time series analysis with ARIMA models is useful for forecasting the number of work-related accidents in Iran. In addition, the forecast number of work-related accidents for 2011 reflected the stability of the occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection.
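
    For readers who want to reproduce this style of seasonal ARIMA fit, the sketch below shows how an ARIMA(1,1,1)×(0,1,1)₁₂ model might be fitted to a monthly accident-count series using the statsmodels SARIMAX class; the synthetic series and the forecast horizon are placeholders, since the ISSO data are not reproduced here.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # Placeholder monthly accident counts standing in for the real ISSO series.
    rng = np.random.default_rng(0)
    idx = pd.date_range("2000-01-01", periods=120, freq="MS")
    y = pd.Series(1476 + 200 * np.sin(2 * np.pi * np.arange(120) / 12) + rng.normal(0, 150, 120),
                  index=idx)

    # ARIMA(1,1,1) x (0,1,1) with a 12-month seasonal period, as reported in the paper.
    result = SARIMAX(y, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
    print(result.summary())

    # 12-month-ahead forecast of the number of work-related accidents.
    print(result.forecast(steps=12))
    ```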

  1. [Model of Analysis and Prevention of Accidents - MAPA: tool for operational health surveillance].

    PubMed

    de Almeida, Ildeberto Muniz; Vilela, Rodolfo Andrade de Gouveia; da Silva, Alessandro José Nunes; Beltran, Sandra Lorena

    2014-12-01

    The analysis of work-related accidents is important for accident surveillance and prevention. Current methods of analysis seek to overcome reductionist views that see these occurrences as simple events explained by operator error. The objective of this paper is to analyze the Model of Analysis and Prevention of Accidents (MAPA) and its use in monitoring interventions, duly highlighting aspects experienced in the use of the tool. The descriptive analytical method was used, introducing the steps of the model. To illustrate contributions and/or difficulties, cases where the tool was used in the context of service were selected. MAPA integrates theoretical approaches that have already been tried in studies of accidents by providing useful conceptual support from the data collection stage until the conclusion and intervention stages. Besides revealing weaknesses of the traditional approach, it helps identify organizational determinants, such as management failings, system design and safety management involved in the accident. The main challenges lie in the grasp of concepts by users, in exploring organizational aspects upstream in the chain of decisions or at higher levels of the hierarchy, as well as in the intervention to change the determinants of these events.

  2. Analysis of occupational accidents with biological material among professionals in pre-hospital services.

    PubMed

    de Oliveira, Adriana Cristina; Paiva, Maria Henriqueta Rocha Siqueira

    2013-02-01

    To estimate the prevalence of accidents due to biological material exposure, their characteristics and post-accident conduct among professionals of pre-hospital services of the four municipalities of Minas Gerais, Brazil. A cross-sectional study, using a structured questionnaire that was developed to enable the calculation of prevalence, descriptive analysis and analytical analysis using logistic regression. The study included 228 professionals; the prevalence of accidents due to biological material exposure was 29.4%, with 49.2% percutaneous, 10.4% mucosal, 6.0% non-intact skin, and 34.4% intact skin. Among the professionals injured, those that stood out were nursing technicians (41.9%) and drivers (28.3%). Notification of the occurrence of the accident occurred in 29.8% of the cases. Percutaneous exposure was associated with time of work in the organization (OR=2.51, 95% CI: 1.18 to 5.35, p<0.017). Notification about accidents with biological material should be encouraged, along with professional evaluation/monitoring.

  3. The Driver Behaviour Questionnaire as accident predictor; A methodological re-meta-analysis.

    PubMed

    Af Wåhlberg, A E; Barraclough, P; Freeman, J

    2015-12-01

    The Manchester Driver Behaviour Questionnaire (DBQ) is the most commonly used self-report tool in traffic safety research and applied settings. It has been claimed that the violation factor of this instrument predicts accident involvement, which was supported by a previous meta-analysis. However, that analysis did not test for methodological effects, or include unpublished results. The present study re-analysed studies on prediction of accident involvement from DBQ factors, including lapses, and many unpublished effects. Tests of various types of dissemination bias and common method variance were undertaken. Outlier analysis showed that some effects were probably not reliable data, but excluding them did not change the results. For correlations between violations and crashes, tendencies for published effects to be larger than unpublished ones and for effects to decrease over time were observed, but were not significant. Also, using the mean of accidents as proxy for effect indicated that studies where effects for violations are not reported have smaller effect sizes. These differences indicate dissemination bias. Studies using self-reported accidents as dependent variables had much larger effects than those using recorded accident data. Also, zero-order correlations were larger than partial correlations controlled for exposure. Similarly, violations/accidents effects were strong only when there was also a strong correlation between accidents and exposure. Overall, the true effect is probably very close to zero (r<.07) for violations versus traffic accident involvement, depending upon which tendencies are controlled for. Methodological factors and dissemination bias have inflated the published effect sizes of the DBQ. Strong evidence of various artefactual effects is apparent. A greater level of care should be taken if the DBQ continues to be used in traffic safety research. Also, validation of self-reports should be more comprehensive in the future, taking into

  4. Questioning the Role of Requirements Engineering in the Causes of Safety-Critical Software Failures

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2006-01-01

    Many software failures stem from inadequate requirements engineering. This view has been supported both by detailed accident investigations and by a number of empirical studies; however, such investigations can be misleading. It is often difficult to distinguish between failures in requirements engineering and problems elsewhere in the software development lifecycle. Further pitfalls arise from the assumption that inadequate requirements engineering is a cause of all software related accidents for which the system fails to meet its requirements. This paper identifies some of the problems that have arisen from an undue focus on the role of requirements engineering in the causes of major accidents. The intention is to provoke further debate within the emerging field of forensic software engineering.

  5. Analysis of unmitigated large break loss of coolant accidents using MELCOR code

    NASA Astrophysics Data System (ADS)

    Pescarini, M.; Mascari, F.; Mostacci, D.; De Rosa, F.; Lombardo, C.; Giannetti, F.

    2017-11-01

    In the framework of severe accident research activities developed by ENEA, a MELCOR nodalization of a generic 900 MWe Pressurized Water Reactor has been developed. The aim of this paper is to present the analysis of MELCOR code calculations concerning two independent unmitigated large break loss of coolant accident transients occurring in this type of reactor. In particular, the analysis and comparison of the transients initiated by an unmitigated double-ended cold leg rupture and an unmitigated double-ended hot leg rupture in loop 1 of the primary cooling system is presented herein. This activity has been performed focusing specifically on the in-vessel phenomenology that characterizes this kind of accident. The analysis of the thermal-hydraulic transient phenomena and the core degradation phenomena is therefore presented here. The analysis of the calculated data shows the capability of the code to reproduce the phenomena typical of these transients and permits their phenomenological study. A first sequence of main events is presented and shows that the cold leg break transient progresses faster than the hot leg break transient because of the position of the break. Further analyses are in progress to quantitatively assess the results of the code nodalization for accident management strategy definition and fission product source term evaluation.

  6. Comprehensive Analysis of Two Downburst-Related Aircraft Accidents

    NASA Technical Reports Server (NTRS)

    Shen, J.; Parks, E. K.; Bach, R. E.

    1996-01-01

    Although downbursts have been identified as the major cause of a number of aircraft takeoff and landing accidents, only the 1985 Dallas/Fort Worth (DFW) and the more recent (July 1994) Charlotte, North Carolina, landing accidents provided sufficient onboard recorded data to perform a comprehensive analysis of the downburst phenomenon. The first step in the present analysis was the determination of the downburst wind components. Once the wind components and their gradients were determined, the degrading effect of the wind environment on the airplane's performance was calculated. This wind-shear-induced aircraft performance degradation, sometimes called the F-factor, was broken down into two components, F1 and F2, representing the effect of the horizontal wind gradient and the vertical wind velocity, respectively. In both the DFW and Charlotte cases, F1 was found to be the dominant causal factor of the accident. Next, the aircraft in the two cases were mathematically modeled using the longitudinal equations of motion and the appropriate aerodynamic parameters. Based on the aircraft model and the determined winds, the aircraft response to the recorded pilot inputs showed good agreement with the onboard recordings. Finally, various landing abort strategies were studied. It was concluded that the most acceptable landing abort strategy from both an analytical and a pilot's standpoint was to hold constant nose-up pitch attitude while operating at maximum engine thrust.
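
    In the wind-shear literature the F-factor is commonly written as F = F1 + F2, with F1 = (dWx/dt)/g (horizontal wind gradient along the flight path) and F2 = -w/V (vertical wind over airspeed). The sketch below evaluates that common formulation for purely illustrative values, which are not taken from the DFW or Charlotte recordings.

    ```python
    G = 9.81  # gravitational acceleration, m/s^2

    def f_factor(dWx_dt, w_vertical, airspeed):
        """One common formulation of the wind-shear hazard index:
        F1 = (dWx/dt)/g  (increasing tailwind -> positive),
        F2 = -w/V        (downdraft -> positive).
        Positive F degrades climb performance."""
        f1 = dWx_dt / G
        f2 = -w_vertical / airspeed
        return f1, f2, f1 + f2

    # Illustrative values only: tailwind increasing at 2 m/s per second,
    # a 5 m/s downdraft, and an approach speed of 75 m/s.
    f1, f2, f = f_factor(dWx_dt=2.0, w_vertical=-5.0, airspeed=75.0)
    print(f"F1 = {f1:.3f}, F2 = {f2:.3f}, F = {f:.3f}")
    ```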

  7. Development of a New VLBI Data Analysis Software

    NASA Technical Reports Server (NTRS)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of a new VLBI analysis software package under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular so as to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and possess production quality for processing standard VLBI sessions. It also needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  8. Analysis of accidents and injuries on motorcycles in Mexico.

    PubMed

    Berrones-Sanz, Luis David

    2017-01-01

    To analyze the types of injuries and the characteristics and geographical distribution of road accidents involving motorcycles in Mexico. A descriptive analysis of secondary information sources was conducted, including the number of accidents (N = 41,881), the total number of injured people (N = 13,916) and medical expenses (N = 9,111) associated with motorcycle accidents during 2014. Motorcycles account for 13.14% of the total number of deaths in road accidents in Mexico, and the Southeast region of Mexico registers the highest proportion of fatal injuries. Of the total number of motorcycles, 1.84% (95% confidence interval [CI]: 1.83-1.86) were involved in a collision; 3.64 (95% CI: 3.39-3.89) people died and 105.5 (95% CI: 104.1-106.8) were injured per 10,000. Out of the total number of injuries, 76.6% were male and 53.74% were women. 55.1% of deaths were caused by intracranial trauma. Only 16.6% wore a helmet at the time of the accident, and those not wearing a helmet had 2.11 times higher odds of head injury (odds ratio [OR]: 2.11; 95% CI: 1.8-2.4). Regarding the severity of the crash, those occurring in suburban areas (OR: 6.58; 95% CI: 5.69-7.60), on unpaved surfaces (OR: 4.13; 95% CI: 3.04-5.61), after low alcohol consumption (OR: 1.89; 95% CI: 1.46-2.44), at night (OR: 2.24; 95% CI: 1.95-2.57) and on weekends (OR: 1.65; 95% CI: 1.44-1.90) had the highest odds of becoming fatal accidents. In spite of the progress made in terms of road safety, motorcycle accidents are still increasing, and helmet use is still proportionally low. More information on these groups and risk factors needs to be made available so that people are better informed. Regulations regarding the use of safety equipment such as helmets also need to be improved in order to reduce injuries and fatal accidents.

  9. A spatial analysis of urban transit accidents assisted by Emergency Mobile Care Services: an analysis of space and time.

    PubMed

    Mendonça, Marcela Franklin Salvador de; Silva, Amanda Priscila de Santana Cabral; Castro, Claudia Cristina Lima de

    2017-01-01

    Urban transit accidents are a global public health problem. The objective of this study was to describe the profile of the victims and the occurrences of urban transit accidents attended to by the emergency mobile care services (Serviço de Atendimento Móvel de Urgência - SAMU) in Recife, and their distribution based on spatial analysis. This was an ecological study based on secondary data from the emergency mobile care services in Recife, covering all occurrences of urban transit accidents attended to from January 1 to June 30, 2015. The spatial analysis was performed using the Moran index. Basic support units performed most of the emergency services (89.2%). Among the victims, there was a predominance of males (76.8%) and of the 20-29 year age group (31.5%). Collisions were responsible for 59.9% of the transit accidents, and motorcycles for 61.6% of the accidents among all means of transportation. Friday was the day with the highest risk of attendance, and events were concentrated between 6:00 am - 8:59 am and 6:00 pm - 8:59 pm. The Moran map identified critical areas from which traffic accident calls originated during the period analyzed. The records of the mobile service, combined with spatial analysis, are an important source of information for health surveillance. The spatial analysis of urban transit accidents identified regions with a positive spatial correlation, providing subsidies for the logistical planning of emergency mobile care services. This study is groundbreaking in that it offers such information about the region.
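
    The Moran index used here is a global measure of spatial autocorrelation; a minimal implementation over area-level accident counts and a binary spatial weights matrix might look like the sketch below (the toy counts and neighbourhood structure are invented, not Recife data).

    ```python
    import numpy as np

    def morans_i(x, w):
        """Global Moran's I for values x (one per spatial unit) and a
        spatial weights matrix w (w[i, j] > 0 if units i and j are neighbours)."""
        x = np.asarray(x, dtype=float)
        w = np.asarray(w, dtype=float)
        n = x.size
        z = x - x.mean()
        s0 = w.sum()
        return (n / s0) * (z @ w @ z) / (z @ z)

    # Toy example: four districts along a line; neighbours share an edge.
    counts = [12, 15, 3, 2]                      # hypothetical accident counts
    weights = np.array([[0, 1, 0, 0],
                        [1, 0, 1, 0],
                        [0, 1, 0, 1],
                        [0, 0, 1, 0]])
    print(f"Moran's I = {morans_i(counts, weights):.3f}")
    ```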

  10. Theoretical and software considerations for nonlinear dynamic analysis

    NASA Technical Reports Server (NTRS)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1983-01-01

    In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general-purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general-purpose structural software system is presented.
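
    A basic building block of substructure modeling is static condensation, in which a substructure's interior degrees of freedom are eliminated in favour of its boundary degrees of freedom. The dense-matrix sketch below illustrates that reduction on an arbitrary symmetric stiffness matrix; it is a generic illustration under stated assumptions, not code from the system described above.

    ```python
    import numpy as np

    def condense(K, f, interior, boundary):
        """Statically condense the interior DOFs of a substructure:
        K_red = K_bb - K_bi K_ii^-1 K_ib,  f_red = f_b - K_bi K_ii^-1 f_i."""
        Kii = K[np.ix_(interior, interior)]
        Kib = K[np.ix_(interior, boundary)]
        Kbi = K[np.ix_(boundary, interior)]
        Kbb = K[np.ix_(boundary, boundary)]
        fi, fb = f[interior], f[boundary]
        K_red = Kbb - Kbi @ np.linalg.solve(Kii, Kib)
        f_red = fb - Kbi @ np.linalg.solve(Kii, fi)
        return K_red, f_red

    # Toy 3-DOF substructure with an arbitrary symmetric stiffness matrix.
    K = np.array([[ 4.0, -1.0,  0.0],
                  [-1.0,  3.0, -1.0],
                  [ 0.0, -1.0,  2.0]])
    f = np.array([0.0, 1.0, 0.0])
    K_red, f_red = condense(K, f, interior=[1], boundary=[0, 2])
    print(K_red, f_red)
    ```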

  11. Wet weather highway accident analysis and skid resistance data management system (volume II : user's manual).

    DOT National Transportation Integrated Search

    1992-06-01

    The objectives and scope of this research are to establish an effective methodology for wet weather accident analysis and to develop a database management system to facilitate information processing and storage for the accident analysis process, skid...

  12. How perceptions of experience-based analysis influence explanations of work accidents.

    PubMed

    Mbaye, Safiétou; Kouabenan, Dongo Rémi

    2013-12-01

    This article looks into how perceptions of experience-based analysis (EBA) influence causal explanations of accidents given by managers and workers in the chemical industry (n=409) and in the nuclear industry (n=222). The approach is based on the model of naive explanations of accidents (Kouabenan, 1999, 2006, 2009), which recommends taking into account explanations of accidents spontaneously given by individuals, including laypersons, not only to better understand why accidents occur but also to design and implement the most appropriate prevention measures. The study reported here describes the impact of perceptions about EBA (perceived effectiveness, personal commitment, and the feeling of being involved in EBA practices) on managers' and workers' explanations of accidents likely to occur at the workplace. The results indicated that both managers and workers made more internal explanations than external ones when they perceived EBA positively. Moreover, the more the participants felt involved in EBA, were committed to it, and judged it effective, the more they explained accidents in terms of factors internal to the workers. Recommendations are proposed for reducing defensive reactions, increasing personal commitment to EBA, and improving EBA effectiveness. © 2013.

  13. Human error and commercial aviation accidents: an analysis using the human factors analysis and classification system.

    PubMed

    Shappell, Scott; Detwiler, Cristy; Holcomb, Kali; Hackworth, Carla; Boquet, Albert; Wiegmann, Douglas A

    2007-04-01

    The aim of this study was to extend previous examinations of aviation accidents to include specific aircrew, environmental, supervisory, and organizational factors associated with two types of commercial aviation (air carrier and commuter/on-demand) accidents using the Human Factors Analysis and Classification System (HFACS). HFACS is a theoretically based tool for investigating and analyzing human error associated with accidents and incidents. Previous research has shown that HFACS can be reliably used to identify human factors trends associated with military and general aviation accidents. Using data obtained from both the National Transportation Safety Board and the Federal Aviation Administration, 6 pilot-raters classified aircrew, supervisory, organizational, and environmental causal factors associated with 1020 commercial aviation accidents that occurred over a 13-year period. The majority of accident causal factors were attributed to aircrew and the environment, with decidedly fewer associated with supervisory and organizational causes. Comparisons were made between HFACS causal categories and traditional situational variables such as visual conditions, injury severity, and regional differences. These data will provide support for the continuation, modification, and/or development of interventions aimed at commercial aviation safety. HFACS provides a tool for assessing human factors associated with accidents and incidents.

  14. ACES: Space shuttle flight software analysis expert system

    NASA Technical Reports Server (NTRS)

    Satterwhite, R. Scott

    1990-01-01

    The Analysis Criteria Evaluation System (ACES) is a knowledge based expert system that automates the final certification of the Space Shuttle onboard flight software. Guidance, navigation and control of the Space Shuttle through all its flight phases are accomplished by a complex onboard flight software system. This software is reconfigured for each flight to allow thousands of mission-specific parameters to be introduced and must therefore be thoroughly certified prior to each flight. This certification is performed in ground simulations by executing the software in the flight computers. Flight trajectories from liftoff to landing, including abort scenarios, are simulated and the results are stored for analysis. The current methodology of performing this analysis is repetitive and requires many man-hours. The ultimate goals of ACES are to capture the knowledge of the current experts and improve the quality and reduce the manpower required to certify the Space Shuttle onboard flight software.

  15. Analysis of multiple tank car releases in train accidents.

    PubMed

    Liu, Xiang; Liu, Chang; Hong, Yili

    2017-10-01

    There are annually over two million carloads of hazardous materials transported by rail in the United States. The American railroads use large blocks of tank cars to transport petroleum crude oil and other flammable liquids from production to consumption sites. Unlike roadway transport of hazardous materials, a train accident can potentially result in the derailment and release of multiple tank cars, which may have significant consequences. The prior literature predominantly assumes that the occurrence of multiple tank car releases in a train accident is a series of independent Bernoulli trials, and thus uses the binomial distribution to estimate the total number of tank car releases given the number of tank cars derailing or damaged. This paper shows that the traditional binomial model can misestimate multiple tank car release probability by orders of magnitude in certain circumstances, thereby significantly affecting railroad safety and risk analysis. To bridge this knowledge gap, this paper proposes a novel, alternative Correlated Binomial (CB) model that accounts for the possible correlations of multiple tank car releases in the same train. We test three distinct correlation structures in the CB model, and find that they all outperform the conventional binomial model based on empirical tank car accident data. The analysis shows that considering tank car release correlations results in a significantly better fit to the empirical data than otherwise. Consequently, it is prudent to consider alternative modeling techniques when analyzing the probability of multiple tank car releases in railroad accidents. Copyright © 2017 Elsevier Ltd. All rights reserved.
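
    The paper's Correlated Binomial model is not reproduced here, but the contrast it draws can be illustrated with a simpler stand-in: a beta-binomial distribution, which is one well-known way to induce positive correlation among per-car release outcomes. The sketch below compares the probability of k releases among n derailed tank cars under the independent binomial model and under a beta-binomial with the same marginal release probability; all parameter values are illustrative, not estimates from the accident data.

    ```python
    from math import comb, lgamma, exp

    def binom_pmf(k, n, p):
        """Independent-release (binomial) probability of k releases out of n cars."""
        return comb(n, k) * p ** k * (1 - p) ** (n - k)

    def betabinom_pmf(k, n, a, b):
        """Beta-binomial pmf: a simple stand-in that allows positively
        correlated per-car release outcomes (not the paper's CB model)."""
        log_beta = lambda x, y: lgamma(x) + lgamma(y) - lgamma(x + y)
        return comb(n, k) * exp(log_beta(k + a, n - k + b) - log_beta(a, b))

    n, p = 10, 0.2        # 10 derailed cars, 20% marginal release probability per car
    a, b = 0.5, 2.0       # same marginal probability a/(a+b) = 0.2, but over-dispersed
    for k in (0, 2, 5, 10):
        print(k, round(binom_pmf(k, n, p), 4), round(betabinom_pmf(k, n, a, b), 4))
    ```

    The heavier right tail of the over-dispersed model is what makes an independence assumption understate the chance of many cars releasing in the same derailment.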

  16. Accidents at work and costs analysis: a field study in a large Italian company.

    PubMed

    Battaglia, Massimo; Frey, Marco; Passetti, Emilio

    2014-01-01

    Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers, but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents even though cost information may facilitate preventive occupational health and safety management initiatives. The field study, carried out in a large Italian company, illustrates technical and organisational aspects associated with the implementation of an accident costs analysis tool. The results indicate that the implementation (and the use) of the tool requires a considerable commitment by the company, that accident costs analysis should serve to reinforce the importance of health and safety prevention, and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology.

  17. PIV/HPIV Film Analysis Software Package

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    A PIV/HPIV film analysis software system was developed that calculates the 2-dimensional spatial autocorrelations of subregions of Particle Image Velocimetry (PIV) or Holographic Particle Image Velocimetry (HPIV) film recordings. The software controls three hardware subsystems including (1) a Kodak Megaplus 1.4 camera and EPIX 4MEG framegrabber subsystem, (2) an IEEE/Unidex 11 precision motion control subsystem, and (3) an Alacron I860 array processor subsystem. The software runs on an IBM PC/AT host computer running either the Microsoft Windows 3.1 or Windows 95 operating system. It is capable of processing five PIV or HPIV displacement vectors per second, and is completely automated with the exception of user input to a configuration file prior to analysis execution for update of various system parameters.
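
    The 2-dimensional spatial autocorrelation at the heart of this kind of PIV interrogation is typically computed with FFTs; the sketch below applies that Wiener-Khinchin approach to a single synthetic interrogation window, since no film data accompany this abstract, and it is a schematic of the calculation rather than the package's actual processing chain.

    ```python
    import numpy as np

    def autocorrelation_2d(window):
        """2-D spatial autocorrelation of one PIV interrogation window,
        computed via the FFT: IFFT(|FFT(window)|^2), zero lag at the centre."""
        w = window - window.mean()              # remove the mean intensity
        spectrum = np.fft.fft2(w)
        acf = np.fft.ifft2(np.abs(spectrum) ** 2).real
        return np.fft.fftshift(acf)

    # Synthetic 32x32 interrogation window with randomly placed "particles".
    rng = np.random.default_rng(1)
    window = np.zeros((32, 32))
    window[rng.integers(0, 32, 20), rng.integers(0, 32, 20)] = 1.0

    acf = autocorrelation_2d(window)
    print("zero-lag peak at", np.unravel_index(np.argmax(acf), acf.shape))  # expect (16, 16)
    ```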

  18. Worker safety and injury severity analysis of earthmoving equipment accidents.

    PubMed

    Kazan, Emrah; Usmen, Mumtaz A

    2018-06-01

    Research on construction worker safety associated with construction equipment has mostly focused on accident type rather than injury severity and the embedded factor relationships. Significant variables and their effects on the degree of injury are examined for earthmoving equipment using data from OSHA. Four types of equipment, backhoe, bulldozer, excavator, and scraper are included in the study. Accidents involving on-foot workers and equipment operators are investigated collectively, as well as separately. Cross tabulation analysis was conducted to establish the associations between selected categorical variables, using degree of injury as a dichotomous dependent variable (fatal vs. nonfatal) and a number of independent variables having different values. Odds ratios were calculated to determine how much a certain variable/factor increases the odds of fatality in an accident, and the odds ratios were ranked to determine the relative impact of a given factor. It was found that twelve variables were significantly associated with injury severity. Rankings based on odds ratios showed that inadequate safety training (2.54), missing equipment protective system (2.38), being a non-union worker (2.26), being an equipment operator (1.93), and being on or around inadequately maintained equipment (1.58) produced higher odds for fatality. A majority of the earthmoving equipment accidents resulted in fatality. Backhoes were the most common equipment involved in accidents and fatalities. Struck-by accidents were the most prevalent and most fatal. Non-OSHA compliant safety training, missing seatbelt, operator not using seatbelt, malfunctioning back-up alarms, and poorly maintained equipment were factors contributing to accidents and fatalities. On-foot workers experienced a higher number of accidents than operators, while fatality odds were higher for the operators. Practical applications: Safety professionals should benefit from our findings in planning and delivering training
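
    The odds ratios reported above come from 2x2 cross tabulations of a candidate factor against fatal versus nonfatal outcomes. The short sketch below shows the usual calculation of an odds ratio and its Woolf 95% confidence interval from such a table; the cell counts are made up for illustration and are not the OSHA figures behind the study.

    ```python
    import math

    def odds_ratio(a, b, c, d):
        """Odds ratio and Woolf 95% CI for a 2x2 table:
           factor present: a fatal, b nonfatal
           factor absent:  c fatal, d nonfatal"""
        or_ = (a * d) / (b * c)
        se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(or_) - 1.96 * se_log)
        hi = math.exp(math.log(or_) + 1.96 * se_log)
        return or_, (lo, hi)

    # Hypothetical counts: accidents involving workers without OSHA-compliant
    # safety training versus workers with it, split by fatal outcome.
    print(odds_ratio(a=40, b=60, c=25, d=95))
    ```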

  19. Four applications of a software data collection and analysis methodology

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.

    1985-01-01

    The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.

  20. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  1. NASA Accident Precursor Analysis Handbook, Version 1.0

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Everett, Chris; Hall, Anthony; Insley, Scott

    2011-01-01

    Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events portending the potential for severe consequences from an underappreciated causal mechanism. Anomalies whose failure mechanisms were integral to the losses of Space Transportation Systems (STS) Challenger and Columbia had been occurring within the STS fleet prior to those accidents. Both the Rogers Commission Report and the Columbia Accident Investigation Board report found that processes in place at the time did not respond to the prior anomalies in a way that shed light on their true risk implications. This includes the concern that, in the words of the NASA Aerospace Safety Advisory Panel (ASAP), "no process addresses the need to update a hazard analysis when anomalies occur." At a broader level, the ASAP noted in 2007 that NASA "could better gauge the likelihood of losses by developing leading indicators, rather than continue to depend on lagging indicators." These observations suggest a need to revalidate prior assumptions and conclusions of existing safety (and reliability) analyses, as well as to consider the potential for previously unrecognized accident scenarios, when unexpected or otherwise undesired behaviors of the system are observed. This need is also discussed in NASA's system safety handbook, which advocates a view of safety assurance as driving a program to take the steps necessary to establish and maintain a valid and credible argument for the safety of its missions. It is the premise of this handbook that making cases for safety more experience-based allows NASA to be better informed about the safety performance of its systems, and will ultimately help it to manage safety in a more effective manner. The APA process described in this handbook provides a systematic means of analyzing candidate

  2. Work-related accidents among the Iranian population: a time series analysis, 2000–2011

    PubMed Central

    Karimlou, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Background Work-related accidents result in human suffering and economic losses and are considered as a major health problem worldwide, especially in the economically developing world. Objectives To introduce seasonal autoregressive integrated moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. Methods In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box-Jenkins modeling to develop a time series model of the total number of accidents. Results There was an average of 1476 accidents per month (1476.05±458.77, mean±SD). The final seasonal ARIMA(p,d,q)(P,D,Q)s model fitted to the data was ARIMA(1,1,1)×(0,1,1)₁₂, consisting of first-order autoregressive, moving average and seasonal moving average parameters, with a mean absolute percentage error (MAPE) of 20.942. Conclusions The final model showed that time series analysis with ARIMA models was useful for forecasting the number of work-related accidents in Iran. In addition, the forecast number of work-related accidents for 2011 reflected the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection. PMID:26119774

  3. Offsite radiological consequence analysis for the bounding flammable gas accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CARRO, C.A.

    2003-03-19

    The purpose of this analysis is to calculate the offsite radiological consequence of the bounding flammable gas accident. DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine if they challenge the DOE-STD-3009-94, Appendix A, ''Evaluation Guideline,'' of 25 rem total effective dose equivalent in order to identify and evaluate safety class structures, systems, and components. The bounding flammable gas accident is a detonation in a single-shell tank (SST). A detonation versus a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. As will be shown, the consequences of a detonation in either an SST or a double-shell tank (DST) are approximately equal. A detonation in an SST was selected as the bounding condition because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are generally greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.

  4. [Proposal of a method for collective analysis of work-related accidents in the hospital setting].

    PubMed

    Osório, Claudia; Machado, Jorge Mesquita Huet; Minayo-Gomez, Carlos

    2005-01-01

    The article presents a method for the analysis of work-related accidents in hospitals, with the double aim of analyzing accidents in light of actual work activity and enhancing the vitality of the various professions that comprise hospital work. This process involves both research and intervention, combining knowledge output with training of health professionals, fostering expanded participation by workers in managing their daily work. The method consists of stimulating workers to recreate the situation in which a given accident occurred, shifting themselves to the position of observers of their own work. In the first stage of analysis, workers are asked to show the work analyst how the accident occurred; in the second stage, the work accident victim and analyst jointly record the described series of events in a diagram; in the third, the resulting record is re-discussed and further elaborated; in the fourth, the work accident victim and analyst evaluate and implement measures aimed to prevent the accident from recurring. The article concludes by discussing the method's possibilities and limitations in the hospital setting.

  5. Traits and causes of environmental loss-related chemical accidents in China based on co-word analysis.

    PubMed

    Wu, Desheng; Song, Yu; Xie, Kefan; Zhang, Baofeng

    2018-04-25

    Chemical accidents are major causes of environmental losses and have been widely debated due to the potential threat they pose to human beings and the environment. Compared with single statistical analysis, co-word analysis of chemical accidents reveals significant traits at various levels and presents the data as a visual network. This study applies a co-word analysis to the keywords extracted from Web-crawled texts of environmental loss-related chemical accidents and uses Pearson's correlation coefficient to examine the internal attributes. To visualize the keywords of the accidents, the study carries out a multidimensional scaling analysis applying PROXSCAL and centrality identification. The research results show that an enormous environmental cost is exacted, and that environmental loss-related chemical accidents show marked geographical features. Meanwhile, each event often brings more than one environmental impact. A large number of chemical substances are released in solid, liquid, and gaseous form, leading to serious consequences. Eight clusters that represent the traits of these accidents are formed, including "leakage," "poisoning," "explosion," "pipeline crack," "river pollution," "dust pollution," "emission," and "industrial effluent." "Explosion" and "gas" possess a strong correlation with "poisoning," which is located at the center of the visualization map.
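
    Operationally, co-word analysis amounts to building a keyword co-occurrence matrix over the accident texts and then examining correlations between keyword occurrence profiles. The sketch below shows those two steps on a few invented keyword lists; it is a schematic of the general technique, not the study's PROXSCAL/centrality pipeline.

    ```python
    import numpy as np

    # Hypothetical keyword sets extracted from four accident reports.
    reports = [
        {"leakage", "poisoning", "gas"},
        {"explosion", "fire", "gas"},
        {"leakage", "river pollution"},
        {"explosion", "poisoning", "gas"},
    ]
    keywords = sorted(set().union(*reports))

    # Binary occurrence matrix: rows = reports, columns = keywords.
    occ = np.array([[1.0 if k in r else 0.0 for k in keywords] for r in reports])

    cooc = occ.T @ occ                 # keyword-by-keyword co-occurrence counts
    corr = np.corrcoef(occ.T)          # Pearson correlations between keyword profiles

    i, j = keywords.index("explosion"), keywords.index("gas")
    print("co-occurrences(explosion, gas) =", int(cooc[i, j]))
    print("pearson(explosion, gas) = %.2f" % corr[i, j])
    ```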

  6. Accidents at Work and Costs Analysis: A Field Study in a Large Italian Company

    PubMed Central

    BATTAGLIA, Massimo; FREY, Marco; PASSETTI, Emilio

    2014-01-01

    Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers, but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents even though cost information may facilitate preventive occupational health and safety management initiatives. The field study, carried out in a large Italian company, illustrates technical and organisational aspects associated with the implementation of an accident costs analysis tool. The results indicate that the implementation (and the use) of the tool requires a considerable commitment by the company, that accident costs analysis should serve to reinforce the importance of health and safety prevention, and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology. PMID:24869894

  7. Accident analysis of heavy water cooled thorium breeder reactor

    NASA Astrophysics Data System (ADS)

    Yulianti, Yanti; Su'ud, Zaki; Takaki, Naoyuki

    2015-04-01

    power reactor has a peak value before the reactor reaches a new balance condition. The analysis showed that the fuel and cladding temperatures during the accident remain below their limits, which corresponds to a safe condition.

  8. Analysis of typical WWER-1000 severe accident scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sorokin, Yu.S.; Shchekoldin, V.V.; Borisov, L.N.

    2004-07-01

    At present in EDO 'Gidropress' there is a certain experience of performing the analyses of severe accidents of reactor plant with WWER with application of domestic and foreign codes. Important data were also obtained by the results of calculation modeling of integrated experiments with fuel assembly melting comprising a real fuel. Systematization and consideration of these data in development and assimilation of codes are extremely important in connection with large uncertainty still existing in understanding and adequate description of phenomenology of severe accidents. The presented report gives a comparison of analysis results of severe accidents of reactor plant with WWER-1000 for two typical scenarios made by using American MELCOR code and the Russian RATEG/SVECHA/HEFEST code. The results of calculation modeling are compared using above codes with the data of experiment FPT1 with fuel assembly melting comprising a real fuel, which has been carried out at the facility Phebus (France). The obtained results are considered in the report from the viewpoint of: - adequacy of results of calculation modeling of separate phenomena during severe accidents of RP with WWER by using the above codes; - influence of uncertainties (degree of details of calculation models, choice of parameters of models etc.); - choice of those or other setup variables (options) in the used codes; - necessity of detailed modeling of processes and phenomena as applied to design justification of safety of RP with WWER. (authors)

  9. Introducing a New Software for Geodetic Analysis

    NASA Astrophysics Data System (ADS)

    Hjelle, G. A.; Dähnn, M.; Fausk, I.; Kirkvik, A. S.; Mysen, E.

    2016-12-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable. The Python community provides a rich eco-system of tools for doing data-analysis, including effective data storage and powerful visualization. Python interfaces well with other languages so that we can easily reuse existing, well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages. In addition we will report on some simple investigations we have done using the software, and outline our plans for further progress.

  10. An online database for plant image analysis software tools.

    PubMed

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-10-09

    Recent years have seen an increase in methods for plant phenotyping using image analysis. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software best suited to their research. We present an online, manually curated database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software solution in a uniform and concise manner, enabling users to identify the tools available for their experimental needs. The website also enables user feedback, evaluations and new software submissions. The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users find solutions, and to provide developers with a way to exchange and communicate about their work.

  11. United States Department of Energy severe accident research following the Fukushima Daiichi accidents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farmer, M. T.; Corradini, M.; Rempe, J.

    The U.S. Department of Energy (DOE) has played a major role in the U.S. response to the events at Fukushima Daiichi. During the first several weeks following the accident, U.S. assistance efforts were guided by results from a significant and diverse set of analyses. In the months that followed, a coordinated analysis activity aimed at gaining a more thorough understanding of the accident sequence was completed using laboratory-developed, system-level best-estimate accident analysis codes, while a parallel analysis was conducted by U.S. industry. A comparison of predictions for Unit 1 from these two studies indicated significant differences between MAAP and MELCOR results for key plant parameters, such as in-core hydrogen production. On that basis, a crosswalk was completed to determine the key modeling variations that led to these differences. In parallel with these activities, it became clear that there was a need to perform a technology gap evaluation on accident-tolerant components and severe accident analysis methodologies with the goal of identifying any data and/or knowledge gaps that may exist given the current state of light water reactor (LWR) severe accident research and augmented by insights from Fukushima. In addition, there is growing international recognition that data from Fukushima could significantly reduce uncertainties related to severe accident progression, particularly for boiling water reactors. On these bases, a group of U.S. experts in LWR safety and plant operations was convened by the DOE Office of Nuclear Energy (DOE-NE) to complete technology gap analysis and Fukushima forensics data needs identification activities. The results from these activities were used as the basis for refining DOE-NE's severe accident research and development (R&D) plan. Finally, this paper provides a high-level review of DOE-sponsored R&D efforts in these areas, including planned activities on accident-tolerant components and accident analysis methods.

  12. United States Department of Energy severe accident research following the Fukushima Daiichi accidents

    DOE PAGES

    Farmer, M. T.; Corradini, M.; Rempe, J.; ...

    2016-11-02

    The U.S. Department of Energy (DOE) has played a major role in the U.S. response to the events at Fukushima Daiichi. During the first several weeks following the accident, U.S. assistance efforts were guided by results from a significant and diverse set of analyses. In the months that followed, a coordinated analysis activity aimed at gaining a more thorough understanding of the accident sequence was completed using laboratory-developed, system-level best-estimate accident analysis codes, while a parallel analysis was conducted by U.S. industry. A comparison of predictions for Unit 1 from these two studies indicated significant differences between MAAP and MELCOR results for key plant parameters, such as in-core hydrogen production. On that basis, a crosswalk was completed to determine the key modeling variations that led to these differences. In parallel with these activities, it became clear that there was a need to perform a technology gap evaluation on accident-tolerant components and severe accident analysis methodologies with the goal of identifying any data and/or knowledge gaps that may exist given the current state of light water reactor (LWR) severe accident research and augmented by insights from Fukushima. In addition, there is growing international recognition that data from Fukushima could significantly reduce uncertainties related to severe accident progression, particularly for boiling water reactors. On these bases, a group of U.S. experts in LWR safety and plant operations was convened by the DOE Office of Nuclear Energy (DOE-NE) to complete technology gap analysis and Fukushima forensics data needs identification activities. The results from these activities were used as the basis for refining DOE-NE's severe accident research and development (R&D) plan. Finally, this paper provides a high-level review of DOE-sponsored R&D efforts in these areas, including planned activities on accident-tolerant components and accident analysis methods.

  13. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements on a software system for quantitative analysis of radiotherapy. Furthermore, we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via a dose iterator pattern; analysis database design). As a proof of concept we developed a software library, "RTToolbox", following the presented design principles. The RTToolbox is available as an open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
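
    The "dose iterator pattern" mentioned above decouples analysis algorithms from the way dose values are stored. The abstract gives no implementation details, so the following Python sketch only illustrates the general idea; the class and function names are invented for this example and are not part of RTToolbox.

        from abc import ABC, abstractmethod
        from typing import Iterator

        class DoseIterator(ABC):
            """Abstract access to dose values, hiding how the dose grid is stored."""
            @abstractmethod
            def __iter__(self) -> Iterator[float]:
                ...

        class GridDoseIterator(DoseIterator):
            """Iterates over a regular dose grid stored as a flat list of voxel doses."""
            def __init__(self, voxel_doses_gy):
                self._doses = voxel_doses_gy
            def __iter__(self):
                return iter(self._doses)

        def mean_dose(doses: DoseIterator) -> float:
            """Analysis code sees only the iterator, never the storage format."""
            total, n = 0.0, 0
            for d in doses:
                total += d
                n += 1
            return total / n if n else 0.0

        print(mean_dose(GridDoseIterator([1.8, 2.0, 2.1, 1.9])))

    The same mean_dose routine would work unchanged for an iterator backed by a DICOM dose file or a database, which is the decoupling the design principle aims at.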

  14. Non-standard analysis and embedded software

    NASA Technical Reports Server (NTRS)

    Platek, Richard

    1995-01-01

    One model for computing in the future is ubiquitous, embedded computational devices analogous to embedded electrical motors. Many of these computers will control physical objects and processes. Such hidden computerized environments introduce new safety and correctness concerns whose treatment goes beyond present Formal Methods. In particular, one has to begin to speak about Real Space software in analogy with Real Time software. By this we mean computerized systems which have to meet requirements expressed in the real geometry of space. How to translate such requirements into ordinary software specifications and how to carry out proofs is a major challenge. In this talk we propose a research program based on the use of non-standard analysis. Much detail remains to be worked out. The purpose of the talk is to inform the Formal Methods community that Non-Standard Analysis provides a possible avenue of attack which we believe will be fruitful.

  15. [Human risk factors and injuries due to road accidents: analysis of current data].

    PubMed

    Marchetti, Pierpaolo; Morandi, Anna; Lombardo, Carlo; Gigli Berzolari, Francesca; Bruno, Vincenzo; Marinoni, Alessandra

    2009-01-01

    Several studies have shown that most road accidents are due to human factors, and that these are strongly linked to the driver's age and sex. The aim of this study is to test the role that some human factors play in road accidents by analysing current road accident data for the Province of Pavia, in Northern Italy. Road accidents that occurred in 2004 were analysed by integrating the paper database of the vehicle licensing office, suitably computerised, with the 911 database of the Province of Pavia. The study analysed 1,347 road accidents and the 2,908 drivers of motorised vehicles involved. Setting: the Province of Pavia, Northern Italy. The death rate of drivers of two-wheeled vehicles is almost nine times higher than that of drivers of four-wheeled vehicles. The analysis shows that females are twice as exposed to road accidents as males; it also shows the benefits of extensive road-education training and of being aged 30-64 or older. Drivers who have already been sanctioned and have had points deducted from their driving licence appear more likely to respond rapidly in a dangerous situation and to be without blame after an accident. Motorcycle riders are 25 times more likely to suffer serious injury than car drivers, and the risk of a woman being seriously injured is higher than that of a man. Overall, females, young drivers, and motorcycle riders who have not previously been penalised for a traffic violation have a higher risk of being seriously injured. We hope that this analysis will be used to improve preventive interventions for road accidents.

  16. Human factors analysis and classification system applied to civil aircraft accidents in India.

    PubMed

    Gaur, Deepak

    2005-05-01

    The Human Factors Analysis and Classification System (HFACS) has gained wide acceptance as a tool to classify human factors in aircraft accidents and incidents. This study on the application of HFACS to civil aircraft accident reports at the Directorate General of Civil Aviation (DGCA), India, was conducted to ascertain the practicability of applying HFACS to existing investigation reports and to analyze the trends of human factor causes of civil aircraft accidents. Accident investigation reports held at DGCA, New Delhi, for the period 1990-99 were scrutinized. In all, 83 accidents occurred during this period, of which 48 accident reports were evaluated in this study. One or more human factors contributed to 37 of the 48 (77.1%) accidents. The commonest unsafe act was 'skill-based errors' followed by 'decision errors.' Violations of laid-down rules were contributory in 16 cases (33.3%). 'Preconditions for unsafe acts' were seen in 23 of the 48 cases (47.9%). A fairly large number (52.1%) had 'organizational influences' contributing to the accident. These results are in consonance with larger studies of accidents in the U.S. Navy and general aviation. Such a high percentage of 'organizational influences' has not been reported in other studies. This is a healthy sign for Indian civil aviation, provided effective remedial action for the same is undertaken.

  17. Off-the-shelf Control of Data Analysis Software

    NASA Astrophysics Data System (ADS)

    Wampler, S.

    The Gemini Project must provide convenient access to data analysis facilities to a wide user community. The international nature of this community makes the selection of data analysis software particularly interesting, with staunch advocates of systems such as ADAM and IRAF among the users. Additionally, the continuing trends towards increased use of networked systems and distributed processing impose additional complexity. To meet these needs, the Gemini Project is proposing the novel approach of using low-cost, off-the-shelf software to abstract out both the control and distribution of data analysis from the functionality of the data analysis software. For example, the orthogonal nature of control versus function means that users might select analysis routines from both ADAM and IRAF as appropriate, distributing these routines across a network of machines. It is the belief of the Gemini Project that this approach results in a system that is highly flexible, maintainable, and inexpensive to develop. The Khoros visualization system is presented as an example of control software that is currently available for providing the control and distribution within a data analysis system. The visual programming environment provided with Khoros is also discussed as a means to providing convenient access to this control.

  18. Learning lessons from Natech accidents - the eNATECH accident database

    NASA Astrophysics Data System (ADS)

    Krausmann, Elisabeth; Girgin, Serkan

    2016-04-01

    When natural hazards impact industrial facilities that house or process hazardous materials, fires, explosions and toxic releases can occur. This type of accident is commonly referred to as Natech accident. In order to prevent the recurrence of accidents or to better mitigate their consequences, lessons-learned type studies using available accident data are usually carried out. Through post-accident analysis, conclusions can be drawn on the most common damage and failure modes and hazmat release paths, particularly vulnerable storage and process equipment, and the hazardous materials most commonly involved in these types of accidents. These analyses also lend themselves to identifying technical and organisational risk-reduction measures that require improvement or are missing. Industrial accident databases are commonly used for retrieving sets of Natech accident case histories for further analysis. These databases contain accident data from the open literature, government authorities or in-company sources. The quality of reported information is not uniform and exhibits different levels of detail and accuracy. This is due to the difficulty of finding qualified information sources, especially in situations where accident reporting by the industry or by authorities is not compulsory, e.g. when spill quantities are below the reporting threshold. Data collection has then to rely on voluntary record keeping often by non-experts. The level of detail is particularly non-uniform for Natech accident data depending on whether the consequences of the Natech event were major or minor, and whether comprehensive information was available for reporting. In addition to the reporting bias towards high-consequence events, industrial accident databases frequently lack information on the severity of the triggering natural hazard, as well as on failure modes that led to the hazmat release. This makes it difficult to reconstruct the dynamics of the accident and renders the development of

  19. Landscape analysis software tools

    Treesearch

    Don Vandendriesche

    2008-01-01

    Recently, several new computer programs have been developed to assist in landscape analysis. The “Sequential Processing Routine for Arraying Yields” (SPRAY) program was designed to run a group of stands with particular treatment activities to produce vegetation yield profiles for forest planning. SPRAY uses existing Forest Vegetation Simulator (FVS) software coupled...

  20. Accident analysis of heavy water cooled thorium breeder reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yulianti, Yanti; Su’ud, Zaki; Takaki, Naoyuki

    2015-04-16

    The reactor power reaches a peak before the reactor settles into a new equilibrium condition. The analysis showed that fuel and cladding temperatures during the accident remain below their limits, keeping the reactor in a safe condition.

  1. [Analysis of work accidents during the years 1999-2006 in a hospital company in Lombardia].

    PubMed

    Melloni, P; Antoniazzi, E; Somenzi, V; Galli, L; Fazioli, R; Mottinelli, A; Franzosi, C; Cirla, A M; Gobba, E

    2007-01-01

    This study describes the accidents that occurred between 1999 and 2006 in the Hospital of Cremona, in which about 2,400 people work. Analysis of the Accident Register showed a reduction of about 30% in the total number of accidents during the examined period and a non-homogeneous distribution of the various types of accident. The most frequent accidents were puncture (needlestick) injuries (25.8%), traumas (22.9%) and commuting ("in itinere") accidents (7.8%). One type of accident has received little attention until now: assaults. Professional nurses were the workers most frequently involved, and the most affected units were those belonging to the Internal Medicine Department. Commuting accidents had the longest average prognosis (11.6 days). The recurrence of accidents involving the same worker had not been analysed before: one professional nurse had nine accidents (of various types) in the seven years considered. The reduction in accidents can probably be attributed to the effectiveness of the prevention activities undertaken during the reviewed period. Biological accidents, for which prevention programmes could be implemented, were markedly reduced; the same was not true for commuting accidents, which depend significantly on external factors that are not easily addressed.

  2. Risk of Occupational Accidents in Workers with Obstructive Sleep Apnea: Systematic Review and Meta-analysis

    PubMed Central

    Garbarino, Sergio; Guglielmi, Ottavia; Sanna, Antonio; Mancardi, Gian Luigi; Magnavita, Nicola

    2016-01-01

    Study Objectives: Obstructive sleep apnea (OSA) is the single most important preventable medical cause of excessive daytime sleepiness (EDS) and driving accidents. OSA may also adversely affect work performance through a decrease in productivity and an increase in the injury rate. Nevertheless, no systematic review and meta-analysis of the relationship between OSA and work accidents has been performed thus far. Methods: PubMed, PsycInfo, Scopus, Web of Science, and Cochrane Library were searched. Out of an initial list of 1,099 papers, 10 studies (12,553 participants) were eligible for our review, and 7 of them were included in the meta-analysis. The overall effects were measured by odds ratios (OR) and 95% confidence intervals (CI). An assessment was made of the methodological quality of the studies. Moderator analysis and funnel plot analysis were used to explore the sources of between-study heterogeneity. Results: Compared to controls, the odds of a work accident were found to be nearly double in workers with OSA (OR = 2.18; 95% CI = 1.53–3.10). Occupational driving was associated with a higher effect size. Conclusions: OSA is an underdiagnosed nonoccupational disease that has a strong adverse effect on work accidents. The nearly twofold increased odds of work accidents in subjects with OSA calls for workplace screening in selected safety-sensitive occupations. Commentary: A commentary on this article appears in this issue on page 1171. Citation: Garbarino S, Guglielmi O, Sanna A, Mancardi GL, Magnavita N. Risk of occupational accidents in workers with obstructive sleep apnea: systematic review and meta-analysis. SLEEP 2016;39(6):1211–1218. PMID:26951401
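
    For readers unfamiliar with how a pooled odds ratio and its 95% confidence interval are obtained in such a meta-analysis, the sketch below shows fixed-effect, inverse-variance pooling on the log-odds scale. The per-study counts are hypothetical placeholders, not data from the paper, and the actual analysis may well have used a random-effects model instead.

        import math

        # Hypothetical 2x2 counts per study: (accident & OSA, no accident & OSA,
        # accident & control, no accident & control). NOT the data from the paper.
        studies = [(30, 70, 20, 120), (18, 42, 25, 150), (12, 48, 10, 90)]

        log_ors, weights = [], []
        for a, b, c, d in studies:
            log_or = math.log((a * d) / (b * c))
            var = 1 / a + 1 / b + 1 / c + 1 / d      # variance of the log odds ratio
            log_ors.append(log_or)
            weights.append(1 / var)                  # inverse-variance weight

        pooled = sum(w * x for w, x in zip(weights, log_ors)) / sum(weights)
        se = math.sqrt(1 / sum(weights))
        lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
        print(f"pooled OR = {math.exp(pooled):.2f} "
              f"(95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")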

  3. Analysis of Crew Fatigue in AIA Guantanamo Bay Aviation Accident

    NASA Technical Reports Server (NTRS)

    Rosekind, Mark R.; Gregory, Kevin B.; Miller, Donna L.; Co, Elizabeth L.; Lebacqz, J. Victor; Statler, Irving C. (Technical Monitor)

    1994-01-01

    Flight operations can engender fatigue, which can affect flight crew performance, vigilance, and mood. The National Transportation Safety Board (NTSB) requested the NASA Fatigue Countermeasures Program to analyze crew fatigue factors in an aviation accident that occurred at Guantanamo Bay, Cuba. There are specific fatigue factors that can be considered in such investigations: cumulative sleep loss, continuous hours of wakefulness prior to the incident or accident, and the time of day at which the accident occurred. Data from the NTSB Human Performance Investigator's Factual Report, the Operations Group Chairman's Factual Report, and the Flight 808 Crew Statements were analyzed, using conservative estimates and averages to reconcile discrepancies among the sources. Analysis of these data determined that the entire crew displayed cumulative sleep loss, operated during an extended period of continuous wakefulness, and obtained sleep at times in opposition to the circadian disposition for sleep, and that the accident occurred in the afternoon window of physiological sleepiness. In addition to these findings, evidence that fatigue affected performance was suggested by the cockpit voice recorder (CVR) transcript as well as by the captain's testimony. Examples from the CVR showed degraded decision-making skills, fixation, and slowed responses, all of which can be affected by fatigue; also, the captain testified to feeling "lethargic and indifferent" just prior to the accident. Therefore, the sleep/wake history data support the hypothesis that fatigue was a factor that affected crewmembers' performance. Furthermore, the examples from the CVR and the captain's testimony support the hypothesis that the fatigue had an impact on specific actions involved in the occurrence of the accident.

  4. Analysis of Software Systems for Specialized Computers,

    DTIC Science & Technology

    computer) with given computer hardware and software. The object of study is the software system of a computer, designed for solving a fixed complex of ... purpose of the analysis is to find parameters that characterize the system and its elements during operation, i.e., when servicing the given requirement flow. (Author)

  5. Systems thinking, the Swiss Cheese Model and accident analysis: a comparative systemic analysis of the Grayrigg train derailment using the ATSB, AcciMap and STAMP models.

    PubMed

    Underwood, Peter; Waterson, Patrick

    2014-07-01

    The Swiss Cheese Model (SCM) is the most popular accident causation model and is widely used throughout various industries. A debate exists in the research literature over whether the SCM remains a viable tool for accident analysis. Critics of the model suggest that it provides a sequential, oversimplified view of accidents. Conversely, proponents suggest that it embodies the concepts of systems theory, as per the contemporary systemic analysis techniques. The aim of this paper was to consider whether the SCM can provide a systems thinking approach and remain a viable option for accident analysis. To achieve this, the train derailment at Grayrigg was analysed with an SCM-based model (the ATSB accident investigation model) and two systemic accident analysis methods (AcciMap and STAMP). The analysis outputs and usage of the techniques were compared. The findings of the study showed that each model applied the systems thinking approach. However, the ATSB model and AcciMap graphically presented their findings in a more succinct manner, whereas STAMP more clearly embodied the concepts of systems theory. The study suggests that, whilst the selection of an analysis method is subject to trade-offs that practitioners and researchers must make, the SCM remains a viable model for accident analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Su'ud, Zaki; Anshari, Rio

    2012-06-01

    A loss-of-coolant accident (LOCA) in a boiling water reactor (BWR), in particular during the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown was completed, cooling, at a much smaller level than in normal operation, is still needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA occurs in this condition, fuel and other core temperatures rise, which can lead to core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi case. In this study, numerical simulation was performed to calculate the pressure, composition, water level and temperature distribution in the reactor during this accident. Two coolant-regulating systems were operational on reactor unit 1 during the accident: the Isolation Condenser (IC) system and the Safety Relief Valves (SRV). The average mass flow of steam to the IC system in this event was 10 kg/s, which could keep the reactor core covered for about 3.2 hours, with the core fully uncovered 4.7 hours later. Two coolant-regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the Safety Relief Valves (SRV). The average coolant mass flow corresponding to this event was 20 kg/s, which could keep the core covered for about 73 hours, with the core fully uncovered 75 hours later. Three coolant-regulating systems were operational on reactor unit 3: the Reactor Core Isolation Cooling (RCIC) system, the High Pressure Coolant Injection (HPCI) system and the Safety Relief Valves (SRV). The average water mass flow corresponding to this event was 15 kg/s, which could keep the core covered for about 37 hours, with the core fully uncovered 40 hours later.
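
    As a rough illustration of the kind of mass balance behind the core-uncovery times quoted above, the sketch below divides an assumed water inventory above the core by the quoted boil-off (steam) mass flow. The inventory figure is a made-up placeholder, and the calculation ignores pressure changes, coolant injection and the other effects the actual simulation accounts for.

        # Back-of-envelope core-uncovery time: inventory divided by boil-off rate.
        # The water inventory below is a hypothetical placeholder, not a plant datum.
        water_above_core_kg = 115_000        # assumed water mass above the top of fuel
        boiloff_rate_kg_s = 10.0             # steam flow quoted for unit 1 (IC path)

        uncovery_time_h = water_above_core_kg / boiloff_rate_kg_s / 3600.0
        print(f"core would start to uncover after roughly {uncovery_time_h:.1f} h")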

  7. Construction industry accidents in Spain.

    PubMed

    Camino López, Miguel A; Ritzel, Dale O; Fontaneda, Ignacio; González Alcantara, Oscar J

    2008-01-01

    This paper analyzes industrial accidents that took place on construction sites and their severity. Eighteen variables were studied, and the influence of each on the severity and fatality of the accident was analyzed. This descriptive analysis was grounded in 1,630,452 accidents, the total number of accidents suffered by workers in the construction sector in Spain over the period 1990-2000. Age, type of contract, time of accident, length of service in the company, company size, day of the week, and the remaining variables under analysis were shown to influence the seriousness of the accident. IMPACT ON INJURY PREVENTION: The results show that different training is needed, depending on the severity of accidents, for different ages, lengths of service in the company, organization of work, and working hours. The research provides insight into the likely causes of construction injuries in Spain. As a result of the analysis, industries and governmental agencies in Spain can begin to provide appropriate strategies and training to construction workers.

  8. Analysis of Sertraline in Postmortem Fluids and Tissues in 11 Aviation Accident Victims

    DTIC Science & Technology

    2012-11-01

    Sertraline likely undergoes significant postmortem redistribution. Key words: forensic toxicology, sertraline, norsertraline. The work involved toxicological analysis of specimens from aircraft accident fatalities as part of investigating general aviation and air carrier accidents.

  9. Assessment and prediction of road accident injuries trend using time-series models in Kurdistan.

    PubMed

    Parvareh, Maryam; Karimi, Asrin; Rezaei, Satar; Woldemichael, Abraha; Nili, Sairan; Nouri, Bijan; Nasab, Nader Esmail

    2018-01-01

    Road traffic accidents are commonly encountered incidents that can cause high-intensity injuries to the victims and have direct impacts on members of society. Iran has one of the highest incidence rates of road traffic accidents. The objective of this study was to model the patterns of road traffic accidents leading to injury in Kurdistan province, Iran. A time-series analysis was conducted to characterize and predict the frequency of road traffic accidents leading to injury in Kurdistan province. The injuries were categorized into three separate groups: car occupant, motorcyclist and pedestrian road traffic accident injuries. Box-Jenkins time-series analysis was used to model the injury observations, applying autoregressive integrated moving average (ARIMA) and seasonal autoregressive integrated moving average (SARIMA) models to data from March 2009 to February 2015, and to predict the accidents up to 24 months ahead (February 2017). The analysis was carried out using the R-3.4.2 statistical software package. A total of 5,199 pedestrian, 9,015 motorcyclist, and 28,906 car occupant accidents were observed. The mean numbers of car occupant, motorcyclist and pedestrian accident injuries observed were 401.01 (SD 32.78), 123.70 (SD 30.18) and 71.19 (SD 17.92) per year, respectively. The best models for the pattern of car occupant, motorcyclist, and pedestrian injuries were ARIMA(1,0,0), SARIMA(1,0,2)(1,0,0)₁₂, and SARIMA(1,1,1)(0,0,1)₁₂, respectively. The motorcyclist and pedestrian injuries showed a seasonal pattern with a peak during summer (August). The minimum frequencies for the motorcyclist and pedestrian injuries were observed during late autumn and early winter (December and January). Our findings revealed that the observed motorcyclist and pedestrian injuries had a seasonal pattern that was explained by air temperature changes over time. These findings call for close monitoring of the
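
    A minimal sketch of fitting one of the reported model orders, SARIMA(1,1,1)(0,0,1)₁₂, with the Python statsmodels package is shown below; the study itself used R-3.4.2, and the monthly series here is synthetic rather than the Kurdistan data.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        # Synthetic monthly injury counts with a summer peak (NOT the study data).
        rng = np.random.default_rng(0)
        idx = pd.date_range("2009-03-01", periods=72, freq="MS")
        counts = 60 + 15 * np.sin(2 * np.pi * (idx.month - 2) / 12) + rng.normal(0, 5, 72)
        series = pd.Series(counts, index=idx)

        # SARIMA(1,1,1)(0,0,1)_12, one of the model orders reported in the abstract.
        model = SARIMAX(series, order=(1, 1, 1), seasonal_order=(0, 0, 1, 12))
        result = model.fit(disp=False)
        forecast = result.forecast(steps=24)   # predict the next 24 months
        print(forecast.head())

    In practice the model orders would be chosen by inspecting ACF/PACF plots or information criteria rather than fixed in advance.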

  10. Statistical Analysis And Treatment Of Accident Black Spots: A Case Study Of Nandyal Mandal

    NASA Astrophysics Data System (ADS)

    Sudharshan Reddy, B.; Vishnu Vardhan Reddy, L.; Sreenivasa Reddy, G., Dr

    2017-08-01

    Background: Increased economic activity has raised consumption levels across the country, creating scope for an increase in travel and transportation. The growth in vehicle numbers over the last 10 years has put a lot of pressure on the existing roads, ultimately resulting in road accidents. Nandyal Mandal is located in the Kurnool district of Andhra Pradesh and, after Kurnool, is well developed in both the agricultural and industrial sectors. The 567 accidents that occurred at 143 locations over the last seven years show the severity of the accident problem in the Nandyal Mandal, and there is a need to treat the accident black spots to reduce accidents. Methods: Accident data for the last seven years (2010-2016) were collected from police stations. The Weighted Severity Index (WSI), a scientific method, was used to identify the accident black spots. Statistical analysis of the collected data was carried out using the chi-square test to determine the independence of accidents from other attributes, and a chi-square goodness-of-fit test was conducted to test whether the accidents occur by chance or follow a pattern. Results: WSI values were determined for the 143 locations, and locations with high WSI values were treated as accident black spots. Five black spots were taken up for field study; after field observations and interaction with the public, improvements were suggested for these black spots. No relationship was found between accident severity and attributes such as month, season, day, hour of the day or age group, except for type of vehicle. Road accidents are distributed throughout the year, month and season, but they are not distributed uniformly throughout the day.
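
    The weighted-severity-index ranking and the chi-square test described above could be reproduced along the following lines; the severity weights, accident counts and contingency table are illustrative placeholders, not the Nandyal data, and the actual WSI weighting scheme used in the study may differ.

        from scipy.stats import chi2_contingency

        # Weighted Severity Index per location: fatalities, grievous and minor
        # injuries weighted differently. Weights and counts are illustrative only.
        WEIGHTS = {"fatal": 6, "grievous": 3, "minor": 1}

        def wsi(fatal, grievous, minor):
            return (WEIGHTS["fatal"] * fatal
                    + WEIGHTS["grievous"] * grievous
                    + WEIGHTS["minor"] * minor)

        locations = {"junction_A": (3, 5, 10), "bridge_B": (1, 2, 4), "bypass_C": (5, 8, 2)}
        ranked = sorted(locations.items(), key=lambda kv: wsi(*kv[1]), reverse=True)
        print("black-spot ranking:", [name for name, _ in ranked])

        # Chi-square test of independence: accident severity vs. vehicle type
        # (hypothetical contingency table: rows = severity, columns = vehicle type).
        table = [[12, 30, 8],
                 [40, 95, 20]]
        chi2, p, dof, _ = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")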

  11. Automated Software Vulnerability Analysis

    NASA Astrophysics Data System (ADS)

    Sezer, Emre C.; Kil, Chongkyung; Ning, Peng

    Despite decades of research, software continues to have vulnerabilities. Successful exploitations of these vulnerabilities by attackers cost millions of dollars to businesses and individuals. Unfortunately, most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for timely defensive measures to be deployed. The problem is exacerbated by zero-day attacks.

  12. Accident Data Use and Geographic Information System (GIS)

    DOT National Transportation Integrated Search

    1998-09-16

    Project Description: The Cheyenne Area Transportation Planning Process (ChATTP) has developed a PowerPoint presentation demonstrating how to use an existing accident database with GIS software. The slides are followed by a hands-on demonstration...

  13. FunRich proteomics software analysis, let the fun begin!

    PubMed

    Benito-Martin, Alberto; Peinado, Héctor

    2015-08-01

    Protein MS analysis is the preferred method for unbiased protein identification. It is normally applied to a large number of both small-scale and high-throughput studies. However, user-friendly computational tools for protein analysis are still needed. In this issue, Mathivanan and colleagues (Proteomics 2015, 15, 2597-2601) report the development of FunRich software, an open-access software that facilitates the analysis of proteomics data, providing tools for functional enrichment and interaction network analysis of genes and proteins. FunRich is a reinterpretation of proteomic software, a standalone tool combining ease of use with customizable databases, free access, and graphical representations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    PubMed

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
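
    IFDOTMETER itself is a Java application; purely to illustrate the batch-then-spreadsheet workflow the abstract describes, a minimal Python analogue might look like the following. The folder name, file pattern and the threshold-based "bright-pixel fraction" are stand-ins for the real measurements (e.g., LC3-puncta counts), and are not taken from IFDOTMETER.

        import csv
        from pathlib import Path
        from PIL import Image

        def bright_pixel_fraction(path, threshold=128):
            """Toy stand-in for a real measurement: fraction of bright pixels."""
            img = Image.open(path).convert("L")          # greyscale
            pixels = list(img.getdata())
            return sum(p > threshold for p in pixels) / len(pixels)

        results = []
        for image_path in sorted(Path("images").glob("*.tif")):   # hypothetical folder
            results.append((image_path.name, bright_pixel_fraction(image_path)))

        with open("results.csv", "w", newline="") as fh:          # spreadsheet output
            writer = csv.writer(fh)
            writer.writerow(["image", "bright_fraction"])
            writer.writerows(results)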

  15. Analysis of electrical accidents and the related causes involving citizens who are served by the Western of Tehran

    PubMed Central

    Kalte, Haji Omid; Hosseini, Alireza Haji; Arabzadeh, Sara; Najafi, Hossein; Dehghan, Naser; Akbarzadeh, Arash; Keshavarz, Safiyeh; Karchani, Mohsen

    2014-01-01

    Background: Electrical burns account for a significant percentage of fatal accidents. Each year, a number of consumers in Iran suffer from electrical injuries due to technical problems, equipment failures, and the unauthorized use of electricity. The aim of this study was to examine the root causes of accidents that involved electricity in the district served by the Western Tehran Province Electricity Distribution Company. Methods: This was a descriptive study in which incidents involving electricity-related injuries were investigated among customers served by the Western Tehran Province Electricity Distribution Company. Therefore, we collected and analyzed incident reports filed by citizens from 2005 through the first half of 2009 in the Distribution Company’s coverage area, including Savejbolagh, Shahriyar, eastern Karaj, Qods City, southern Karaj, western Karaj, Malard, and Mehrshahr. The reported events were analyzed using SPSS software. Results: Exposure of electricity lines and unauthorized construction of residential houses in areas where there were medium- and low-voltage lines were responsible for 37% of the injuries. The findings showed that the highest rate of accidents occurred in 2008 and the first half of 2009. The highest rate of accidents occurred among people with a mean age of 35. Conclusion: The results from investigating the causes of electrical accidents emphasized the necessity of developing a culture of safety in communities, especially among employees who are engaged in occupations related to electricity, construction workers, and school children to reduce the rate of such accidents. PMID:25763153

  16. One-Click Data Analysis Software for Science Operations

    NASA Astrophysics Data System (ADS)

    Navarro, Vicente

    2015-12-01

    One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, the DAS is fully maintained and updated for new OS and library releases. Nonetheless, once a mission goes into the "legacy" phase, funds become very limited and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: PIA for ISO (1995), SAS for XMM-Newton (1999), HIPE for Herschel (2009) and EXIA for EXOSAT (1983). The following goals have guided the architecture: support for all operations, post-operations and archive/legacy phases; support for local (user's computer) and cloud environments (ESAC Cloud, Amazon AWS); support for expert users requiring full capabilities; and provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt in this project.

  17. [Comparative analysis of the radionuclide composition in fallout after the Chernobyl and the Fukushima accidents].

    PubMed

    Kotenko, K V; Shinkarev, S M; Abramov, Iu V; Granovskaia, E O; Iatsenko, V N; Gavrilin, Iu I; Margulis, U Ia; Garetskaia, O S; Imanaka, T; Khoshi, M

    2012-01-01

    The nuclear accident at the Fukushima Dai-ichi Nuclear Power Plant (NPP) (March 11, 2011), like the accident at the Chernobyl NPP (April 26, 1986), is rated at level 7 of the INES. It is therefore of interest to analyse the radionuclide composition of the fallout following both accidents. The results of spectrometric measurements were used in this comparative analysis. Two areas were considered for the Chernobyl accident: (1) the near zone of the fallout, the Belarusian part of the central spot extending up to 60 km around the Chernobyl NPP, and (2) the far zone of the fallout, the "Gomel-Mogilev" spot centred 200 km to the north-northeast of the damaged reactor. For the Fukushima accident, the near zone up to about 60 km was considered. The comparative analysis was done with respect to refractory radionuclides (95Zr, 95Nb, 141Ce, 144Ce), as well as the intermediate and volatile radionuclides 103Ru, 106Ru, 131I, 134Cs, 137Cs, 140La and 140Ba, and the results of the comparison are discussed. With respect to public exposure, the most important radionuclides are 131I and 137Cs. For both accidents the 131I/137Cs ratios in the soil samples considered lie in similar ranges: 3-50 for the Chernobyl samples and 5-70 for the Fukushima samples. As for the Chernobyl accident, a clear tendency was identified for the Fukushima accident for the 131I/137Cs ratio in the fallout to decrease as the 137Cs ground deposition density increases along the trace of the radioactive cloud. This appears to be a universal tendency for the 131I/137Cs ratio versus the 137Cs ground deposition density in fallout along the trace of a radioactive cloud after a severe NPP accident with radionuclide releases into the environment. This tendency is important for an objective reconstruction of 131I fallout based on the results of 137Cs measurements of soil samples carried out at

  18. Spacecraft Trajectory Analysis and Mission Planning Simulation (STAMPS) Software

    NASA Technical Reports Server (NTRS)

    Puckett, Nancy; Pettinger, Kris; Hallstrom, John; Brownfield, Dana; Blinn, Eric; Williams, Frank; Wiuff, Kelli; McCarty, Steve; Ramirez, Daniel; Lamotte, Nicole

    2014-01-01

    STAMPS simulates either three- or six-degree-of-freedom cases for all spacecraft flight phases using translated HAL flight software or generic GN&C models. Single or multiple trajectories can be simulated for use in optimization and dispersion analysis. It includes math models for the vehicle and environment, and currently features a "C" version of shuttle onboard flight software. The STAMPS software is used for mission planning and analysis within ascent/descent, rendezvous, proximity operations, and navigation flight design areas.

  19. Systems-based accident analysis in the led outdoor activity domain: application and evaluation of a risk management framework.

    PubMed

    Salmon, P; Williamson, A; Lenné, M; Mitsopoulos-Rubens, E; Rudin-Brown, C M

    2010-08-01

    Safety-compromising accidents occur regularly in the led outdoor activity domain. Formal accident analysis is an accepted means of understanding such events and improving safety. Despite this, there remains no universally accepted framework for collecting and analysing accident data in the led outdoor activity domain. This article presents an application of Rasmussen's risk management framework to the analysis of the Lyme Bay sea canoeing incident. This involved the development of an Accimap, the outputs of which were used to evaluate seven predictions made by the framework. The Accimap output was also compared to an analysis using an existing model from the led outdoor activity domain. In conclusion, the Accimap output was found to be more comprehensive and supported all seven of the risk management framework's predictions, suggesting that it shows promise as a theoretically underpinned approach for analysing, and learning from, accidents in the led outdoor activity domain. STATEMENT OF RELEVANCE: Accidents represent a significant problem within the led outdoor activity domain. This article presents an evaluation of a risk management framework that can be used to understand such accidents and to inform the development of accident countermeasures and mitigation strategies for the led outdoor activity domain.

  20. [Severe parachuting accident. Analysis of 122 cases].

    PubMed

    Krauss, U; Mischkowsky, T

    1993-06-01

    Based on a population of 122 severely injured patients, the causes of paragliding accidents and the patterns of injury are analyzed. A questionnaire was used to establish a sport-specific profile of the paragliding pilot. The lower limbs (55.7%) and the lower parts of the spine (45.9%) are the most frequently injured parts of the body. There is a high risk of multiple injuries after a single accident because of the tremendous axial forces involved. The standard of equipment was good in over 90% of the cases. Insufficient training and failure to take account of geographical and meteorological conditions are the main determinants of accidents sustained by paragliders, most of whom are young. Nevertheless, 80% of our patients want to continue paragliding. Finally, some advice is given on how to prevent paragliding accidents and injuries.

  1. Flexible Software Architecture for Visualization and Seismic Data Analysis

    NASA Astrophysics Data System (ADS)

    Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.

    2007-12-01

    Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem in the use of any particular software application specific to seismic data analysis: tuning commands, windows and hot-key combinations to the specific waveforms so as to fit their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities, such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft .NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can run the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.
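
    The plug-in idea described above, in which new waveform-processing routines are added without changing the platform itself, can be sketched as a small registry. The Python sketch below uses invented names and does not reflect the actual Geotool/.NET interfaces.

        from typing import Callable, Dict, List

        # A very small plug-in registry: signal-processing routines register
        # themselves by name and the host application discovers them at run time.
        PLUGINS: Dict[str, Callable[[List[float]], List[float]]] = {}

        def plugin(name: str):
            def register(func):
                PLUGINS[name] = func
                return func
            return register

        @plugin("demean")
        def demean(waveform):
            mean = sum(waveform) / len(waveform)
            return [x - mean for x in waveform]

        @plugin("rectify")
        def rectify(waveform):
            return [abs(x) for x in waveform]

        # Host code (e.g. the display layer) only knows about the registry.
        trace = [0.2, -1.0, 0.8, -0.3]
        for name, routine in PLUGINS.items():
            print(name, routine(trace))

    New routines can then be contributed simply by importing the host module and applying the decorator, which is the kind of extension-without-modification the open architecture aims for.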

  2. Introducing a New Software for Geodetic Analysis

    NASA Astrophysics Data System (ADS)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  3. Software development predictors, error analysis, reliability models and software metric analysis

    NASA Technical Reports Server (NTRS)

    Basili, Victor

    1983-01-01

    The use of dynamic characteristics as predictors for software development was studied. It was found that there are some significant factors that could be useful as predictors. From a study on software errors and complexity, it was shown that meaningful results can be obtained which allow insight into software traits and the environment in which it is developed. Reliability models were studied. The research included the field of program testing because the validity of some reliability models depends on the answers to some unanswered questions about testing. In studying software metrics, data collected from seven software engineering laboratory (FORTRAN) projects were examined and three effort reporting accuracy checks were applied to demonstrate the need to validate a data base. Results are discussed.

  4. An Analysis of U.S. Civil Rotorcraft Accidents by Cost and Injury (1990-1996)

    NASA Technical Reports Server (NTRS)

    Iseler, Laura; DeMaio, Joe; Rutkowski, Michael (Technical Monitor)

    2002-01-01

    A study of rotorcraft accidents was conducted to identify safety issues and research areas that might lead to a reduction in rotorcraft accidents and fatalities. The primary source of data was summaries of National Transportation Safety Board (NTSB) accident reports. From 1990 to 1996, the NTSB documented 1396 civil rotorcraft accidents in the United States in which 491 people were killed. The rotorcraft data were compared to airline and general aviation data to determine the relative safety of rotorcraft compared to other segments of the aviation industry. In-depth analysis of the rotorcraft data addressed demographics, mission, and operational factors. Rotorcraft were found to have an accident rate about ten times that of commercial airliners and about the same as that of general aviation. The likelihood that an accident would be fatal was about equal for all three classes of operation. The most dramatic division in rotorcraft accidents is between flights flown by private pilots versus professional pilots. Private pilots, flying low-cost aircraft in benign environments, have accidents that are due, in large part, to their own errors. Professional pilots, in contrast, are more likely to have accidents that are a result of exacting missions or use of specialized equipment. For both groups, judgement error is more likely to lead to a fatal accident than are other types of causes. Several approaches to improving the rotorcraft accident rate are recommended. These mostly address improvement in the training of new pilots and improving the safety awareness of private pilots.

  5. Analysis of Convair 990 rejected-takeoff accident with emphasis on decision making, training and procedures

    NASA Technical Reports Server (NTRS)

    Batthauer, Byron E.

    1987-01-01

    This paper analyzes a NASA Convair 990 (CV-990) accident with emphasis on rejected-takeoff (RTO) decision making, training, procedures, and accident statistics. The NASA Aircraft Accident Investigation Board was somewhat perplexed that an aircraft could be destroyed as a result of blown tires during the takeoff roll. To provide a better understanding of tire failure RTO's, The Board obtained accident reports, Federal Aviation Administration (FAA) studies, and other pertinent information related to the elements of this accident. This material enhanced the analysis process and convinced the Accident Board that high-speed RTO's in transport aircraft should be given more emphasis during pilot training. Pilots should be made aware of various RTO situations and statistics with emphasis on failed-tire RTO's. This background information could enhance the split-second decision-making process that is required prior to initiating an RTO.

  6. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su'ud, Zaki; Anshari, Rio

    A loss-of-coolant accident (LOCA) in a boiling water reactor (BWR), in particular during the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown was completed, cooling, at a much smaller level than in normal operation, is still needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA occurs in this condition, fuel and other core temperatures rise, which can lead to core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi case. In this study, numerical simulation was performed to calculate the pressure, composition, water level and temperature distribution in the reactor during this accident. Two coolant-regulating systems were operational on reactor unit 1 during the accident: the Isolation Condenser (IC) system and the Safety Relief Valves (SRV). The average mass flow of steam to the IC system in this event was 10 kg/s, which could keep the reactor core covered for about 3.2 hours, with the core fully uncovered 4.7 hours later. Two coolant-regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the Safety Relief Valves (SRV). The average coolant mass flow corresponding to this event was 20 kg/s, which could keep the core covered for about 73 hours, with the core fully uncovered 75 hours later. Three coolant-regulating systems were operational on reactor unit 3: the Reactor Core Isolation Cooling (RCIC) system, the High Pressure Coolant Injection (HPCI) system and the Safety Relief Valves (SRV). The average water mass flow corresponding to this event was 15 kg/s, which could keep the core covered for about 37 hours, with the core fully uncovered 40 hours later.

  7. Balloon crash damage and injuries: an analysis of 86 accidents, 2000-2004.

    PubMed

    de Voogt, Alexander J; van Doorn, Robert R A

    2006-05-01

    General aviation accounts for the majority of aviation crashes and casualties in the United States. The role of ballooning in these statistics is not regularly studied. Since 2001, the National Transportation Safety Board has made its accident reports more readily available, which presents opportunities for further study. This study analyzes and compares a 5-yr period of accident reports and includes an analysis of injuries and balloon damage in hot-air and gas balloon accidents. Balloon crash 2-page briefs and 5-page accident reports published by the National Transportation Safety Board for the 5-yr period 2000-2004 were examined. Data collected in the investigation of these crashes were analyzed and compared with the epidemiological data collected in earlier research. In 86 crashes during the 5-yr period, there were 4 fatalities and 75 people were seriously injured. Only one accident was reported involving a student pilot. Broken ankles and legs were the most commonly recorded serious injuries, but could not be linked to the severity of damage to the balloon. The absence of student pilot accidents may be explained by possibly stricter supervision. Balloon baskets and envelopes appear to be of sufficient quality to withstand crashes, but improving the protection of passengers during hard landings should help to decrease the number of serious injuries in ballooning.

  8. Differential maneuvering simulator data reduction and analysis software

    NASA Technical Reports Server (NTRS)

    Beasley, G. P.; Sigman, R. S.

    1972-01-01

    A multielement data reduction and analysis software package has been developed for use with the Langley differential maneuvering simulator (DMS). This package, which has several independent elements, was developed to support all phases of DMS aircraft simulation studies with a variety of both graphical and tabular information. The overall software package is considered unique because of the number, diversity, and sophistication of the element programs available for use in a single study. The purpose of this paper is to discuss the overall DMS data reduction and analysis package by reviewing the development of the various elements of the software, showing typical results that can be obtained, and discussing how each element can be used.

  9. Development of Automated Image Analysis Software for Suspended Marine Particle Classification

    DTIC Science & Technology

    2003-09-30

    Scott Samson, Center for Ocean Technology. Dates covered: 2003. The objective is to develop automated image analysis software to reduce the effort and time required for manual identification of plankton images.

  10. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    NASA Astrophysics Data System (ADS)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  11. Multibiodose radiation emergency triage categorization software.

    PubMed

    Ainsbury, Elizabeth A; Barnard, Stephen; Barrios, Lleonard; Fattibene, Paola; de Gelder, Virginie; Gregoire, Eric; Lindholm, Carita; Lloyd, David; Nergaard, Inger; Rothkamm, Kai; Romm, Horst; Scherthan, Harry; Thierens, Hubert; Vandevoorde, Charlot; Woda, Clemens; Wojcik, Andrzej

    2014-07-01

    In this note, the authors describe the MULTIBIODOSE software, which has been created as part of the MULTIBIODOSE project. The software enables doses estimated by networks of laboratories, using up to five retrospective (biological and physical) assays, to be combined to give a single estimate of triage category for each individual potentially exposed to ionizing radiation in a large scale radiation accident or incident. The MULTIBIODOSE software has been created in Java. The usage of the software is based on the MULTIBIODOSE Guidance: the program creates a link to a single SQLite database for each incident, and the database is administered by the lead laboratory. The software has been tested with Java runtime environment 6 and 7 on a number of different Windows, Mac, and Linux systems, using data from a recent intercomparison exercise. The Java program MULTIBIODOSE_1.0.jar is freely available to download from http://www.multibiodose.eu/software or by contacting the software administrator: MULTIBIODOSE-software@gmx.com.

  12. Software analysis in the semantic web

    NASA Astrophysics Data System (ADS)

    Taylor, Joshua; Hall, Robert T.

    2013-05-01

    Many approaches in software analysis, particularly dynamic malware analysis, benefit greatly from the use of linked data and other Semantic Web technology. In this paper, we describe AIS, Inc.'s Semantic Extractor (SemEx) component from the Malware Analysis and Attribution through Genetic Information (MAAGI) effort, funded under DARPA's Cyber Genome program. The SemEx generates OWL-based semantic models of high and low level behaviors in malware samples from system call traces generated by AIS's introspective hypervisor, IntroVirt™. Within MAAGI, these semantic models were used by modules that cluster malware samples by functionality, and construct "genealogical" malware lineages. Herein, we describe the design, implementation, and use of the SemEx, as well as the C2DB, an OWL ontology used for representing software behavior and cyber-environments.

  13. An association between dietary habits and traffic accidents in patients with chronic liver disease: A data-mining analysis

    PubMed Central

    KAWAGUCHI, TAKUMI; SUETSUGU, TAKURO; OGATA, SHYOU; IMANAGA, MINAMI; ISHII, KUMIKO; ESAKI, NAO; SUGIMOTO, MASAKO; OTSUYAMA, JYURI; NAGAMATSU, AYU; TANIGUCHI, EITARO; ITOU, MINORU; ORIISHI, TETSUHARU; IWASAKI, SHOKO; MIURA, HIROKO; TORIMURA, TAKUJI

    2016-01-01

    The incidence of traffic accidents in patients with chronic liver disease (CLD) is high in the USA. However, the characteristics of patients, including dietary habits, differ between Japan and the USA. The present study investigated the incidence of traffic accidents in CLD patients and the clinical profiles associated with traffic accidents in Japan using a data-mining analysis. A cross-sectional study was performed and 256 subjects [148 CLD patients (CLD group) and 106 patients with other digestive diseases (disease control group)] were enrolled; 2 patients were excluded. The incidence of traffic accidents was compared between the two groups. Independent factors for traffic accidents were analyzed using logistic regression and decision-tree analyses. The incidence of traffic accidents did not differ between the CLD and disease control groups (8.8 vs. 11.3%). The results of the logistic regression analysis showed that yoghurt consumption was the only independent risk factor for traffic accidents (odds ratio, 0.37; 95% confidence interval, 0.16–0.85; P=0.0197). Similarly, the results of the decision-tree analysis showed that yoghurt consumption was the initial divergence variable. In patients who consumed yoghurt habitually, the incidence of traffic accidents was 6.6%, while that in patients who did not consume yoghurt was 16.0%. CLD was not identified as an independent factor in the logistic regression and decision-tree analyses. In conclusion, the difference in the incidence of traffic accidents in Japan between the CLD and disease control groups was insignificant. Furthermore, yoghurt consumption was an independent negative risk factor for traffic accidents in patients with digestive diseases, including CLD. PMID:27123257

  14. An association between dietary habits and traffic accidents in patients with chronic liver disease: A data-mining analysis.

    PubMed

    Kawaguchi, Takumi; Suetsugu, Takuro; Ogata, Shyou; Imanaga, Minami; Ishii, Kumiko; Esaki, Nao; Sugimoto, Masako; Otsuyama, Jyuri; Nagamatsu, Ayu; Taniguchi, Eitaro; Itou, Minoru; Oriishi, Tetsuharu; Iwasaki, Shoko; Miura, Hiroko; Torimura, Takuji

    2016-05-01

    The incidence of traffic accidents in patients with chronic liver disease (CLD) is high in the USA. However, the characteristics of patients, including dietary habits, differ between Japan and the USA. The present study investigated the incidence of traffic accidents in CLD patients and the clinical profiles associated with traffic accidents in Japan using a data-mining analysis. A cross-sectional study was performed and 256 subjects [148 CLD patients (CLD group) and 106 patients with other digestive diseases (disease control group)] were enrolled; 2 patients were excluded. The incidence of traffic accidents was compared between the two groups. Independent factors for traffic accidents were analyzed using logistic regression and decision-tree analyses. The incidence of traffic accidents did not differ between the CLD and disease control groups (8.8 vs. 11.3%). The results of the logistic regression analysis showed that yoghurt consumption was the only independent risk factor for traffic accidents (odds ratio, 0.37; 95% confidence interval, 0.16-0.85; P=0.0197). Similarly, the results of the decision-tree analysis showed that yoghurt consumption was the initial divergence variable. In patients who consumed yoghurt habitually, the incidence of traffic accidents was 6.6%, while that in patients who did not consume yoghurt was 16.0%. CLD was not identified as an independent factor in the logistic regression and decision-tree analyses. In conclusion, the difference in the incidence of traffic accidents in Japan between the CLD and disease control groups was insignificant. Furthermore, yoghurt consumption was an independent negative risk factor for traffic accidents in patients with digestive diseases, including CLD.
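
    The risk-factor screening reported in the two records above combines logistic regression with a decision tree. A minimal sketch of the logistic-regression step is shown below; the data frame, column names and values are assumptions made purely for illustration, not the study's data.

```python
# Minimal sketch of a logistic-regression screen for accident risk factors.
# The columns (accident, yoghurt, cld) and the synthetic data are assumed
# for illustration only; they are not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "accident": np.random.binomial(1, 0.10, 254),   # 1 = traffic accident reported
    "yoghurt":  np.random.binomial(1, 0.60, 254),   # 1 = habitual yoghurt consumption
    "cld":      np.random.binomial(1, 0.58, 254),   # 1 = chronic liver disease
})

X = sm.add_constant(df[["yoghurt", "cld"]])
model = sm.Logit(df["accident"], X).fit(disp=0)

# Report odds ratios with 95% confidence intervals
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```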

  15. Explorative spatial analysis of traffic accident statistics and road mortality among the provinces of Turkey.

    PubMed

    Erdogan, Saffet

    2009-10-01

    The aim of the study is to describe the inter-province differences in traffic accidents and mortality on the roads of Turkey. Two different risk indicators were used to evaluate the road safety performance of the provinces in Turkey. These indicators are the ratios between the number of persons killed in road traffic accidents (1) or the number of accidents (2) (numerators) and the exposure to traffic risk (denominator). Population and the number of registered motor vehicles in the provinces were used as denominators individually. Spatial analyses were performed on the mean annual rate of deaths and on the number of fatal accidents calculated for the period 2001-2006. Empirical Bayes smoothing was used to remove background noise from the raw death and accident rates, because some provinces are sparsely populated and have small numbers of accidents and deaths. Global and local spatial autocorrelation analyses were performed to show whether the provinces with high death and accident rates cluster or are merely located close to each other by chance. The spatial distribution of provinces with high rates of deaths and accidents was nonrandom and was detected as clustered with significance of P<0.05 by the spatial autocorrelation analyses. Regions with a high concentration of fatal accidents and deaths were located in the provinces containing the roads connecting the Istanbul, Ankara, and Antalya provinces. Accident and death rates were also modeled with independent variables such as the number of motor vehicles, length of roads, and so forth, using geographically weighted regression analysis with forward step-wise elimination. The level of statistical significance was taken as P<0.05. Large differences were found between the provinces' death and accident rates depending on the denominator used. The geographically weighted regression analyses produced significantly better predictions for both accident rates and death rates than did ordinary least squares regression.
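
    Empirical Bayes smoothing of the kind mentioned above shrinks unstable rates from small provinces toward the overall mean. The sketch below implements the standard method-of-moments estimator on made-up province data; it is an illustration of the technique, not the study's code.

```python
# Method-of-moments empirical Bayes smoothing of area-level death rates.
# The province counts and populations below are invented for illustration.
import numpy as np

deaths = np.array([120.0, 15.0, 430.0, 8.0, 60.0])      # mean annual deaths per province
population = np.array([2.1e6, 0.2e6, 7.5e6, 0.1e6, 1.0e6])

raw_rate = deaths / population
m = deaths.sum() / population.sum()                       # global mean rate
# Population-weighted variance of the raw rates around the global mean
s2 = np.sum(population * (raw_rate - m) ** 2) / population.sum()
a = max(s2 - m / population.mean(), 0.0)                  # between-area variance estimate

weights = a / (a + m / population)                        # small provinces are shrunk more
eb_rate = weights * raw_rate + (1.0 - weights) * m
print(np.round(eb_rate * 1e5, 2))                         # smoothed rate per 100,000
```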

  16. Pandora Operation and Analysis Software

    NASA Technical Reports Server (NTRS)

    Herman, Jay; Cede, Alexander; Abuhassan, Nader

    2012-01-01

    Pandora Operation and Analysis Software controls the Pandora Sun- and sky-pointing optical head and built-in filter wheels (neutral density, UV bandpass, polarization filters, and opaque). The software also controls the attached spectrometer exposure time and thermoelectric cooler to maintain the spectrometer temperature to within 1 °C. All functions are available through a GUI so as to be easily accessible by the user. The data are automatically stored on a miniature computer (netbook) for automatic download to a designated server at user-defined intervals (once per day, once per week, etc.), or to a USB external device. An additional software component reduces the raw data (spectrometer counts) to preliminary scientific products for quick-view purposes. The Pandora systems are built from off-the-shelf commercial parts and from mechanical parts machined using electronic machine shop drawings. The Pandora spectrometer system is designed to look at the Sun (tracking to within 0.1°), or to look at the sky at any zenith or azimuth angle, to gather information about the amount of trace gases or aerosols that are present.

  17. PuMA: the Porous Microstructure Analysis software

    NASA Astrophysics Data System (ADS)

    Ferguson, Joseph C.; Panerai, Francesco; Borner, Arnaud; Mansour, Nagi N.

    2018-01-01

    The Porous Microstructure Analysis (PuMA) software has been developed in order to compute effective material properties and perform material response simulations on digitized microstructures of porous media. PuMA is able to import digital three-dimensional images obtained from X-ray microtomography or to generate artificial microstructures. PuMA also provides a module for interactive 3D visualizations. Version 2.1 includes modules to compute porosity, volume fractions, and surface area. Two finite difference Laplace solvers have been implemented to compute the continuum tortuosity factor, effective thermal conductivity, and effective electrical conductivity. A random walk method has been developed to compute tortuosity factors from the continuum to rarefied regimes. Representative elementary volume analysis can be performed on each property. The software also includes a time-dependent, particle-based model for the oxidation of fibrous materials. PuMA was developed for Linux operating systems and is available as NASA software under a US & Foreign release.
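
    Porosity is the simplest of the properties listed above and illustrates what computing an effective property on a digitized microstructure means in practice. The sketch below uses a synthetic binary voxel image; it is not the PuMA API, just the underlying definition.

```python
# Porosity and solid volume fraction from a segmented 3D voxel image.
# This is a generic illustration of the computation, not PuMA itself.
import numpy as np

# Synthetic 100^3 binary microstructure: 1 = solid voxel, 0 = void voxel
rng = np.random.default_rng(0)
image = (rng.random((100, 100, 100)) < 0.15).astype(np.uint8)

porosity = np.count_nonzero(image == 0) / image.size
solid_fraction = 1.0 - porosity
print(f"porosity = {porosity:.3f}, solid volume fraction = {solid_fraction:.3f}")
```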

  18. Analysis of accidents in nine Iranian gas refineries: 2007-2011.

    PubMed

    Mehrdad, R; Bolouri, A; Shakibmanesh, A R

    2013-10-01

    Occupational accidents are one of the major health hazards in industry and are associated with high mortality, morbidity, spiritual damage and economic losses worldwide. To determine the incidence of occupational accidents in 9 Iranian gas refineries between March 2007 and February 2011. Data on all occupational accidents that occurred between March 2007 and February 2011, as well as other possible associated variables including time of accident, whether the accident was due to a personal or systemic fault, type of accident and its outcomes, age and gender of the victim, the injured parts of the body, job experience, and type of employment, were extracted from HSE reports and notes of health care services. Based on these data, we calculated the incidence rate of accidents and assessed the associated factors. During the 5 studied years, 1129 accidents were recorded. The incidence of fatal accidents was 1.64 per 100 000 and of nonfatal accidents was 1857 per 100 000 workers per year. 99.4% of injured workers were male. The mean±SD age of injured people was 29.6±7.3 years. Almost 70% of injured workers were aged under 30 years. The mean±SD job experience was 5.3±5.3 years. Accidents occurred more commonly around 10:00. More than 60% of accidents happened between 8:00 and 15:00. July had the highest incidence rate. The most common type of accident was being struck by an object (48%). More than 94% of accidents were caused by personal rather than systemic faults. Hands and wrists were the most commonly injured parts and were involved in more than one-third of accidents. 70% of injured workers needed medical treatment and returned to work after primary treatment. The pattern of occupational accidents in Iranian gas refineries is similar to previous reports in many ways. The incidence did not change significantly over the study period. Establishment of an online network for precise registration, notification and meticulous data collection seems necessary.

  19. Software for computerised analysis of cardiotocographic traces.

    PubMed

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, new software for automated analysis of foetal heart rate is presented. It provides an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In the artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. In the case of real signals, the results of the computerised analysis were compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to gain information about the performance of the proposed software. The software showed preliminary performance we judged satisfactory in that the results completely matched the requirements, as proved by tests on artificial signals in which all simulated events were detected by the software. Performance indexes computed in comparison with the obstetricians' evaluations are, on the contrary, not so satisfactory; in fact, they yielded the following values of the statistical parameters: sensitivity equal to 93%, positive predictive value equal to 82% and accuracy equal to 77%. Very probably this arises from the high variability of the trace annotations made by clinicians. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
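
    The performance indexes quoted above follow directly from the agreement counts between the software's detections and the clinicians' annotations. The sketch below shows the definitions with invented counts; it does not reproduce the study's exact figures.

```python
# Sensitivity, positive predictive value and accuracy from confusion counts.
# The counts below are illustrative only.
def performance_indexes(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)                 # detected events / all true events
    ppv = tp / (tp + fp)                         # true detections / all detections
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, ppv, accuracy

sens, ppv, acc = performance_indexes(tp=93, fp=20, fn=7, tn=15)
print(f"sensitivity={sens:.0%}, PPV={ppv:.0%}, accuracy={acc:.0%}")
```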

  20. Accident patterns for construction-related workers: a cluster analysis

    NASA Astrophysics Data System (ADS)

    Liao, Chia-Wen; Tyan, Yaw-Yauan

    2012-01-01

    The construction industry has been identified as one of the most hazardous industries. The risk to construction-related workers is far greater than that in a manufacturing-based industry. However, some steps can be taken to reduce worker risk through effective injury prevention strategies. In this article, k-means clustering methodology is employed in specifying the factors related to different worker types and in identifying the patterns of industrial occupational accidents. Accident reports during the period 1998 to 2008 are extracted from case reports of the Northern Region Inspection Office of the Council of Labor Affairs of Taiwan. The results show that the cluster analysis can indicate some patterns of occupational injuries in the construction industry. Inspection plans should be proposed according to the type of construction-related workers. The findings provide a direction for more effective inspection strategies and injury prevention programs.

  1. Accident patterns for construction-related workers: a cluster analysis

    NASA Astrophysics Data System (ADS)

    Liao, Chia-Wen; Tyan, Yaw-Yauan

    2011-12-01

    The construction industry has been identified as one of the most hazardous industries. The risk to construction-related workers is far greater than that in a manufacturing-based industry. However, some steps can be taken to reduce worker risk through effective injury prevention strategies. In this article, k-means clustering methodology is employed in specifying the factors related to different worker types and in identifying the patterns of industrial occupational accidents. Accident reports during the period 1998 to 2008 are extracted from case reports of the Northern Region Inspection Office of the Council of Labor Affairs of Taiwan. The results show that the cluster analysis can indicate some patterns of occupational injuries in the construction industry. Inspection plans should be proposed according to the type of construction-related workers. The findings provide a direction for more effective inspection strategies and injury prevention programs.
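
    The clustering step described in the two records above groups accident reports by categorical attributes. The sketch below shows one common way to do this with k-means after one-hot encoding; the attribute names and values are invented for illustration and are not taken from the Taiwanese case reports.

```python
# k-means clustering of categorical accident records after one-hot encoding.
# The records below are made up for illustration only.
import pandas as pd
from sklearn.cluster import KMeans

records = pd.DataFrame({
    "worker_type":   ["scaffolder", "electrician", "laborer", "scaffolder", "laborer"],
    "accident_type": ["fall", "electrocution", "struck_by", "fall", "collapse"],
    "severity":      ["fatal", "serious", "serious", "serious", "fatal"],
})

# One-hot encode the categorical attributes so Euclidean distance is meaningful
features = pd.get_dummies(records)
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
records["cluster"] = kmeans.labels_
print(records)
```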

  2. Combining task analysis and fault tree analysis for accident and incident analysis: a case study from Bulgaria.

    PubMed

    Doytchev, Doytchin E; Szwillus, Gerd

    2009-11-01

    Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand the human behaviour in incident occurrence we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter analyses the way people perform the tasks in their work environment and how they interact with machines or colleagues. These methods were complemented with the use of the Human Error Identification in System Tools (HEIST) methodology and the concept of Performance Shaping Factors (PSF) to deepen the insight into the error modes of an operator's behaviour. HEIST shows the external error modes that caused the human error and the factors that prompted the human to err. To show the validity of the approach, a case study at a Bulgarian Hydro power plant was carried out. An incident - the flooding of the plant's basement - was analysed by combining the afore-mentioned methods. The case study shows that Task Analysis in combination with other methods can be applied successfully to human error analysis, revealing details about erroneous actions in a realistic situation.

  3. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to software security, and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties of the Secure Socket Layer (SSL) communication protocol as a demonstration.

  4. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    PubMed

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ∼10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to

  5. a Study of the Reconstruction of Accidents and Crime Scenes Through Computational Experiments

    NASA Astrophysics Data System (ADS)

    Park, S. J.; Chae, S. W.; Kim, S. H.; Yang, K. M.; Chung, H. S.

    Recently, with an increase in the number of studies of the safety of both pedestrians and passengers, computer software, such as MADYMO, Pam-crash, and LS-dyna, has been providing human models for computer simulation. Although such programs have been applied to make machines beneficial for humans, studies that analyze the reconstruction of accidents or crime scenes are rare. Therefore, through computational experiments, the present study presents reconstructions of two questionable accidents. In the first case, a car fell off the road and the driver was separated from it. The accident investigator was very confused because some circumstantial evidence suggested the possibility that the driver was murdered. In the second case, a woman died in her house and the police suspected foul play with her boyfriend as a suspect. These two cases were reconstructed using the human model in MADYMO software. The first case was eventually confirmed as a traffic accident in which the driver bounced out of the car when the car fell off, and the second case was proved to be suicide rather than homicide.

  6. Analysis of occupational accidents: prevention through the use of additional technical safety measures for machinery.

    PubMed

    Dźwiarek, Marek; Latała, Agata

    2016-01-01

    This article presents an analysis of results of 1035 serious and 341 minor accidents recorded by Poland's National Labour Inspectorate (PIP) in 2005-2011, in view of their prevention by means of additional safety measures applied by machinery users. Since the analysis aimed at formulating principles for the application of technical safety measures, the analysed accidents should bear additional attributes: the type of machine operation, technical safety measures and the type of events causing injuries. The analysis proved that the executed tasks and injury-causing events were closely connected and there was a relation between casualty events and technical safety measures. In the case of tasks consisting of manual feeding and collecting materials, the injuries usually occur because of the rotating motion of tools or crushing due to a closing motion. Numerous accidents also happened in the course of supporting actions, like removing pollutants, correcting material position, cleaning, etc.

  7. Utilization of accident databases and fuzzy sets to estimate frequency of HazMat transport accidents.

    PubMed

    Qiao, Yuanhua; Keren, Nir; Mannan, M Sam

    2009-08-15

    Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
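
    The first stage described above regresses accident counts on route-dependent variables with a negative binomial model. The sketch below shows that stage on synthetic data with assumed covariates; the fuzzy-logic treatment of route-independent variables is not reproduced here.

```python
# Negative binomial regression of accident counts on route-dependent variables.
# The covariates (traffic, length_km) and data are assumed for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
traffic = rng.uniform(1, 10, n)            # e.g., traffic volume in thousands of vehicles
length_km = rng.uniform(0.5, 20, n)        # route segment length
X = sm.add_constant(np.column_stack([traffic, length_km]))
counts = rng.poisson(0.1 * traffic + 0.05 * length_km)   # synthetic accident counts

nb_model = sm.GLM(counts, X, family=sm.families.NegativeBinomial()).fit()
print(nb_model.summary())
```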

  8. [Analysis of radiation-hygienic and medical consequences of the Chernobyl accident].

    PubMed

    Onishchenko, G G

    2013-01-01

    More than 25 years have passed since the Chernobyl accident of 1986. Fourteen subjects of the Russian Federation, with a total area of more than 50,000 km2 and a current population of 1.5 million people, were exposed to radioactive contamination. A system for the comprehensive evaluation of radiation doses to the population affected by the Chernobyl accident, comprising 11 guidance documents, has now been created. It methodically supports work on the assessment of average annual, accumulated and predicted radiation doses to the population and its critical groups, as well as doses to the thyroid gland. The relevance of analysing the consequences of the Chernobyl accident is demonstrated by the events in Japan at the Fukushima-1 nuclear power plant. In 2011-2012, comprehensive maritime expeditions were carried out under the auspices of the Russian Geographical Society with the participation of relevant ministries and agencies and leading academic institutions of Russia. In 2012, work was carried out on radiation protection of the population from the potential transboundary impact of the accident at the Japanese nuclear power plant Fukushima-1. The results provide a basis for a favorable outlook for the radiation environment in the Russian Far East and on the Pacific coast of Russia.

  9. Routes to failure: analysis of 41 civil aviation accidents from the Republic of China using the human factors analysis and classification system.

    PubMed

    Li, Wen-Chin; Harris, Don; Yu, Chung-San

    2008-03-01

    The human factors analysis and classification system (HFACS) is based upon Reason's organizational model of human error. HFACS was developed as an analytical framework for the investigation of the role of human error in aviation accidents, however, there is little empirical work formally describing the relationship between the components in the model. This research analyses 41 civil aviation accidents occurring to aircraft registered in the Republic of China (ROC) between 1999 and 2006 using the HFACS framework. The results show statistically significant relationships between errors at the operational level and organizational inadequacies at both the immediately adjacent level (preconditions for unsafe acts) and higher levels in the organization (unsafe supervision and organizational influences). The pattern of the 'routes to failure' observed in the data from this analysis of civil aircraft accidents show great similarities to that observed in the analysis of military accidents. This research lends further support to Reason's model that suggests that active failures are promoted by latent conditions in the organization. Statistical relationships linking fallible decisions in upper management levels were found to directly affect supervisory practices, thereby creating the psychological preconditions for unsafe acts and hence indirectly impairing the performance of pilots, ultimately leading to accidents.
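
    The statistical relationships reported above are associations between error categories at adjacent HFACS levels. A minimal sketch of one such test is shown below: a chi-square test on a 2x2 contingency table of invented counts, illustrating the kind of analysis rather than the paper's actual data.

```python
# Chi-square test of association between categories at two HFACS levels.
# The contingency-table counts are invented for illustration.
import numpy as np
from scipy.stats import chi2_contingency

# Rows:    unsafe supervision present / absent
# Columns: precondition for unsafe acts present / absent
table = np.array([[18, 4],
                  [7, 12]])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p_value:.4f}")
```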

  10. Design and validation of Segment--freely available software for cardiovascular image analysis.

    PubMed

    Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan

    2010-01-11

    Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http://segment.heiberg.se. Segment is

  11. Spotlight-8 Image Analysis Software

    NASA Technical Reports Server (NTRS)

    Klimek, Robert; Wright, Ted

    2006-01-01

    Spotlight is a cross-platform GUI-based software package designed to perform image analysis on sequences of images generated by combustion and fluid physics experiments run in a microgravity environment. Spotlight can perform analysis on a single image in an interactive mode or perform analysis on a sequence of images in an automated fashion. Image processing operations can be employed to enhance the image before various statistics and measurement operations are performed. An arbitrarily large number of objects can be analyzed simultaneously with independent areas of interest. Spotlight saves results in a text file that can be imported into other programs for graphing or further analysis. Spotlight can be run on Microsoft Windows, Linux, and Apple OS X platforms.

  12. Modelling Accident Tolerant Fuel Concepts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hales, Jason Dean; Gamble, Kyle Allan Lawrence

    2016-05-01

    The catastrophic events that occurred at the Fukushima-Daiichi nuclear power plant in 2011 have led to widespread interest in research into alternative fuels and claddings that are proposed to be accident tolerant. The United States Department of Energy (DOE), through its Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, has funded an Accident Tolerant Fuel (ATF) High Impact Problem (HIP). The ATF HIP is a three-year project to perform research on two accident tolerant concepts. The final outcome of the ATF HIP will be an in-depth report to the DOE Advanced Fuels Campaign (AFC) giving a recommendation on whether either of the two concepts should be included in their lead test assembly scheduled for placement into a commercial reactor in 2022. The two ATF concepts under investigation in the HIP are uranium silicide fuel and iron-chromium-aluminum (FeCrAl) alloy cladding. Utilizing the expertise of three national laboratory participants (Idaho National Laboratory, Los Alamos National Laboratory, and Argonne National Laboratory), a comprehensive multiscale approach to modeling is being used that includes atomistic modeling, molecular dynamics, rate theory, phase-field, and fuel performance simulations. Model development and fuel performance analysis are critical since a full suite of experimental studies will not be complete before AFC must prioritize concepts for focused development. In this paper, we present simulations of the two proposed accident tolerant fuel systems: U3Si2 fuel with Zircaloy-4 cladding, and UO2 fuel with FeCrAl cladding. Sensitivity analyses are completed using Sandia National Laboratories' Dakota software to determine which input parameters (e.g., fuel specific heat) have the greatest influence on the output metrics of interest (e.g., fuel centerline temperature). We also outline the multiscale modelling approach being employed. Considerable additional work is required prior to preparing the recommendation report for the
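
    The sensitivity analysis described above ranks input parameters by their influence on output metrics such as fuel centerline temperature. The sketch below is a simple one-at-a-time perturbation study on a made-up surrogate model; it stands in for the Dakota-driven workflow and is not the actual fuel performance code.

```python
# One-at-a-time sensitivity sketch on a made-up surrogate for fuel centerline
# temperature; this is only an illustration of the idea, not the real model.
import numpy as np

def centerline_temp(specific_heat, conductivity, power_density):
    # Toy surrogate: higher conductivity lowers the temperature rise,
    # higher power density raises it; specific heat matters little at steady state.
    return 600.0 + power_density / (4.0 * np.pi * conductivity) + 0.01 * specific_heat

nominal = dict(specific_heat=300.0, conductivity=4.5, power_density=25e3)
base = centerline_temp(**nominal)

for name, value in nominal.items():
    perturbed = dict(nominal, **{name: value * 1.10})     # +10% perturbation
    delta = centerline_temp(**perturbed) - base
    print(f"{name:>14}: dT = {delta:+.1f} K for a +10% change")
```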

  13. PSGMiner: A modular software for polysomnographic analysis.

    PubMed

    Umut, İlhan

    2016-06-01

    Sleep disorders affect a great percentage of the population. The diagnosis of these disorders is usually made by polysomnography. This paper details the development of new software to carry out feature extraction in order to perform robust analysis and classification of sleep events using polysomnographic data. The software, called PSGMiner, is a tool, which visualizes, processes and classifies bioelectrical data. The purpose of this program is to provide researchers with a platform with which to test new hypotheses by creating tests to check for correlations that are not available in commercially available software. The software is freely available under the GPL3 License. PSGMiner is composed of a number of diverse modules such as feature extraction, annotation, and machine learning modules, all of which are accessible from the main module. Using the software, it is possible to extract features of polysomnography using digital signal processing and statistical methods and to perform different analyses. The features can be classified through the use of five classification algorithms. PSGMiner offers an architecture designed for integrating new methods. Automatic scoring, which is available in almost all commercial PSG software, is not inherently available in this program, though it can be implemented by two different methodologies (machine learning and algorithms). While similar software focuses on a certain signal or event composed of a small number of modules with no expansion possibility, the software introduced here can handle all polysomnographic signals and events. The software simplifies the processing of polysomnographic signals for researchers and physicians that are not experts in computer programming. It can find correlations between different events which could help predict an oncoming event such as sleep apnea. The software could also be used for educational purposes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. An Analysis of Mission Critical Computer Software in Naval Aviation

    DTIC Science & Technology

    1991-03-01

    This research examined whether original software development schedules in naval aviation were sustained without milestone changes, and whether software released to the fleet contained any major defects. It revealed that only about half of the original software development schedules were sustained without a milestone change being made.

  15. A tool to include gamma analysis software into a quality assurance program.

    PubMed

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions when appropriately sampled images were analysed. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
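
    The gamma index combines a dose-difference (DD) criterion with a distance-to-agreement (DTA) criterion, and a point passes when the combined value is at most 1. The sketch below is a generic 1D global gamma calculation on synthetic profiles; it is not any of the evaluated packages' algorithms.

```python
# Generic 1D global gamma index between a reference and an evaluated dose profile.
import numpy as np

def gamma_1d(dose_ref, dose_eval, positions, dta_mm=3.0, dd_frac=0.03):
    norm = dose_ref.max()                                  # global normalisation
    gammas = np.empty_like(dose_ref)
    for i, (x_r, d_r) in enumerate(zip(positions, dose_ref)):
        dist = (positions - x_r) / dta_mm                  # scaled spatial offsets
        dose_diff = (dose_eval - d_r) / (dd_frac * norm)   # scaled dose differences
        gammas[i] = np.sqrt(dist ** 2 + dose_diff ** 2).min()
    return gammas

x = np.linspace(0, 100, 201)                               # positions in mm
ref = np.exp(-((x - 50) / 20) ** 2)
ev = np.exp(-((x - 51) / 20) ** 2) * 1.01                  # shifted, rescaled copy
passing_rate = np.mean(gamma_1d(ref, ev, x) <= 1.0) * 100
print(f"gamma passing rate: {passing_rate:.1f}%")
```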

  16. Professional experience and traffic accidents/near-miss accidents among truck drivers.

    PubMed

    Girotto, Edmarlon; Andrade, Selma Maffei de; González, Alberto Durán; Mesas, Arthur Eumann

    2016-10-01

    To investigate the relationship between the time working as a truck driver and the report of involvement in traffic accidents or near-miss accidents. A cross-sectional study was performed with truck drivers transporting products from the Brazilian grain harvest to the Port of Paranaguá, Paraná, Brazil. The drivers were interviewed regarding sociodemographic characteristics, working conditions, behavior in traffic and involvement in accidents or near-miss accidents in the previous 12 months. Subsequently, the participants answered a self-applied questionnaire on substance use. The time of professional experience as drivers was categorized in tertiles. Statistical analyses were performed through the construction of models adjusted by multinomial regression to assess the relationship between the length of experience as a truck driver and the involvement in accidents or near-miss accidents. This study included 665 male drivers with an average age of 42.2 (±11.1) years. Among them, 7.2% and 41.7% of the drivers reported involvement in accidents and near-miss accidents, respectively. In fully adjusted analysis, the 3rd tertile of professional experience (>22years) was shown to be inversely associated with involvement in accidents (odds ratio [OR] 0.29; 95% confidence interval [CI] 0.16-0.52) and near-miss accidents (OR 0.17; 95% CI 0.05-0.53). The 2nd tertile of professional experience (11-22 years) was inversely associated with involvement in accidents (OR 0.63; 95% CI 0.40-0.98). An evident relationship was observed between longer professional experience and a reduction in reporting involvement in accidents and near-miss accidents, regardless of age, substance use, working conditions and behavior in traffic. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. SESAME: a software tool for the numerical dosimetric reconstruction of radiological accidents involving external sources and its application to the accident in Chile in December 2005.

    PubMed

    Huet, C; Lemosquet, A; Clairand, I; Rioual, J B; Franck, D; de Carlan, L; Aubineau-Lanièce, I; Bottollier-Depois, J F

    2009-01-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. This dose distribution can be assessed by physical dosimetric reconstruction methods, which can be implemented using experimental or numerical techniques. This article presents SESAME (Simulation of External Source Accident with MEdical images), a laboratory-developed tool specific to the dosimetric reconstruction of radiological accidents through numerical simulations that combine voxel geometry with the MCNP(X) Monte Carlo radiation-matter interaction code. The experimental validation of the tool using a photon field and its application to a radiological accident in Chile in December 2005 are also described.

  18. Implementing Software Safety in the NASA Environment

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer-related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of

  19. Causative factors and countermeasures for rural and suburban pedestrian accidents : accident data collection and analysis

    DOT National Transportation Integrated Search

    1977-03-01

    The objectives of the study were to collect and analyze data on rural pedestrian accidents and to identify potential countermeasures. Data on a stratified random sample of over 1,500 rural and suburban accidents from six states was collected during i...

  20. New software for 3D fracture network analysis and visualization

    NASA Astrophysics Data System (ADS)

    Song, J.; Noh, Y.; Choi, Y.; Um, J.; Hwang, S.

    2013-12-01

    This study presents new software to perform analysis and visualization of fracture network systems in 3D. The software modules for analysis and visualization, such as BOUNDARY, DISK3D, FNTWK3D, CSECT and BDM, were developed using Microsoft Visual Basic.NET and the Visualization Toolkit (VTK) open-source library. Two case studies revealed that each module plays a role in construction of the analysis domain, visualization of fracture geometry in 3D, calculation of equivalent pipes, production of cross-section maps and management of borehole data, respectively. The developed software for analysis and visualization of the 3D fractured rock mass can be used to tackle geomechanical problems related to the strength, deformability and hydraulic behavior of fractured rock masses.

  1. New software for statistical analysis of Cambridge Structural Database data

    PubMed Central

    Sykes, Richard A.; McCabe, Patrick; Allen, Frank H.; Battle, Gary M.; Bruno, Ian J.; Wood, Peter A.

    2011-01-01

    A collection of new software tools is presented for the analysis of geometrical, chemical and crystallographic data from the Cambridge Structural Database (CSD). This software supersedes the program Vista. The new functionality is integrated into the program Mercury in order to provide statistical, charting and plotting options alongside three-dimensional structural visualization and analysis. The integration also permits immediate access to other information about specific CSD entries through the Mercury framework, a common requirement in CSD data analyses. In addition, the new software includes a range of more advanced features focused towards structural analysis such as principal components analysis, cone-angle correction in hydrogen-bond analyses and the ability to deal with topological symmetry that may be exhibited in molecular search fragments. PMID:22477784

  2. Generalized Support Software: Domain Analysis and Implementation

    NASA Technical Reports Server (NTRS)

    Stark, Mike; Seidewitz, Ed

    1995-01-01

    For the past five years, the Flight Dynamics Division (FDD) at NASA's Goddard Space Flight Center has been carrying out a detailed domain analysis effort and is now beginning to implement Generalized Support Software (GSS) based on this analysis. GSS is part of the larger Flight Dynamics Distributed System (FDDS), and is designed to run under the FDDS User Interface / Executive (UIX). The FDD is transitioning from a mainframe-based environment to systems running on engineering workstations. The GSS will be a library of highly reusable components that may be configured within the standard FDDS architecture to quickly produce low-cost satellite ground support systems. The estimate for the first release is that this library will contain approximately 200,000 lines of code. The main driver for developing generalized software is development cost and schedule improvement. The goal is ultimately to have at least 80 percent of all software required for a spacecraft mission (within the domain supported by the GSS) configured from the generalized components.

  3. Pharmaceutical advertisements in prescribing software: an analysis.

    PubMed

    Harvey, Ken J; Vitry, Agnes I; Roughead, Elizabeth; Aroni, Rosalie; Ballenden, Nicola; Faggotter, Ralph

    2005-07-18

    To assess pharmaceutical advertisements in prescribing software, their adherence to code standards, and the opinions of general practitioners regarding the advertisements. Content analysis of advertisements displayed by Medical Director version 2.81 (Health Communication Network, Sydney, NSW) in early 2005; thematic analysis of a debate on this topic held on the General Practice Computer Group email forum (GPCG_talk) during December 2004. Placement, frequency and type of advertisements; their compliance with the Medicines Australia Code of Conduct, and the views of GPs. 24 clinical functions in Medical Director contained advertisements. These included 79 different advertisements for 41 prescription products marketed by 17 companies, including one generic manufacturer. 57 of 60 (95%) advertisements making a promotional claim appeared noncompliant with one or more requirements of the Code. 29 contributors, primarily GPs, posted 174 emails to GPCG_talk; there was little support for these advertisements, but some concern that the price of software would increase if they were removed. We suggest that pharmaceutical promotion in prescribing software should be banned, and inclusion of independent therapeutic information be mandated.

  4. Tobit analysis of vehicle accident rates on interstate highways.

    PubMed

    Anastasopoulos, Panagiotis Ch; Tarko, Andrew P; Mannering, Fred L

    2008-03-01

    There has been an abundance of research that has used Poisson models and their variants (negative binomial and zero-inflated models) to improve our understanding of the factors that affect accident frequencies on roadway segments. This study explores the application of an alternative method, tobit regression, by viewing vehicle accident rates directly (instead of frequencies) as a continuous variable that is left-censored at zero. Using data from vehicle accidents on Indiana interstates, the estimation results show that many factors relating to pavement condition, roadway geometrics and traffic characteristics significantly affect vehicle accident rates.
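
    In a tobit model the observed accident rate is treated as a latent normal variable left-censored at zero, so censored segments contribute a cumulative-probability term to the likelihood rather than a density term. The sketch below fits such a model by direct maximum likelihood on synthetic data; the covariates are placeholders, not the Indiana pavement, geometry and traffic variables.

```python
# Left-censored (at zero) tobit regression fitted by maximum likelihood.
# Data and covariates are synthetic, for illustration only.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
latent = X @ np.array([0.2, 0.5, -0.3]) + rng.normal(scale=0.8, size=n)
y = np.maximum(latent, 0.0)                        # observed rate, censored at zero

def neg_loglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    mu = X @ beta
    censored = y <= 0
    ll = np.where(
        censored,
        norm.logcdf(-mu / sigma),                  # P(latent <= 0) for censored segments
        norm.logpdf((y - mu) / sigma) - np.log(sigma),
    )
    return -ll.sum()

start = np.zeros(X.shape[1] + 1)
result = minimize(neg_loglik, start, method="BFGS")
print("beta =", result.x[:-1].round(3), "sigma =", np.exp(result.x[-1]).round(3))
```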

  5. Using recurrence plot analysis for software execution interpretation and fault detection

    NASA Astrophysics Data System (ADS)

    Mosdorf, M.

    2015-09-01

    This paper presents a method for software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. The results of this analysis are further processed with principal component analysis (PCA), which reduces the number of coefficients used for software execution classification. The method was applied to five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, and SHA-1. The results show that some of the collected traces could be easily assigned to particular algorithms (traces from the Bubble Sort and FIR algorithms), while others are more difficult to distinguish.
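
    A recurrence plot marks the pairs of points in a trace that are similar, and summary features of that plot can then be reduced with PCA. The sketch below is a minimal version of this processing chain on synthetic opcode-identifier traces; the trace encoding and the features are assumptions, not the paper's exact ones.

```python
# Recurrence plot of an execution trace followed by PCA on simple recurrence features.
# Traces are synthetic opcode identifiers, used purely for illustration.
import numpy as np
from sklearn.decomposition import PCA

def recurrence_matrix(trace, threshold=0):
    trace = np.asarray(trace)
    # Recurrence when two trace points are "close"; for opcode IDs this is equality
    return (np.abs(trace[:, None] - trace[None, :]) <= threshold).astype(np.uint8)

def recurrence_features(rp):
    n = rp.shape[0]
    recurrence_rate = rp.sum() / n ** 2
    # Density of a few short diagonals, a crude proxy for repeated subsequences
    diag_density = np.mean([rp.trace(offset=k) / (n - k) for k in range(1, 5)])
    return [recurrence_rate, diag_density]

rng = np.random.default_rng(3)
traces = [rng.integers(0, 16, size=200) for _ in range(6)]
features = np.array([recurrence_features(recurrence_matrix(t)) for t in traces])

pca = PCA(n_components=2).fit(features)
print(pca.transform(features).round(3))
```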

  6. Analysis of construction accidents in Turkey and responsible parties.

    PubMed

    Gürcanli, G Emre; Müngen, Uğur

    2013-01-01

    Construction is one of the world's biggest industries, including jobs as diverse as building, civil engineering, demolition, renovation, repair and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports which were submitted to criminal and labour courts. These reports are from all regions of the country and cover the period 1972-2008. Accidents were classified by the consequence of the incident, time and main causes of the accident, construction type, occupation of the victim, activity at time of the accident and party responsible for the accident. Falls (54.1%), struck by thrown/falling object (12.9%), structural collapses (9.9%) and electrocutions (7.5%) rank in the first four places. The accidents were most likely to occur between the hours of 15:00 and 17:00 (22.6%), 10:00-12:00 (18.7%) and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and what acts of negligence typically led to accidents. Nearly two thirds of the faulty and negligent acts were carried out by the employers, and employees were responsible for almost one third of all cases.

  7. Analysis of Construction Accidents in Turkey and Responsible Parties

    PubMed Central

    GÜRCANLI, G. Emre; MÜNGEN, Uğur

    2013-01-01

    Construction is one of the world’s biggest industries, including jobs as diverse as building, civil engineering, demolition, renovation, repair and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports which were submitted to criminal and labour courts. These reports are from all regions of the country and cover the period 1972–2008. Accidents were classified by the consequence of the incident, time and main causes of the accident, construction type, occupation of the victim, activity at time of the accident and party responsible for the accident. Falls (54.1%), struck by thrown/falling object (12.9%), structural collapses (9.9%) and electrocutions (7.5%) rank in the first four places. The accidents were most likely to occur between the hours of 15:00 and 17:00 (22.6%), 10:00–12:00 (18.7%) and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and what acts of negligence typically led to accidents. Nearly two thirds of the faulty and negligent acts were carried out by the employers, and employees were responsible for almost one third of all cases. PMID:24077446

  8. Analysis of occupational accidents: prevention through the use of additional technical safety measures for machinery

    PubMed Central

    Dźwiarek, Marek; Latała, Agata

    2016-01-01

    This article presents an analysis of results of 1035 serious and 341 minor accidents recorded by Poland's National Labour Inspectorate (PIP) in 2005–2011, in view of their prevention by means of additional safety measures applied by machinery users. Since the analysis aimed at formulating principles for the application of technical safety measures, the analysed accidents should bear additional attributes: the type of machine operation, technical safety measures and the type of events causing injuries. The analysis proved that the executed tasks and injury-causing events were closely connected and there was a relation between casualty events and technical safety measures. In the case of tasks consisting of manual feeding and collecting materials, the injuries usually occur because of the rotating motion of tools or crushing due to a closing motion. Numerous accidents also happened in the course of supporting actions, like removing pollutants, correcting material position, cleaning, etc. PMID:26652689

  9. The Comparison of VLBI Data Analysis Using Software Globl and Globk

    NASA Astrophysics Data System (ADS)

    Guangli, W.; Xiaoya, W.; Jinling, L.; Wenyao, Z.

    The comparison of different geodetic data analysis software packages is one of the most frequently discussed topics. In this paper we try to find out the differences between the software packages GLOBL and GLOBK when they are used to process the same set of VLBI data. GLOBL is a software package developed by the VLBI team, Geodesy Branch, GSFC/NASA, to process geodetic VLBI data using an arc-parameter-elimination algorithm, while GLOBK, which uses a Kalman filtering algorithm, is mainly used in GPS data analysis and is also applied to VLBI data analysis. Our work focuses on whether there are significant differences when the two software packages are used to analyze the same VLBI data set, and on investigating the reasons for any differences found.
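
    For readers unfamiliar with the GLOBK approach mentioned above, a Kalman filter updates a parameter estimate sequentially as each new observation arrives. The sketch below is a deliberately minimal scalar filter with a random-walk state model and made-up numbers; it only illustrates the filtering idea, not the GLOBK implementation.

```python
# Minimal scalar Kalman filter with a random-walk state model.
# Observation values and variances below are invented for illustration.
import numpy as np

def kalman_1d(observations, obs_var, process_var, x0=0.0, p0=1.0):
    x, p = x0, p0
    estimates = []
    for z in observations:
        p = p + process_var                 # predict: state wanders as a random walk
        k = p / (p + obs_var)               # Kalman gain
        x = x + k * (z - x)                 # update with the new observation
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

obs = np.array([10.2, 10.5, 9.9, 10.1, 10.4])   # e.g., a short series of daily estimates
print(kalman_1d(obs, obs_var=0.09, process_var=0.01).round(3))
```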

  10. Indonesian railway accidents--utilizing Human Factors Analysis and Classification System in determining potential contributing factors.

    PubMed

    Iridiastadi, Hardianto; Ikatrinasari, Zulfa Fitri

    2012-01-01

    The prevalence of Indonesian railway accidents has not been declining, with hundreds of fatalities reported in the past decade. As an effort to help the National Transportation Safety Committee (NTSC), this study was conducted to understand factors that might have contributed to the accidents. The Human Factors Analysis and Classification System (HFACS) was utilized for this purpose. A total of nine accident reports (provided by the Indonesian NTSC) involving fatalities were studied using the technique. Results of this study indicated 72 factors that were closely related to the accidents. Of these, roughly 22% were considered operator acts, while about 39% were related to preconditions for operator acts. Supervisory factors represented 14% of the factors, and the remaining (about 25%) were associated with organizational factors. It was concluded that, while train drivers indeed played an important role in the accidents, interventions solely directed toward train drivers may not be adequate. A more comprehensive approach that addresses all four levels of HFACS is needed to minimize these accidents.

  11. Orbiter subsystem hardware/software interaction analysis. Volume 8: AFT reaction control system, part 2

    NASA Technical Reports Server (NTRS)

    Becker, D. D.

    1980-01-01

    The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the forward reaction control system are presented.

  12. Analysis of Two Electrocution Accidents in Greece that Occurred due to Unexpected Re-energization of Power Lines.

    PubMed

    Baka, Aikaterini D; Uzunoglu, Nikolaos K

    2014-09-01

    Investigation and analysis of accidents are critical elements of safety management. The over-riding purpose of an organization in carrying out an accident investigation is to prevent similar accidents, as well as seek a general improvement in the management of health and safety. Hundreds of workers have suffered injuries while installing, maintaining, or servicing machinery and equipment due to sudden re-energization of power lines. This study presents and analyzes two electrical accidents (1 fatal injury and 1 serious injury) that occurred because the power supply was reconnected inadvertently or by mistake.

  13. Analysis of Two Electrocution Accidents in Greece that Occurred due to Unexpected Re-energization of Power Lines

    PubMed Central

    Baka, Aikaterini D.; Uzunoglu, Nikolaos K.

    2014-01-01

    Investigation and analysis of accidents are critical elements of safety management. The over-riding purpose of an organization in carrying out an accident investigation is to prevent similar accidents, as well as seek a general improvement in the management of health and safety. Hundreds of workers have suffered injuries while installing, maintaining, or servicing machinery and equipment due to sudden re-energization of power lines. This study presents and analyzes two electrical accidents (1 fatal injury and 1 serious injury) that occurred because the power supply was reconnected inadvertently or by mistake. PMID:25379331

  14. Development problem analysis of correlation leak detector’s software

    NASA Astrophysics Data System (ADS)

    Faerman, V. A.; Avramchuk, V. S.; Marukyan, V. M.

    2018-05-01

    The article examines the practical application and structure of the software used in correlation leak detectors and analyzes the task of designing it. The first part of the paper shows why developing correlation leak detectors is worthwhile for improving the operating efficiency of public utilities. The functional structure of correlation leak detectors is analyzed and the tasks of their software are defined. The second part examines several steps in the development of the software package – requirements gathering, definition of the program structure and creation of the software concept – in the context of experience gained with a hardware-software prototype of a correlation leak detector.

  15. Public Response to a Near-Miss Nuclear Accident Scenario Varying in Causal Attributions and Outcome Uncertainty.

    PubMed

    Cui, Jinshu; Rosoff, Heather; John, Richard S

    2018-05-01

    Many studies have investigated public reactions to nuclear accidents. However, few studies have focused on more common events when a serious accident could have happened but did not. This study evaluated public response (emotional, cognitive, and behavioral) over three phases of a near-miss nuclear accident. Simulating a loss-of-coolant accident (LOCA) scenario, we manipulated (1) attribution for the initial cause of the incident (software failure vs. cyber terrorist attack vs. earthquake), (2) attribution for halting the incident (fail-safe system design vs. an intervention by an individual expert vs. a chance coincidence), and (3) level of uncertainty (certain vs. uncertain) about risk of a future radiation leak after the LOCA is halted. A total of 773 respondents were sampled using a 3 × 3 × 2 between-subjects design. Results from both MANCOVA and structural equation modeling (SEM) indicate that respondents experienced more negative affect, perceived more risk, and expressed more avoidance behavioral intention when the near-miss event was initiated by an externally attributed source (e.g., earthquake) compared to an internally attributed source (e.g., software failure). Similarly, respondents also indicated greater negative affect, perceived risk, and avoidance behavioral intentions when the future impact of the near-miss incident on people and the environment remained uncertain. Results from SEM analyses also suggested that negative affect predicted risk perception, and both predicted avoidance behavior. Affect, risk perception, and avoidance behavior demonstrated high stability (i.e., reliability) from one phase to the next. © 2017 Society for Risk Analysis.

  16. Human error analysis of commercial aviation accidents using the human factors analysis and classification system (HFACS)

    DOT National Transportation Integrated Search

    2001-02-01

    The Human Factors Analysis and Classification System (HFACS) is a general human error framework : originally developed and tested within the U.S. military as a tool for investigating and analyzing the human : causes of aviation accidents. Based upon ...

  17. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  18. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
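
    The "identification tree" idea above can be sketched in a few lines. The toy below walks a hypothetical data-flow diagram and fires simple threat rules; it is not the AutSEC implementation, and every element name, attribute and rule is invented for illustration.

```python
# Hypothetical sketch of rule-based threat identification over a data-flow
# diagram (DFD). Not the AutSEC code; all names and rules are invented.

dfd = [
    {"name": "web_form",   "type": "external_entity", "validates_input": False},
    {"name": "auth_flow",  "type": "data_flow",       "encrypted": False},
    {"name": "user_store", "type": "data_store",      "access_control": True},
]

# A minimal "identification tree": per element type, predicates that flag a
# threat and suggest a mitigation when they match.
identification_tree = {
    "data_flow": [
        (lambda e: not e.get("encrypted", True),
         ("information disclosure", "apply transport encryption")),
    ],
    "external_entity": [
        (lambda e: not e.get("validates_input", True),
         ("tampering via malformed input", "add input validation")),
    ],
}

def identify_threats(elements, tree):
    """Return (element, threat, mitigation) triples for every rule that fires."""
    findings = []
    for element in elements:
        for predicate, (threat, mitigation) in tree.get(element["type"], []):
            if predicate(element):
                findings.append((element["name"], threat, mitigation))
    return findings

for name, threat, mitigation in identify_threats(dfd, identification_tree):
    print(f"{name}: {threat} -> suggested mitigation: {mitigation}")
```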

  19. Underreporting of maritime accidents to vessel accident databases.

    PubMed

    Hassel, Martin; Asbjørnslett, Bjørn Egil; Hole, Lars Petter

    2011-11-01

    Underreporting of maritime accidents is a problem not only for authorities trying to improve maritime safety through legislation, but also for risk management companies and other entities using maritime casualty statistics in risk and accident analysis. This study collected and compared casualty data from 01.01.2005 to 31.12.2009 from IHS Fairplay and the maritime authorities of a set of nations. The data were compared to find common records, and the true number of accidents that occurred was estimated using conditional probability given positive dependency between data sources, several variations of the capture-recapture method, calculation of a best-case scenario assuming perfect reporting, and scaling up a subset of casualty information from a marine insurance statistics database. The estimated upper-limit reporting performance for the selected flag states ranged from 14% to 74%, while the corresponding estimated coverage of IHS Fairplay ranged from 4% to 62%. On average, the study results document that unreported accidents make up roughly 50% of all accidents that occur. Even in a best-case scenario, only a few flag states come close to perfect reporting (94%). The considerable scope of underreporting uncovered in the study indicates that users of statistical vessel accident data should assume a certain degree of underreporting and adjust their analyses accordingly. Whether to use correction factors, a safety margin, or expert judgment should be decided on a case-by-case basis. Copyright © 2011 Elsevier Ltd. All rights reserved.
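
    The capture-recapture estimation mentioned above can be illustrated with a minimal two-source (Chapman-corrected Lincoln-Petersen) estimate. The counts below are invented, and the sketch ignores the dependency corrections applied in the study.

```python
# Two-source capture-recapture sketch (Chapman-corrected Lincoln-Petersen).
# Counts are invented for illustration; they are not the study's data.

def chapman_estimate(n_a, n_b, n_both):
    """Estimate the total number of accidents from two overlapping registers."""
    return ((n_a + 1) * (n_b + 1)) / (n_both + 1) - 1

n_flag_state = 120   # accidents reported to a flag-state authority
n_database = 90      # accidents in a commercial casualty database
n_both = 40          # accidents appearing in both sources

total = chapman_estimate(n_flag_state, n_database, n_both)
print(f"Estimated total accidents: {total:.0f}")
print(f"Flag-state coverage: {n_flag_state / total:.0%}, "
      f"database coverage: {n_database / total:.0%}")
```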

  20. Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.

    PubMed

    Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko

    2017-11-01

    To develop analysis software for cultured human corneal endothelial cells (HCECs). Software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinctly outlined after phosphate-buffered saline treatment and were recognized by cell analysis software. The cell density value provided by software was similar to that obtained using manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables subjective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.
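
    The morphometric parameters named above (cell density, coefficient of variation, polygonality) reduce to simple statistics once cells are segmented. The toy calculation below assumes per-cell areas and side counts are already available and reads "polygonality" as the share of six-sided cells; it is not the published software.

```python
# Toy morphometry from already-segmented corneal endothelial cells.
# Input values are invented; border recognition itself is not shown.
import statistics

cell_areas_um2 = [310.0, 295.5, 402.3, 350.1, 288.9, 333.7]   # per-cell areas (um^2)
cell_sides = [6, 6, 5, 6, 7, 6]                               # sides per cell

mean_area = statistics.mean(cell_areas_um2)
cell_density_per_mm2 = 1e6 / mean_area                        # cells per mm^2
cv_area = statistics.stdev(cell_areas_um2) / mean_area        # coefficient of variation
hexagonality = cell_sides.count(6) / len(cell_sides)          # share of 6-sided cells

print(f"Cell density: {cell_density_per_mm2:.0f} cells/mm^2")
print(f"Coefficient of variation: {cv_area:.2f}")
print(f"Polygonality (share of hexagonal cells): {hexagonality:.0%}")
```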

  1. Preliminary Modeling of Accident Tolerant Fuel Concepts under Accident Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamble, Kyle A.; Hales, Jason D.

    2016-12-01

    The catastrophic events that occurred at the Fukushima-Daiichi nuclear power plant in 2011 have led to widespread interest in research of alternative fuels and claddings that are proposed to be accident tolerant. Thus, the United States Department of Energy through its NEAMS (Nuclear Energy Advanced Modeling and Simulation) program has funded an Accident Tolerant Fuel (ATF) High Impact Problem (HIP). The ATF HIP is funded for a three-year period. The purpose of the HIP is to perform research into two potential accident tolerant concepts and provide an in-depth report to the Advanced Fuels Campaign (AFC) describing the behavior of the concepts, both of which are being considered for inclusion in a lead test assembly scheduled for placement into a commercial reactor in 2022. The initial focus of the HIP is on uranium silicide fuel and iron-chromium-aluminum (FeCrAl) alloy cladding. Utilizing the expertise of three national laboratory participants (INL, LANL, and ANL), a comprehensive multiscale approach to modeling is being used, including atomistic modeling, molecular dynamics, rate theory, phase-field, and fuel performance simulations. In this paper, we present simulations of two proposed accident tolerant fuel systems: U3Si2 fuel with Zircaloy-4 cladding, and UO2 fuel with FeCrAl cladding. The simulations investigate the fuel performance response of the proposed ATF systems under Loss of Coolant and Station Blackout conditions using the BISON code. Sensitivity analyses are completed using Sandia National Laboratories' DAKOTA software to determine which input parameters (e.g., fuel specific heat) have the greatest influence on the output metrics of interest (e.g., fuel centerline temperature). Early results indicate that each concept has significant advantages as well as areas of concern. Further work is required prior to formulating the proposition report for the Advanced Fuels Campaign.
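
    The DAKOTA-driven sensitivity study is only summarized above. A generic stand-in for that kind of workflow is sketched below: uncertain inputs are sampled, pushed through a made-up surrogate for fuel centerline temperature, and ranked by correlation. The response function, parameter ranges and units are all invented.

```python
# Generic sampling-based sensitivity sketch (a stand-in for a DAKOTA study,
# not the BISON/DAKOTA workflow). Model and ranges are invented.
import random
import statistics

def surrogate_centerline_temp(specific_heat, conductivity, linear_power):
    """Invented response: fuel centerline temperature (K) from three inputs."""
    return 600.0 + 40.0 * linear_power / conductivity - 0.05 * specific_heat

def pearson(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
samples = []
for _ in range(500):
    cp = random.uniform(230.0, 330.0)    # fuel specific heat, J/(kg K)
    k = random.uniform(3.0, 6.0)         # thermal conductivity, W/(m K)
    q = random.uniform(15.0, 25.0)       # linear power, kW/m
    samples.append((cp, k, q, surrogate_centerline_temp(cp, k, q)))

temps = [s[3] for s in samples]
for idx, label in enumerate(["specific heat", "conductivity", "linear power"]):
    r = pearson([s[idx] for s in samples], temps)
    print(f"{label}: correlation with centerline temperature = {r:+.2f}")
```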

  2. Failure-Modes-And-Effects Analysis Of Software Logic

    NASA Technical Reports Server (NTRS)

    Garcia, Danny; Hartline, Thomas; Minor, Terry; Statum, David; Vice, David

    1996-01-01

    Rigorous analysis applied early in design effort. Method of identifying potential inadequacies and modes and effects of failures caused by inadequacies (failure-modes-and-effects analysis or "FMEA" for short) devised for application to software logic.

  3. RELAP5 Application to Accident Analysis of the NIST Research Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baek, J.; Cuadra Gascon, A.; Cheng, L.Y.

    Detailed safety analyses have been performed for the 20 MW D2O moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The time-dependent analysis of the primary system is determined with a RELAP5 transient analysis model that includes the reactor vessel, the pump, heat exchanger, fuel element geometry, and flow channels for both the six inner and twenty-four outer fuel elements. A post-processing of the simulation results has been conducted to evaluate minimum critical heat flux ratio (CHFR) using the Sudo-Kaminaga correlation. Evaluations are performed for the following accidents: (1) the control rod withdrawal startup accident and (2) the maximum reactivity insertion accident. In both cases the RELAP5 results indicate that there is adequate margin to CHF and no damage to the fuel will occur because of sufficient coolant flow through the fuel channels and the negative scram reactivity insertion.

  4. Development of Automated Image Analysis Software for Suspended Marine Particle Classification

    DTIC Science & Technology

    2002-09-30

    Development of Automated Image Analysis Software for Suspended Marine Particle Classification. Scott Samson, Center for Ocean Technology. ...and global water column. OBJECTIVES: The project's objective is to develop automated image analysis software to reduce the effort and time

  5. Analysis-Software for Hyperspectral Algal Reflectance Probes v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timlin, Jerilyn A.; Reichardt, Thomas A.; Jenson, Travis J.

    This software provides onsite analysis of the hyperspectral reflectance data acquired on an outdoor algal pond by a multichannel, fiber-coupled spectroradiometer. The analysis algorithm is based on numerical inversion of a reflectance model, in which the above-water reflectance is expressed as a function of the single backscattering albedo, which is dependent on the backscatter and absorption coefficients of the algal culture, which are in turn related to the algal biomass and pigment optical activity, respectively. Prior to the development of this software, while raw multichannel data were displayed in real time, analysis required a post-processing procedure to extract the relevant parameters. This software provides the capability to track the temporal variation of such culture parameters in real time, as raw data are being acquired, or can be run in a post-processing mode. The software allows the user to select between different algal species, incorporate the appropriate calibration data, and observe the quality of the resulting model inversions.
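
    The numerical inversion described above can be illustrated with a one-dimensional root find. The quadratic forward model and its coefficients below are placeholders standing in for the actual reflectance model, and the measured value is invented.

```python
# Minimal inversion sketch: recover the single backscattering albedo from an
# above-water reflectance value. Forward model and constants are placeholders.
from scipy.optimize import brentq

G1, G2 = 0.0949, 0.0794            # example quadratic reflectance coefficients

def reflectance(omega):
    """Forward model: reflectance as a function of backscattering albedo."""
    return G1 * omega + G2 * omega ** 2

def invert_reflectance(r_obs):
    """Numerically invert the forward model for omega in [0, 1]."""
    return brentq(lambda w: reflectance(w) - r_obs, 0.0, 1.0)

r_measured = 0.025                 # hypothetical above-water reflectance
print(f"Inverted backscattering albedo: {invert_reflectance(r_measured):.3f}")
```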

  6. Exploratory reconstructability analysis of accident TBI data

    NASA Astrophysics Data System (ADS)

    Zwick, Martin; Carney, Nancy; Nettleton, Rosemary

    2018-02-01

    This paper describes the use of reconstructability analysis to perform a secondary study of traumatic brain injury data from automobile accidents. Neutral searches were done and their results displayed with a hypergraph. Directed searches, using both variable-based and state-based models, were applied to predict performance on two cognitive tests and one neurological test. Very simple state-based models gave large uncertainty reductions for all three DVs and sizeable improvements in percent correct for the two cognitive test DVs which were equally sampled. Conditional probability distributions for these models are easily visualized with simple decision trees. Confounding variables and counter-intuitive findings are also reported.
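
    The "uncertainty reduction" reported above is, in reconstructability analysis, an information-theoretic quantity. The toy below computes the Shannon-entropy reduction in a binary DV given one binary IV, using invented counts rather than the TBI data.

```python
# Toy uncertainty (Shannon entropy) reduction of a DV given one IV, in the
# spirit of reconstructability analysis. Observations are invented.
from collections import Counter
from math import log2

# (IV, DV) pairs, e.g. (injury_severity_band, cognitive_test_pass)
data = [("low", 1), ("low", 1), ("low", 0), ("high", 0),
        ("high", 0), ("high", 1), ("low", 1), ("high", 0)]

def entropy(labels):
    counts = Counter(labels)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

h_dv = entropy([dv for _, dv in data])

# Conditional entropy H(DV | IV): entropy within each IV level, weighted.
h_cond = 0.0
for level in {iv for iv, _ in data}:
    subset = [dv for iv, dv in data if iv == level]
    h_cond += (len(subset) / len(data)) * entropy(subset)

print(f"H(DV) = {h_dv:.3f} bits, H(DV|IV) = {h_cond:.3f} bits")
print(f"Relative uncertainty reduction: {(h_dv - h_cond) / h_dv:.1%}")
```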

  7. A methodology for accident analysis of fusion breeder blankets and its application to helium-cooled lead–lithium blanket

    DOE PAGES

    Panayotov, Dobromir; Poitevin, Yves; Grief, Andrew; ...

    2016-09-23

    'Fusion for Energy' (F4E) is designing, developing, and implementing the European Helium-Cooled Lead-Lithium (HCLL) and Helium-Cooled Pebble-Bed (HCPB) Test Blanket Systems (TBSs) for ITER (Nuclear Facility INB-174). Safety demonstration is an essential element for the integration of these TBSs into ITER, and accident analysis is one of its critical components. A systematic approach to accident analysis has been developed under the F4E contract on TBS safety analyses. F4E technical requirements, together with Amec Foster Wheeler and INL efforts, have resulted in a comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena while remaining consistent with the approach already applied to ITER accident analyses. Furthermore, the methodology phases are illustrated in the paper by its application to the EU HCLL TBS using both the MELCOR and RELAP5 codes.

  8. Risk-based Analysis of Construction Accidents in Iran During 2007-2011-Meta Analyze Study

    PubMed Central

    AMIRI, Mehran; ARDESHIR, Abdollah; FAZEL ZARANDI, Mohammad Hossein

    2014-01-01

    Background: The present study aimed to investigate the characteristics of occupational accidents and the frequency and severity of work-related accidents in the construction industry among Iranian insured workers during the years 2007-2011. Methods: The Iranian Social Security Organization (ISSO) accident database containing 21,864 cases between the years 2007-2011 was used in this study. In the next step, the Total Accident Rate (TAR), Total Severity Index (TSI), and Risk Factor (RF) were defined. The core of this work is devoted to analyzing the data from different perspectives such as age of workers, occupation and construction phase, day of the week, time of the day, seasonal analysis, regional considerations, type of accident, and body parts affected. Results: Workers between 15-19 years old (TAR=13.4%) are almost six times more exposed to risk of accident than the average of all ages (TAR=2.51%). Laborers and structural workers (TAR=66.6%) and those working at heights (TAR=47.2%) experience more accidents than other groups of workers. Moreover, older workers over 65 years old (TSI=1.97% > average TSI=1.60%), work supervisors (TSI=12.20% > average TSI=9.09%), and night shift workers (TSI=1.89% > average TSI=1.47%) are more prone to severe accidents. Conclusion: It is recommended that laborers, young workers, and weekend and night shift workers be supervised more carefully in the workplace. Use of Personal Protective Equipment (PPE) should be compulsory in working environments, and special attention should be paid to people working outdoors and at heights. It is also suggested that policymakers pay more attention to the improvement of safety conditions in deprived and cold western regions. PMID:26005662

  9. Risk-based Analysis of Construction Accidents in Iran During 2007-2011-Meta Analyze Study.

    PubMed

    Amiri, Mehran; Ardeshir, Abdollah; Fazel Zarandi, Mohammad Hossein

    2014-04-01

    The present study aimed to investigate the characteristics of occupational accidents and the frequency and severity of work-related accidents in the construction industry among Iranian insured workers during the years 2007-2011. The Iranian Social Security Organization (ISSO) accident database containing 21,864 cases between the years 2007-2011 was used in this study. In the next step, the Total Accident Rate (TAR), Total Severity Index (TSI), and Risk Factor (RF) were defined. The core of this work is devoted to analyzing the data from different perspectives such as age of workers, occupation and construction phase, day of the week, time of the day, seasonal analysis, regional considerations, type of accident, and body parts affected. Workers between 15-19 years old (TAR=13.4%) are almost six times more exposed to risk of accident than the average of all ages (TAR=2.51%). Laborers and structural workers (TAR=66.6%) and those working at heights (TAR=47.2%) experience more accidents than other groups of workers. Moreover, older workers over 65 years old (TSI=1.97% > average TSI=1.60%), work supervisors (TSI=12.20% > average TSI=9.09%), and night shift workers (TSI=1.89% > average TSI=1.47%) are more prone to severe accidents. It is recommended that laborers, young workers, and weekend and night shift workers be supervised more carefully in the workplace. Use of Personal Protective Equipment (PPE) should be compulsory in working environments, and special attention should be paid to people working outdoors and at heights. It is also suggested that policymakers pay more attention to the improvement of safety conditions in deprived and cold western regions.
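
    The abstract does not spell out the formulas behind TAR and TSI. One plausible reading (accidents per 100 insured workers in a group, and severe outcomes per 100 accidents) is tabulated below purely to show how such group-wise indices are produced; both the definitions and the counts are assumptions, not the paper's data.

```python
# Hypothetical tabulation of group-wise accident indices in the spirit of TAR
# and TSI. Both the formulas and the counts are illustrative assumptions.

groups = {
    # group: (accidents, insured_workers, severe_outcomes)
    "15-19 years": (268, 2000, 12),
    "25-34 years": (5030, 200000, 310),
    "65+ years":   (150, 9000, 30),
}

for name, (accidents, workers, severe) in groups.items():
    tar = 100.0 * accidents / workers   # assumed: accidents per 100 insured workers
    tsi = 100.0 * severe / accidents    # assumed: severe outcomes per 100 accidents
    print(f"{name}: TAR = {tar:.2f}, TSI = {tsi:.2f}")
```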

  10. Uncertainty analysis of accident notification time and emergency medical service response time in work zone traffic accidents.

    PubMed

    Meng, Qiang; Weng, Jinxian

    2013-01-01

    Taking into account the uncertainty caused by exogenous factors, the accident notification time (ANT) and emergency medical service (EMS) response time were modeled as 2 random variables following the lognormal distribution. Their mean values and standard deviations were respectively formulated as functions of environmental variables including crash time, road type, weekend, holiday, light condition, weather, and work zone type. Work zone traffic accident data from the Fatality Analysis Reporting System between 2002 and 2009 were utilized to determine the distributions of the ANT and the EMS arrival time in the United States. A mixed logistic regression model, taking into account the uncertainty associated with the ANT and the EMS response time, was developed to estimate the risk of death. The results showed that the uncertainty of the ANT was primarily influenced by crash time and road type, whereas the uncertainty of the EMS response time was greatly affected by road type, weather, and light conditions. In addition, work zone accidents occurring during a holiday and in poor light conditions were found to be statistically associated with a longer mean ANT and a longer EMS response time. The results also show that shortening the ANT was a more effective approach to reducing the risk of death than shortening the EMS response time in work zones. To shorten the ANT and the EMS response time, work zone activities are suggested to be undertaken during non-holidays, during the daytime, and in good weather and light conditions.
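
    The lognormal assumption above is easy to make concrete. The sketch below fits a lognormal to a handful of hypothetical accident notification times and reports the implied mean and median; the covariate structure (crash time, road type, and so on) is omitted.

```python
# Fit a lognormal distribution to hypothetical accident notification times
# (ANT), in minutes. The data are invented; no covariates are modeled.
import math
import statistics

ant_minutes = [3.0, 5.5, 2.0, 10.0, 7.5, 4.0, 6.0, 12.0, 3.5, 8.0]

logs = [math.log(t) for t in ant_minutes]
mu = statistics.mean(logs)                  # location parameter of log-times
sigma = statistics.stdev(logs)              # scale parameter of log-times

mean_ant = math.exp(mu + sigma ** 2 / 2.0)  # lognormal mean
median_ant = math.exp(mu)                   # lognormal median

print(f"mu = {mu:.2f}, sigma = {sigma:.2f}")
print(f"Implied mean ANT = {mean_ant:.1f} min, median = {median_ant:.1f} min")
```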

  11. Causes of accidents in terrain parks: an exploratory factor analysis of recreational freestylers' views.

    PubMed

    Carús, Luis

    2014-03-01

    This study examines ski and snowboard terrain park users' views on aspects associated with accidents by identifying and assessing variables that may influence the occurrence of accidents and the resulting injuries. The research was conducted in a major resort in the Spanish Pyrenees, using information gathered from freestyle skiers and snowboarders aged 6 or older. To identify interrelationships among variables and to group the variables belonging to unified concepts, an exploratory factor analysis was performed using varimax rotation. The results revealed 5 factors that grouped the measured variables that may influence the occurrence of accidents while freestyling in terrain parks. The park features, conditions of the activity, and the user's personal conditions were found to have the most substantial influence on the freestylers' perceptions. Variables identified as components of the main factors of accident risk in terrain parks should be incorporated into resort management communication and policies. © 2013 Wilderness Medical Society Published by Wilderness Medical Society All rights reserved.

  12. Causative factors and countermeasures for rural and suburban pedestrian accidents : accident data collection and analysis--appendices

    DOT National Transportation Integrated Search

    1977-06-01

    The objectives of the study were to collect and analyze data on rural pedestrian accidents and to identify potential countermeasures. Data on a stratified random sample of over 1,500 rural and suburban accidents from six states was collected during i...

  13. The software analysis project for the Office of Human Resources

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow, with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  14. Analysis of National Major Work Safety Accidents in China, 2003–2012

    PubMed Central

    YE, Yunfeng; ZHANG, Siheng; RAO, Jiaming; WANG, Haiqing; LI, Yang; WANG, Shengyong; DONG, Xiaomei

    2016-01-01

    Background: This study provides a national profile of major work safety accidents in China, which cause more than 10 fatalities per accident, and is intended to provide a scientific basis for prevention measures and strategies to reduce major work safety accidents and deaths. Methods: Data from the 2003–2012 census of major work safety accidents were collected from the State Administration of Work Safety System (SAWS). Published literature and statistical yearbooks were also included to supplement the information. We analyzed the frequency of accidents and deaths, trend, geographic distribution and injury types. Additionally, we discussed the severity and urgency of emergency rescue by type of accident. Results: A total of 877 major work safety accidents were reported, resulting in 16,795 deaths and 9,183 injuries. The numbers of accidents and deaths, the mortality rate and the incidence of major accidents have declined in recent years. The mortality rate and incidence were 0.71 and 1.20 per 10^6 population in 2012, respectively. Transportation and mining contributed the highest numbers of major accidents and deaths. Major aviation and railway accidents caused more casualties per incident, while collapse, machinery, electrical shock and tailing dam accidents were the most severe situations, resulting in a larger proportion of deaths. Conclusion: Ten years of major work safety accident data indicate that the frequency of accidents and the number of deaths have declined, yet several safety concerns persist in some segments. PMID:27057515

  15. Video analysis of the biomechanics of a bicycle accident resulting in significant facial fractures.

    PubMed

    Syed, Shameer H; Willing, Ryan; Jenkyn, Thomas R; Yazdani, Arjang

    2013-11-01

    This study aimed to use video analysis techniques to determine the velocity, impact force, angle of impact, and impulse to fracture involved in a video-recorded bicycle accident resulting in facial fractures. Computed tomographic images of the resulting facial injury are presented for correlation with the data and calculations. To our knowledge, such an analysis of an actual recorded trauma has not been reported in the literature. A video recording of the accident was split into frames and analyzed using an image editing program. Measurements of velocity and angle of impact were obtained from this analysis, and the force of impact and impulse were calculated using the inverse dynamic method with connected rigid body segments. These results were then correlated with the actual fracture pattern found on computed tomographic imaging of the subject's face. There was an impact velocity of 6.25 m/s, impact angles of 14 and 6.3 degrees of neck extension and axial rotation, respectively, an impact force of 1910.4 N, and an impulse to fracture of 47.8 Ns. These physical parameters resulted in clinically significant bilateral mid-facial Le Fort II and III pattern fractures. These data confer further understanding of the biomechanics of bicycle-related accidents by correlating an actual clinical outcome with the kinematic and dynamic parameters involved in the accident itself, yielding concrete evidence of the velocity, force, and impulse necessary to cause clinically significant facial trauma. These findings can aid in the design of protective equipment for bicycle riders to help avoid this type of injury.
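
    The reported impact values can be cross-checked with elementary impulse-momentum relations. The arithmetic below uses only the three published figures; the derived contact time and effective mass are rough inferences, not measurements from the study.

```python
# Consistency check: impulse = force * contact_time and impulse = mass * delta_v.
# Only the three reported figures are inputs; the rest is derived.

impact_velocity = 6.25     # m/s (reported)
impact_force = 1910.4      # N   (reported)
impulse = 47.8             # N*s (reported)

contact_time = impulse / impact_force       # s, assuming a roughly constant force
effective_mass = impulse / impact_velocity  # kg of effectively decelerated mass

print(f"Implied contact time: {contact_time * 1000:.0f} ms")
print(f"Implied effective mass: {effective_mass:.1f} kg")
```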

  16. Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

    2014-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15 year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.

  17. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    ERIC Educational Resources Information Center

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  18. Tank car accident data analysis

    DOT National Transportation Integrated Search

    1991-06-01

    This report presents the results of a study of accidents involving railroad tank cars. The study is part of an overall effort to provide improved safety of rail transportation at reduced life-cycle costs. A major goal of the study is to provide a tec...

  19. Meta-analysis of the effect of road safety campaigns on accidents.

    PubMed

    Phillips, Ross Owen; Ulleberg, Pål; Vaa, Truls

    2011-05-01

    A meta-analysis of 67 studies evaluating the effect of road safety campaigns on accidents is reported. A total of 119 results were extracted from the studies, which were reported in 12 different countries between 1975 and 2007. After allowing for publication bias and heterogeneity of effects, the weighted average effect of road safety campaigns is a 9% reduction in accidents (with 95% confidence that the weighted average is between -12 and -6%). To account for the variability of effects measured across studies, data were collected to characterise aspects of the campaign and evaluation design associated with each effect, and analysed to identify a model of seven campaign factors for testing by meta-regression. The model was tested using both fixed and random effect meta-regression, and dependency among effects was accounted for by aggregation. These analyses suggest positive associations between accident reduction and the use of personal communication or roadside media as part of a campaign delivery strategy. Campaigns with a drink-driving theme were also associated with greater accident reductions, while some of the analyses suggested that accompanying enforcement and short campaign duration (less than one month) are beneficial. Overall the results are consistent with the idea that campaigns can be more effective in the short term if the message is delivered with personal communication in a way that is proximal in space and time to the behaviour targeted by the campaign. Copyright © 2011 Elsevier Ltd. All rights reserved.
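
    The pooling step behind the "weighted average effect" above can be reduced to inverse-variance weighting of log effect estimates. The four studies below are invented, and the publication-bias and heterogeneity adjustments used in the meta-analysis are not reproduced.

```python
# Fixed-effect inverse-variance pooling of hypothetical campaign effects,
# expressed as log accident rate ratios (negative = fewer accidents).
import math

studies = [
    # (log rate ratio, standard error)
    (-0.12, 0.05),
    (-0.05, 0.08),
    (-0.20, 0.10),
    ( 0.02, 0.06),
]

weights = [1.0 / se ** 2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5
low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

print(f"Pooled effect: {100 * (math.exp(pooled) - 1):+.1f}% accidents "
      f"(95% CI {100 * (math.exp(low) - 1):+.1f}% to {100 * (math.exp(high) - 1):+.1f}%)")
```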

  20. Comparison Campaign of VLBI Data Analysis Software - First Results

    NASA Technical Reports Server (NTRS)

    Plank, Lucia; Bohm, Johannes; Schuh, Harald

    2010-01-01

    During the development of the Vienna VLBI Software VieVS at the Institute of Geodesy and Geophysics at Vienna University of Technology, a special comparison setup was developed with the goal of easily finding links between deviations of results achieved with different software packages and certain parameters of the observation. The object of comparison is the computed time delay, a value calculated for each observation including all relevant models and corrections that need to be applied in geodetic VLBI analysis. Besides investigating the effects of the various models on the total delay, results of comparisons between VieVS and Occam 6.1 are shown. Using the same methods, a Comparison Campaign of VLBI data analysis software called DeDeCC is about to be launched within the IVS soon.

  1. Assessing accident phobia in mild traumatic brain injury: The Accident Fear Questionnaire.

    PubMed

    Sutherland, Jessica; Middleton, Jason; Ornstein, Tisha J; Lawson, Kerry; Vickers, Kristin

    2016-08-01

    Despite a documented prevalence of accident phobia in almost 40% of motor vehicle accident (MVA) survivors, the onset of accident phobia after traumatic brain injury (TBI) remains poorly understood. There is currently a body of knowledge about posttraumatic stress disorder (PTSD) in patients with TBI, but less is known about accident phobia following TBI, particularly in cases of mild TBI (mTBI). Accident phobia can impede safe return to driving or motor vehicle travel, inhibiting return to daily functioning. In addition, pain complaints have been found to correlate positively with postinjury anxiety disorders. The present study sought to determine the reliability and validity of the Accident Fear Questionnaire (AFQ), a measure used to assess accident phobia, in 72 patients with mTBI using secondary data analysis and the subsequent development of accident phobia postinjury. Furthermore, we sought to examine the impact of pain, anxiety, and depression complaints on the AFQ. Results reveal convergent validity and reliability in mTBI populations. Additionally, pain, anxiety, and depression measures were significantly correlated with scores on the AFQ. Psychometrically, the phobia avoidance subscale of the AFQ is a reliable measure for use with mTBI populations, although some limitations were found. In particular, the accident profile (AP) subscale was not found to be reliable or valid and could be eliminated from the AFQ. Collectively, the present study contributes to the small body of published literature evaluating accident phobia in patients with mTBI and the impact of pain on the development of postinjury anxiety disorders. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  2. Profile of an accident flying squad.

    PubMed

    Little, K

    1972-09-30

    An analysis of 184 accident flying squad calls and of 280 patients injured in road accidents and treated by a flying squad based at an accident department, covering 1967 to 1971 inclusive, has shown that such a service can provide an efficient system without disrupting the routine work of the hospital.

  3. Software design for analysis of multichannel intracardial and body surface electrocardiograms.

    PubMed

    Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A

    2002-11-01

    Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.

  4. Human error analysis of commercial aviation accidents: application of the Human Factors Analysis and Classification system (HFACS).

    PubMed

    Wiegmann, D A; Shappell, S A

    2001-11-01

    The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based on Reason's (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. The HFACS framework was used to analyze human error data associated with aircrew-related commercial aviation accidents that occurred between January 1990 and December 1996 using database records maintained by the NTSB and the FAA. Investigators were able to reliably accommodate all the human causal factors associated with the commercial aviation accidents examined in this study using the HFACS system. In addition, the classification of data using HFACS highlighted several critical safety issues in need of intervention research. These results demonstrate that the HFACS framework can be a viable tool for use within the civil aviation arena. However, additional research is needed to examine its applicability to areas outside the flight deck, such as aircraft maintenance and air traffic control domains.

  5. Gyroplane accidents 1985-2005: epidemiological analysis and pilot factors in 223 events.

    PubMed

    Pagán, Brian J; de Voogt, Alex

    2008-10-01

    Gyroplanes (autogyros) are regarded as a relatively safe and stable type of general-aviation aircraft. The U.S. Federal Aviation Administration categorizes them as sport pilot/light sport aircraft, and reports of gyroplane accidents are included in a publicly available database. We hypothesized that issues related to pilot experience and aircraft maintenance would affect the severity of accidents as indicated by aircraft damage and fatalities. A search of the National Transportation Safety Board database for the period 1985-2005 yielded 223 reports of gyroplane accidents. Information from those reports was compiled and cross-referenced with pilot performance breakdowns and contextual information. The data was then analyzed using the Human Factors Analysis and Classification System. There was a strong effect of pilot experience on crash outcomes; compared to more experienced pilots, crashes involving pilots with less than 40 flight hours in the same make/model gyroplane were five times more likely to involve loss of control, twice as likely to destroy the aircraft, and four times more likely to involve fatalities. On the other hand, crashes involving pilots with more than 40 make/model hours were more likely to be related to perception-based performance breakdown. Maintenance issues were not found to play a significant role in this sample of crashes. The results support the hypothesis that pilot experience is a significant predictor of accident fatality in gyroplanes. Training that is adapted to the experience level of pilots as implemented in new FAA regulations for sport pilot and light sport aircraft (2004) may help to reduce the frequency and seriousness of gyroplane accidents.
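
    The "five times more likely" style comparisons above boil down to ratios from a 2x2 table. The counts below are invented, not the NTSB sample, and serve only to show the computation.

```python
# Relative risk and odds ratio of a crash outcome by pilot experience in
# make/model. All counts are invented for illustration.

low_exp = {"loss_of_control": 40, "other": 20}     # < 40 h in make/model
high_exp = {"loss_of_control": 25, "other": 100}   # >= 40 h in make/model

def risk(group):
    return group["loss_of_control"] / (group["loss_of_control"] + group["other"])

relative_risk = risk(low_exp) / risk(high_exp)
odds_ratio = ((low_exp["loss_of_control"] / low_exp["other"])
              / (high_exp["loss_of_control"] / high_exp["other"]))

print(f"Relative risk (low vs. high experience): {relative_risk:.1f}")
print(f"Odds ratio: {odds_ratio:.1f}")
```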

  6. CADDIS Volume 4. Data Analysis: Download Software

    EPA Pesticide Factsheets

    Overview of the data analysis tools available for download on CADDIS. Provides instructions for downloading and installing CADStat, access to a Microsoft Excel macro for computing SSDs, and a brief overview of command-line use of R, a statistical software package.

  7. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  8. Profile of an Accident Flying Squad

    PubMed Central

    Little, Keith

    1972-01-01

    An analysis of 184 accident flying squad calls and of 280 patients injured in road accidents and treated by a flying squad based at an accident department, covering 1967 to 1971 inclusive, has shown that such a service can provide an efficient system without disrupting the routine work of the hospital. PMID:5076258

  9. Aircraft Accident Prevention: Loss-of-Control Analysis

    NASA Technical Reports Server (NTRS)

    Kwatny, Harry G.; Dongmo, Jean-Etienne T.; Chang, Bor-Chin; Bajpai, Guarav; Yasar, Murat; Belcastro, Christine M.

    2009-01-01

    The majority of fatal aircraft accidents are associated with loss-of-control. Yet the notion of loss-of-control is not well-defined in terms suitable for rigorous control systems analysis. Loss-of-control is generally associated with flight outside of the normal flight envelope, with nonlinear influences, and with an inability of the pilot to control the aircraft. The two primary sources of nonlinearity are the intrinsic nonlinear dynamics of the aircraft and the state and control constraints within which the aircraft must operate. In this paper we examine how these nonlinearities affect the ability to control the aircraft and how they may contribute to loss-of-control. Examples are provided using NASA's Generic Transport Model.

  10. Orbiter subsystem hardware/software interaction analysis. Volume 8: Forward reaction control system

    NASA Technical Reports Server (NTRS)

    Becker, D. D.

    1980-01-01

    The results of the orbiter hardware/software interaction analysis for the AFT reaction control system are presented. The interaction between hardware failure modes and software are examined in order to identify associated issues and risks. All orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are discussed.

  11. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    PubMed

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.
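
    The general idea of folding expert judgment into an informative prior can be shown with something far simpler than a Bayesian CFA. The Beta-Binomial sketch below (not the CBID procedure) contrasts a flat prior with an expert-based prior on a proportion-type estimate from a small sample; all numbers are invented.

```python
# Beta-Binomial illustration of an informative (expert-based) prior versus a
# flat prior. A simplification, not the CBID Bayesian CFA procedure.

def posterior_mean(successes, trials, prior_a, prior_b):
    """Posterior mean of a proportion under a Beta(prior_a, prior_b) prior."""
    return (prior_a + successes) / (prior_a + prior_b + trials)

successes, trials = 18, 25                 # small participant sample (invented)

flat = posterior_mean(successes, trials, 1.0, 1.0)      # uninformative prior
expert = posterior_mean(successes, trials, 16.0, 4.0)   # experts expect about 0.8

print(f"Flat-prior estimate:   {flat:.2f}")
print(f"Expert-prior estimate: {expert:.2f}")
```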

  12. Software for Real-Time Analysis of Subsonic Test Shot Accuracy

    DTIC Science & Technology

    2014-03-01

    used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and Microsoft Windows® Application Programming... video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to... DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains

  13. Analysis of accidents with organic material in health workers.

    PubMed

    Vieira, Mariana; Padilha, Maria Itayra; Pinheiro, Regina Dal Castel

    2011-01-01

    This retrospective and descriptive study with a quantitative design aimed to evaluate occupational accidents involving exposure to biological material, as well as the profile of the affected workers, based on reporting forms sent to the Regional Reference Center of Occupational Health in Florianópolis/SC. Data collection was carried out through a survey of 118 reporting forms in 2007. Data were analyzed electronically. The occurrence of accidents was predominant among nursing technicians and women, and the mean age was 34.5 years. 73% of accidents involved percutaneous exposure, 78% involved blood or fluids containing blood, and 44.91% resulted from invasive procedures. It was concluded that strategies to prevent the occurrence of accidents with biological material should include joint activities between workers and service management and should be directed at improving work conditions and organization.

  14. The Analysis of the Contribution of Human Factors to the In-Flight Loss of Control Accidents

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2012-01-01

    In-flight loss of control (LOC) is currently the leading cause of fatal accidents based on various commercial aircraft accident statistics. As the Next Generation Air Transportation System (NextGen) emerges, new contributing factors leading to LOC are anticipated. The NASA Aviation Safety Program (AvSP), along with other aviation agencies and communities are actively developing safety products to mitigate the LOC risk. This paper discusses the approach used to construct a generic integrated LOC accident framework (LOCAF) model based on a detailed review of LOC accidents over the past two decades. The LOCAF model is comprised of causal factors from the domain of human factors, aircraft system component failures, and atmospheric environment. The multiple interdependent causal factors are expressed in an Object-Oriented Bayesian belief network. In addition to predicting the likelihood of LOC accident occurrence, the system-level integrated LOCAF model is able to evaluate the impact of new safety technology products developed in AvSP. This provides valuable information to decision makers in strategizing NASA's aviation safety technology portfolio. The focus of this paper is on the analysis of human causal factors in the model, including the contributions from flight crew and maintenance workers. The Human Factors Analysis and Classification System (HFACS) taxonomy was used to develop human related causal factors. The preliminary results from the baseline LOCAF model are also presented.
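
    A drastically reduced stand-in for the Bayesian belief network described above is shown below. Two binary causal factors feed a binary loss-of-control node through a conditional probability table, and the marginal probability is obtained by enumeration; the structure and every probability are invented.

```python
# Tiny two-parent Bayesian network sketch: P(LOC | crew_error, system_failure).
# Structure and probabilities are invented for illustration only.

p_crew_error = 0.05
p_system_failure = 0.02

# Conditional probability table for LOC given (crew_error, system_failure).
p_loc_given = {
    (True, True): 0.60,
    (True, False): 0.15,
    (False, True): 0.10,
    (False, False): 0.001,
}

def prob(event_occurs, p_true):
    return p_true if event_occurs else 1.0 - p_true

# Marginalize over both parent variables by full enumeration.
p_loc = sum(
    prob(ce, p_crew_error) * prob(sf, p_system_failure) * p_loc_given[(ce, sf)]
    for ce in (True, False)
    for sf in (True, False)
)
print(f"Marginal probability of loss of control: {p_loc:.4f}")
```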

  15. VLBI Analysis with the Multi-Technique Software GEOSAT

    NASA Technical Reports Server (NTRS)

    Kierulf, Halfdan Pascal; Andersen, Per-Helge; Boeckmann, Sarah; Kristiansen, Oddgeir

    2010-01-01

    GEOSAT is a multi-technique geodetic analysis software developed at Forsvarets Forsknings Institutt (Norwegian defense research establishment). The Norwegian Mapping Authority has now installed the software and has, together with Forsvarets Forsknings Institutt, adapted the software to deliver datum-free normal equation systems in SINEX format. The goal is to be accepted as an IVS Associate Analysis Center and to provide contributions to the IVS EOP combination on a routine basis. GEOSAT is based on an upper diagonal factorized Kalman filter which allows estimation of time variable parameters like the troposphere and clocks as stochastic parameters. The tropospheric delays in various directions are mapped to tropospheric zenith delay using ray-tracing. Meteorological data from ECMWF with a resolution of six hours is used to perform the ray-tracing which depends both on elevation and azimuth. Other models are following the IERS and IVS conventions. The Norwegian Mapping Authority has submitted test SINEX files produced with GEOSAT to IVS. The results have been compared with the existing IVS combined products. In this paper the outcome of these comparisons is presented.
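
    For readers unfamiliar with treating clocks or the troposphere as "stochastic parameters", the scalar random-walk Kalman filter below is a minimal illustration; it is unrelated to GEOSAT's factorized implementation, and the observations and noise levels are invented.

```python
# Scalar random-walk Kalman filter for a slowly varying parameter such as a
# zenith delay. Measurements (meters) and noise variances are invented.

measurements = [2.10, 2.14, 2.09, 2.20, 2.18, 2.25]
process_var = 1e-4      # random-walk variance added per epoch
meas_var = 4e-4         # measurement noise variance

x, p = measurements[0], 1.0          # initial state estimate and its variance
for z in measurements[1:]:
    p += process_var                 # predict: random walk inflates variance
    k = p / (p + meas_var)           # Kalman gain
    x += k * (z - x)                 # update with the new observation
    p *= (1.0 - k)
    print(f"estimate = {x:.3f} m, variance = {p:.2e}")
```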

  16. Type airman certification as related to accidents.

    DOT National Transportation Integrated Search

    1967-10-01

    An analysis of 1964 aircraft accidents, using type of airman certificate as a measure of pilot proficiency, is presented. Data show that student pilots generally have a better accident record than any other of the certification groups. Analysis confi...

  17. Propensity Score Analysis in R: A Software Review

    ERIC Educational Resources Information Center

    Keller, Bryan; Tipton, Elizabeth

    2016-01-01

    In this article, we review four software packages for implementing propensity score analysis in R: "Matching, MatchIt, PSAgraphics," and "twang." After briefly discussing essential elements for propensity score analysis, we apply each package to a data set from the Early Childhood Longitudinal Study in order to estimate the…

  18. NASA Structural Analysis Report on the American Airlines Flight 587 Accident - Local Analysis of the Right Rear Lug

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S; Glaessgen, Edward H.; Mason, Brian H; Krishnamurthy, Thiagarajan; Davila, Carlos G

    2005-01-01

    A detailed finite element analysis of the right rear lug of the American Airlines Flight 587 - Airbus A300-600R was performed as part of the National Transportation Safety Board's failure investigation of the accident that occurred on November 12, 2001. The loads experienced by the right rear lug are evaluated using global models of the vertical tail, local models near the right rear lug, and a global-local analysis procedure. The right rear lug was analyzed using two modeling approaches. In the first approach, solid-shell type modeling is used, and in the second approach, layered-shell type modeling is used. The solid-shell and the layered-shell modeling approaches were used in progressive failure analyses (PFA) to determine the load, mode, and location of failure in the right rear lug under loading representative of an Airbus certification test conducted in 1985 (the 1985-certification test). Both analyses were in excellent agreement with each other on the predicted failure loads, failure mode, and location of failure. The solid-shell type modeling was then used to analyze both a subcomponent test conducted by Airbus in 2003 (the 2003-subcomponent test) and the accident condition. Excellent agreement was observed between the analyses and the observed failures in both cases. From the analyses conducted and presented in this paper, the following conclusions were drawn. The moment, Mx (moment about the fuselage longitudinal axis), has a significant effect on the failure load of the lugs. Higher absolute values of Mx give lower failure loads. The predicted load, mode, and location of the failure of the 1985-certification test, 2003-subcomponent test, and the accident condition are in very good agreement. This agreement suggests that the 1985-certification and 2003-subcomponent tests represent the accident condition accurately. The failure mode of the right rear lug for the 1985-certification test, 2003-subcomponent test, and the accident load case is identified as a

  19. [Accidents in travellers - the hidden epidemic].

    PubMed

    Walz, Alexander; Hatz, Christoph

    2013-06-01

    The risk of malaria and other communicable diseases is well addressed in pre-travel advice. Accidents are usually less discussed. We therefore aimed to assess accident figures for the Swiss population, based on the 2004 to 2008 register data of the largest Swiss accident insurance organization (SUVA). More than 139'000 accidents over 5 years showed that 65 % of the accidents overseas are injuries, and 24 % are caused by poisoning or harm by cold, heat or air pressure. Most accidents happened during leisure activities or sports. More than one third of the non-lethal and more than 50 % of the fatal accidents happened in Asia. More than three-quarters of non-lethal accidents involved people between 25 and 54 years of age. One out of 74 insured persons has an accident abroad per year. Despite many analytical shortcomings of the data set with regard to overseas travel, the figures document the underestimated burden of disease caused by accidents abroad and should inform pre-travel health advice.

  20. Occupational accidents aboard merchant ships

    PubMed Central

    Hansen, H; Nielsen, D; Frydenberg, M

    2002-01-01

    Objectives: To investigate the frequency, circumstances, and causes of occupational accidents aboard merchant ships in international trade, and to identify risk factors for the occurrence of occupational accidents as well as dangerous working situations where possible preventive measures may be initiated. Methods: The study is a historical follow up on occupational accidents among crew aboard Danish merchant ships in the period 1993–7. Data were extracted from the Danish Maritime Authority and insurance data. Exact data on time at risk were available. Results: A total of 1993 accidents were identified during a total of 31 140 years at sea. Among these, 209 accidents resulted in permanent disability of 5% or more, and 27 were fatal. The mean risk of having an occupational accident was 6.4/100 years at sea and the risk of an accident causing a permanent disability of 5% or more was 0.67/100 years aboard. Relative risks for notified accidents and accidents causing permanent disability of 5% or more were calculated in a multivariate analysis including ship type, occupation, age, time on board, change of ship since last employment period, and nationality. Foreigners had a considerably lower recorded rate of accidents than Danish citizens. Age was a major risk factor for accidents causing permanent disability. Change of ship and the first period aboard a particular ship were identified as risk factors. Walking from one place to another aboard the ship caused serious accidents. The most serious accidents happened on deck. Conclusions: It was possible to clearly identify work situations and specific risk factors for accidents aboard merchant ships. Most accidents happened while performing daily routine duties. Preventive measures should focus on workplace instructions for all important functions aboard and also on the prevention of accidents caused by walking around aboard the ship. PMID:11850550
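
    A short worked example of the rate arithmetic reported above, using the figures from the abstract plus one hypothetical group comparison (the counts in the rate-ratio call are made up for illustration):

      accidents, severe, years_at_sea = 1993, 209, 31140

      rate_all = accidents / years_at_sea * 100        # per 100 years at sea
      rate_severe = severe / years_at_sea * 100
      print(f"all notified accidents: {rate_all:.1f} per 100 years at sea")    # ~6.4
      print(f"permanent disability >=5%: {rate_severe:.2f} per 100 years")     # ~0.67

      def rate_ratio(cases_a, py_a, cases_b, py_b):
          # Ratio of two incidence rates, each expressed as cases per person-year.
          return (cases_a / py_a) / (cases_b / py_b)

      # Hypothetical comparison of two crew groups (assumed counts):
      print("example rate ratio:", round(rate_ratio(120, 1500, 60, 1500), 2))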

  1. Software System Safety and the NASA Aeronautics Blueprint

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael; Hayhurst, Kelly J.

    2002-01-01

    NASA's Aeronautics Blueprint lays out a research agenda for the Agency's aeronautics program. The word software appears only four times in this Blueprint, but the critical importance of safe and correct software to the fulfillment of the proposed research is evident on almost every page. Most of the technology solutions proposed to address challenges in aviation are software-dependent technologies. Of the fifty-two specific technology solutions described in the Blueprint, forty-one depend, at least in part, on software for success. For thirty-five of these forty-one, software is critical not only to success but also to human safety. That is, implementing the technology solutions will require using software in such a way that it may, if not specified, designed, and implemented properly, lead to fatal accidents. These results have at least two implications for the research based on the Blueprint: (1) knowledge about the current state-of-the-art and state-of-the-practice in software engineering and software system safety is essential, and (2) research into current unsolved problems in these software disciplines is also essential.

  2. Analysis of a hardware and software fault tolerant processor for critical applications

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne B.

    1993-01-01

    Computer systems for critical applications must be designed to tolerate software faults as well as hardware faults. A unified approach to tolerating hardware and software faults is characterized by classifying faults in terms of duration (transient or permanent) rather than source (hardware or software). Errors arising from transient faults can be handled through masking or voting, but errors arising from permanent faults require system reconfiguration to bypass the failed component. Most errors which are caused by software faults can be considered transient, in that they are input-dependent. Software faults are triggered by a particular set of inputs. Quantitative dependability analysis of systems which exhibit a unified approach to fault tolerance can be performed by a hierarchical combination of fault tree and Markov models. A methodology for analyzing hardware and software fault tolerant systems is applied to the analysis of a hypothetical system, loosely based on the Fault Tolerant Parallel Processor. The models consider both transient and permanent faults, hardware and software faults, independent and related software faults, automatic recovery, and reconfiguration.
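
    As a minimal sketch of the kind of Markov reliability model mentioned above, the fragment below evolves a three-state chain (OK, degraded by a permanent fault, failed) over time; the per-step transition probabilities are illustrative assumptions, not parameters of the Fault Tolerant Parallel Processor.

      import numpy as np

      P = np.array([
          [0.990, 0.008, 0.002],   # OK -> OK / degraded / failed
          [0.600, 0.350, 0.050],   # degraded -> reconfigured OK / still degraded / failed
          [0.000, 0.000, 1.000],   # failed is absorbing
      ])
      state = np.array([1.0, 0.0, 0.0])   # start in OK
      for step in range(1000):
          state = state @ P               # propagate the state distribution one step
      print("P(system failed) after 1000 steps:", round(state[2], 4))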

  3. Fission product transport analysis in a loss of decay heat removal accident at Browns Ferry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wichner, R.P.; Weber, C.F.; Hodge, S.A.

    1984-01-01

    This paper summarizes an analysis of the movement of noble gases, iodine, and cesium fission products within the Mark-I containment BWR reactor system represented by Browns Ferry Unit 1 during a postulated accident sequence initiated by a loss of decay heat removal (DHR) capability following a scram. The event analysis showed that this accident could be brought under control by various means, but the sequence with no operator action ultimately leads to containment (drywell) failure followed by loss of water from the reactor vessel, core degradation due to overheating, and reactor vessel failure with attendant movement of core debris onto the drywell floor.

  4. 'It was a freak accident': an analysis of the labelling of injury events in the US press.

    PubMed

    Smith, Katherine C; Girasek, Deborah C; Baker, Susan P; Manganello, Jennifer A; Bowman, Stephen M; Samuels, Alicia; Gielen, Andrea C

    2012-02-01

    Given that the news media shape our understanding of health issues, a study was undertaken to examine the use by the US media of the expression 'freak accident' in relation to injury events. This analysis is intended to contribute to the ongoing consideration of lay conceptualisation of injuries as 'accidents'. LexisNexis Academic was used to search three purposively selected US news sources (Associated Press, New York Times and Philadelphia Inquirer) for the expression 'freak accident' over 5 years (2005-9). Textual analysis included both structured and open coding. Coding included measures for who used the expression within the story, the nature of the injury event and the injured person(s) being reported upon, incorporation of prevention information within the story and finally a phenomenological consideration of the uses and meanings of the expression within the story context. Results: The search yielded a dataset of 250 human injury stories incorporating the term 'freak accident'. Injuries sustained by professional athletes dominated coverage (61%). Fewer than 10% of stories provided a clear and explicit injury prevention message. Stories in which journalists employed the expression 'freak accident' were less likely to include prevention information than stories in which the expression was used by people quoted in the story. Journalists who frame injury events as freak accidents may be an appropriate focus for advocacy efforts. Effective prevention messages should be developed and disseminated to accompany injury reporting in order to educate and protect the public.

  5. Lifestyle and accidents among young drivers.

    PubMed

    Gregersen, N P; Berg, H Y

    1994-06-01

    This study covers the lifestyle component of the problems related to young drivers' accident risk. The purpose of the study is to measure the relationship between lifestyle and accident risk, and to identify specific high-risk and low-risk groups. Lifestyle is measured through a questionnaire, where 20-year-olds describe themselves and how often they deal with a large number of different activities, like sports, music, movies, reading, cars and driving, political engagement, etc. They also report their involvement in traffic accidents. With a principal component analysis followed by a cluster analysis, lifestyle profiles are defined. These profiles are finally correlated to accidents, which makes it possible to define high-risk and low-risk groups. The cluster analysis defined 15 clusters including four high-risk groups with an average overrisk of 150% and two low-risk groups with an average underrisk of 75%. The results are discussed from two perspectives. The first is the importance of theoretical understanding of the contribution of lifestyle factors to young drivers' high accident risk. The second is how the findings could be used in practical road safety measures, like education, campaigns, etc.
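
    For illustration, a small Python sketch of the analysis pipeline described above (principal component analysis of questionnaire items followed by clustering, then comparing accident counts across clusters); the data are synthetic and all parameter choices are assumptions.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(2)
      answers = rng.integers(1, 6, size=(300, 20)).astype(float)  # 300 respondents, 20 items

      scores = PCA(n_components=5).fit_transform(answers)         # lifestyle components
      labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

      accidents = rng.poisson(0.2, size=300)                      # self-reported accidents (fake)
      for k in range(4):
          print(f"cluster {k}: mean accidents {accidents[labels == k].mean():.2f}")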

  6. Application of systems and control theory-based hazard analysis to radiation oncology.

    PubMed

    Pawlicki, Todd; Samost, Aubrey; Brown, Derek W; Manger, Ryan P; Kim, Gwe-Ya; Leveson, Nancy G

    2016-03-01

    Both humans and software are notoriously challenging to account for in traditional hazard analysis models. The purpose of this work is to investigate and demonstrate the application of a new, extended accident causality model, called systems theoretic accident model and processes (STAMP), to radiation oncology. Specifically, a hazard analysis technique based on STAMP, system-theoretic process analysis (STPA), is used to perform a hazard analysis. The STPA procedure starts with the definition of high-level accidents for radiation oncology at the medical center and the hazards leading to those accidents. From there, the hierarchical safety control structure of the radiation oncology clinic is modeled, i.e., the controls that are used to prevent accidents and provide effective treatment. Using STPA, unsafe control actions (behaviors) are identified that can lead to the hazards, as well as causal scenarios that can lead to the identified unsafe control. This information can be used to eliminate or mitigate potential hazards. The STPA procedure is demonstrated on a new online adaptive cranial radiosurgery procedure that omits the CT simulation step and uses CBCT for localization, planning, and a surface imaging system during treatment. The STPA procedure generated a comprehensive set of causal scenarios that are traced back to system hazards and accidents. Ten control loops were created for the new SRS procedure, which covered the areas of hospital and department management, treatment design and delivery, and vendor service. Eighty-three unsafe control actions were identified, as well as 472 causal scenarios that could lead to those unsafe control actions. STPA provides a method for understanding the role of management decisions and hospital operations on system safety and generating process design requirements to prevent hazards and accidents. The interaction of people, hardware, and software is highlighted. The method of STPA produces results that can be used to improve
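
    A hypothetical data-structure sketch (not from the paper) of how STPA outputs such as control loops, unsafe control actions, and causal scenarios might be recorded in code; all field names and example strings are assumptions.

      from dataclasses import dataclass, field

      @dataclass
      class UnsafeControlAction:
          control_action: str
          kind: str                 # e.g. "not provided", "provided too late"
          linked_hazards: list
          causal_scenarios: list = field(default_factory=list)

      @dataclass
      class ControlLoop:
          controller: str
          controlled_process: str
          ucas: list = field(default_factory=list)

      loop = ControlLoop("treatment planner", "adaptive SRS delivery")
      loop.ucas.append(UnsafeControlAction(
          control_action="approve plan",
          kind="provided before CBCT localization is verified",
          linked_hazards=["wrong-site irradiation"],
      ))
      print(len(loop.ucas), "unsafe control action(s) recorded")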

  7. Specdata: Automated Analysis Software for Broadband Spectra

    NASA Astrophysics Data System (ADS)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, with the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool which is designed to simplify and greatly accelerate the spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving him/her considerable flexibility to assign, manipulate, interpret and export their analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features, and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, list of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, the control is then handed over to the user who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed are among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
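
    As a rough sketch of the automated assignment step described above, the following Python fragment matches detected line frequencies against a small catalogue within a tolerance; all frequencies and the tolerance are made up for illustration.

      import numpy as np

      catalog = np.array([8872.11, 9123.45, 10250.02, 11876.30])   # MHz, known species
      peaks = np.array([9123.47, 10103.90, 11876.28, 12001.10])    # detected features
      tol = 0.05                                                    # MHz, assumed tolerance

      idx = np.searchsorted(catalog, peaks)
      for p, i in zip(peaks, idx):
          candidates = catalog[max(i - 1, 0): i + 1]               # nearest catalogue lines
          hits = candidates[np.abs(candidates - p) <= tol]
          label = f"assigned to {hits[0]:.2f}" if hits.size else "unassigned"
          print(f"{p:10.2f} MHz -> {label}")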

  8. 10 CFR 76.85 - Assessment of accidents.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    Section 76.85, Title 10 - Energy, NUCLEAR REGULATORY COMMISSION (CONTINUED), CERTIFICATION OF GASEOUS DIFFUSION PLANTS, Safety - Assessment of accidents. The Corporation shall perform an analysis of potential accidents and consequences to...

  9. 10 CFR 76.85 - Assessment of accidents.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    Section 76.85, Title 10 - Energy, NUCLEAR REGULATORY COMMISSION (CONTINUED), CERTIFICATION OF GASEOUS DIFFUSION PLANTS, Safety - Assessment of accidents. The Corporation shall perform an analysis of potential accidents and consequences to...

  10. 10 CFR 76.85 - Assessment of accidents.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    Section 76.85, Title 10 - Energy, NUCLEAR REGULATORY COMMISSION (CONTINUED), CERTIFICATION OF GASEOUS DIFFUSION PLANTS, Safety - Assessment of accidents. The Corporation shall perform an analysis of potential accidents and consequences to...

  11. 10 CFR 76.85 - Assessment of accidents.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Section 76.85, Title 10 - Energy, NUCLEAR REGULATORY COMMISSION (CONTINUED), CERTIFICATION OF GASEOUS DIFFUSION PLANTS, Safety - Assessment of accidents. The Corporation shall perform an analysis of potential accidents and consequences to...

  12. 10 CFR 76.85 - Assessment of accidents.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Section 76.85, Title 10 - Energy, NUCLEAR REGULATORY COMMISSION (CONTINUED), CERTIFICATION OF GASEOUS DIFFUSION PLANTS, Safety - Assessment of accidents. The Corporation shall perform an analysis of potential accidents and consequences to...

  13. GIS-based accident location and analysis system (GIS-ALAS) : project report : phase I

    DOT National Transportation Integrated Search

    1998-04-06

    This report summarizes progress made in Phase I of the geographic information system (GIS) based Accident Location and Analysis System (GIS-ALAS). The GIS-ALAS project builds on PC-ALAS, a locationally-referenced highway crash database query system d...

  14. A systemic analysis of South Korea Sewol ferry accident - Striking a balance between learning and accountability.

    PubMed

    Kee, Dohyung; Jun, Gyuchan Thomas; Waterson, Patrick; Haslam, Roger

    2017-03-01

    The South Korea Sewol ferry accident in April 2014 claimed the lives of over 300 passengers and led to criminal charges against 399 people involved, including the imprisonment of 154 of them as of October 2014. A blame and punishment culture can be prevalent in a more hierarchical society like South Korea, as shown in the aftermath of this disaster. This study aims to analyse the South Korea ferry accident using Rasmussen's risk management framework and the associated AcciMap technique, and to propose recommendations drawn from an AcciMap-based focus group with systems safety experts. The data for the accident analysis were collected mainly from an interim investigation report by the Board of Audit and Inspection of Korea and from major South Korean and foreign newspapers. The analysis showed that the accident was attributable to many contributing factors arising from front-line operators, management, regulators and government. It also showed how the multiple factors, including economic, social and political pressures and individual workload, contributed to the accident and how they affected each other. This AcciMap was presented to 27 safety researchers and experts at 'the legacy of Jens Rasmussen' symposium adjunct to ODAM2014. Their recommendations were captured through a focus group. The four main recommendations are to forgive (no blame and punishment of individuals), analyse (socio-technical system-based), learn (from why things do not go wrong) and change (bottom-up safety culture and safety system management). The findings offer important insights into how this type of accident should be understood and analysed, and into the appropriate subsequent response. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need to culture them. These technologies usually return short (100-300 base-pair) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software tools (AmphoraNet--a webserver implementation of AMPHORA2--, MG-RAST, and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software tool to this "genome length bias." Therefore, we have made a simple benchmark for evaluating the "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads of average length 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software tool fails on this simple task, it will surely fail on most real metagenomes. We applied the three software tools to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
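
    A small Python sketch of the benchmark construction described above (equal copy numbers of genomes of different lengths, sheared into 150 bp reads and mixed), using random placeholder sequences scaled down for speed; the genome names and sizes are assumptions.

      import random

      random.seed(0)
      sizes = {"short": 50_000, "medium": 200_000, "long": 600_000}   # bp, scaled down
      genomes = {name: "".join(random.choice("ACGT") for _ in range(n))
                 for name, n in sizes.items()}

      def shear(seq, read_len=150):
          # Non-overlapping 150 bp fragments; a real benchmark would shear at random positions.
          return [seq[i:i + read_len] for i in range(0, len(seq) - read_len + 1, read_len)]

      copies = 3                                     # same copy number for every genome
      mixture = []
      for name, seq in genomes.items():
          for _ in range(copies):
              mixture.extend((name, read) for read in shear(seq))
      random.shuffle(mixture)

      # Equal copy numbers still give more reads to longer genomes ("genome length bias").
      for name in sizes:
          count = sum(1 for source, _ in mixture if source == name)
          print(f"{name:6s}: {count:6d} reads")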

  16. PyPWA: A partial-wave/amplitude analysis software framework

    NASA Astrophysics Data System (ADS)

    Salgado, Carlos

    2016-05-01

    The PyPWA project aims to develop a software framework for partial-wave and amplitude analysis of data, providing the user with software tools to identify resonances from multi-particle final states in photoproduction. Most of the code is written in Python. The software is divided into two main branches: one general shell where amplitude parameters (or any parametric model) are to be estimated from the data. This branch also includes software to produce simulated data sets using the fitted amplitudes. A second branch contains a specific realization of the isobar model (with room to include Deck-type and other isobar model extensions) to perform PWA with an interface into the computer resources at Jefferson Lab. We are currently implementing parallelism and vectorization using Intel's Xeon Phi family of coprocessors.

  17. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: (1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification, and (2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  18. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    NASA Technical Reports Server (NTRS)

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic simulation (Pex, SAGE, Vigilante), program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of these applications. There are several new promising avenues, and the talk will touch on some of these and on the challenges related to SMT solvers.
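
    A small, self-contained example of the Z3 Python API (the z3-solver package) checking satisfiability of a toy arithmetic constraint; the constraint itself is made up and only illustrates the kind of query an analysis engine might issue.

      from z3 import Int, Solver, sat

      x, y = Int("x"), Int("y")
      s = Solver()
      s.add(x > 0, y > 0, x + y == 10, x * y > 24)       # constraints over the integers
      if s.check() == sat:
          print("satisfiable, for example:", s.model())   # one concrete witness
      else:
          print("no integer solution exists")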

  19. Effectiveness of an automatic tracking software in underwater motion analysis.

    PubMed

    Magalhaes, Fabrício A; Sawacha, Zimi; Di Michele, Rocco; Cortesi, Matteo; Gatta, Giorgio; Fantozzi, Silvia

    2013-01-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers' positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and correct the position of the cursor when the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the developed automatic tracking software can be used as a valid and useful tool for underwater motion analysis. Key points: The availability of effective software for automatic tracking would represent a significant advance for the practical use of kinematic analysis in swimming and other aquatic sports. An important feature of automatic tracking software is to require limited human interventions and
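
    The DVP tool itself is not available in code here; purely to illustrate the underlying Kanade-Lucas-Tomasi technique, the sketch below tracks one synthetic "marker" between two frames with OpenCV's pyramidal Lucas-Kanade routine. The frame contents and window parameters are assumptions.

      import numpy as np
      import cv2

      frame0 = np.zeros((240, 320), dtype=np.uint8)
      cv2.circle(frame0, (100, 120), 6, 255, -1)            # a bright "marker"
      frame1 = np.zeros_like(frame0)
      cv2.circle(frame1, (104, 118), 6, 255, -1)            # marker moved by (+4, -2)

      p0 = np.array([[[100.0, 120.0]]], dtype=np.float32)   # initial marker position
      p1, status, err = cv2.calcOpticalFlowPyrLK(frame0, frame1, p0, None,
                                                  winSize=(21, 21), maxLevel=2)
      if status[0][0] == 1:
          print("tracked position:", p1[0][0])              # close to (104, 118)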

  20. A new approach to modeling aviation accidents

    NASA Astrophysics Data System (ADS)

    Rao, Arjun Harsha

    This research views aviation accidents as a set of hazardous states of a system (pilot and aircraft), and triggers that cause the system to move between hazardous states. I used the NTSB's accident coding manual (which contains nearly 4000 different codes) to develop a "dictionary" of hazardous states, triggers, and information codes. Then, I created the "grammar", or set of rules, that: (1) orders the hazardous states in each accident; and (2) links the hazardous states using the appropriate triggers. This approach: (1) provides a more correct count of the causes for accidents in the NTSB database; and (2) checks for gaps or omissions in NTSB accident data, and fills in some of these gaps using logic-based rules. These rules also help identify and count causes for accidents that were not discernible from previous analyses of historical accident data. I apply the model to 6200 helicopter accidents that occurred in the US between 1982 and 2015. First, I identify the states and triggers that are most likely to be associated with fatal and non-fatal accidents. The results suggest that non-fatal accidents, which account for approximately 84% of the accidents, provide valuable opportunities to learn about the causes of accidents. Next, I investigate the causes of in-flight loss of control using both a conventional approach and the state-based approach. The conventional analysis provides little insight into the causal mechanism for LOC. For instance, the top cause of LOC is "aircraft control/directional control not maintained", which does not provide any insight. In contrast, the state-based analysis showed that pilots' tendency to clip objects frequently triggered LOC (16.7% of LOC accidents)--this finding was not directly discernible from conventional analyses. Finally, I investigate the causes of improper autorotations using both a conventional approach and the state-based approach. The conventional approach uses modifiers (e.g., "improper", "misjudged") associated with "24520

  1. Data analysis software for the autoradiographic enhancement process. Volumes 1, 2, and 3, and appendix

    NASA Technical Reports Server (NTRS)

    Singh, S. P.

    1979-01-01

    The computer software developed to set up a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given, along with the program listings and the user's manual. A software description and program listing modifications of the data analysis software are included.
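
    A rough numpy sketch of a Wiener (noise power) spectrum estimate for a one-dimensional scan of film density fluctuations; the synthetic trace, segment length, and normalisation convention are assumptions and do not reproduce the report's method.

      import numpy as np

      rng = np.random.default_rng(3)
      n, dx = 4096, 0.01                      # samples and sampling interval (mm), assumed
      trace = rng.normal(0.0, 0.02, n)        # density fluctuations about the mean

      segment = 256
      segments = trace[: n - n % segment].reshape(-1, segment)
      window = np.hanning(segment)
      psd = np.zeros(segment // 2 + 1)
      for seg in segments:
          spec = np.fft.rfft((seg - seg.mean()) * window)
          psd += np.abs(spec) ** 2
      psd *= dx / (segments.shape[0] * np.sum(window ** 2))   # average and normalise

      freqs = np.fft.rfftfreq(segment, dx)    # cycles per mm
      print("Wiener spectrum near 10 cycles/mm:", psd[np.argmin(np.abs(freqs - 10))])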

  2. Injury protection and accident causation parameters for vulnerable road users based on German In-Depth Accident Study GIDAS.

    PubMed

    Otte, Dietmar; Jänsch, Michael; Haasper, Carl

    2012-01-01

    Within a study of accident data from GIDAS (German In-Depth Accident Study), vulnerable road users are investigated regarding injury risk in traffic accidents. GIDAS is the largest in-depth accident study in Germany. Due to a well-defined sampling plan, representativeness with respect to the federal statistics is also guaranteed. A hierarchical system, ACASS (Accident Causation Analysis with Seven Steps), was developed in GIDAS, describing the human causation factors in a chronological sequence. The causation factors classified in this way - derived from the systematic analysis of human accident causes ("7 steps") - can be used to describe the influence of accident causes on the injury outcome. The basis of the study is accident documentation over ten years, from 1999 to 2008, covering 8204 vulnerable road users (VRU), from which three groups were selected: pedestrians (n=2041), motorcyclists (n=2199) and bicyclists (n=3964); these were analyzed for collisions with cars and trucks as well as for accidents involving vulnerable road users alone. The paper gives a description of the injury patterns and injury mechanisms of the accidents. The injury frequencies and severities are pointed out for different types of VRU and for protective measures such as helmets and clothing on the human body. The impact points on the car are demonstrated, leading to conclusions about protective measures on the vehicle. Existing standards for protection devices, as well as interdisciplinary research including accident and injury statistics, are described. With this paper, a summary of the existing possibilities for protective measures for pedestrians, bicyclists and motorcyclists is given and discussed by comparing all three groups of vulnerable road users. The relevance of special impact situations and of accident causes mainly responsible for severe injuries is also pointed out, giving a new orientation to research for the avoidance and reduction of accident patterns. 2010 Elsevier Ltd. All rights reserved.

  3. Manned space flight nuclear system safety. Volume 3: Reactor system preliminary nuclear safety analysis. Part 2: Accident Model Document (AMD)

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The Accident Model Document is one of three documents of the Preliminary Safety Analysis Report (PSAR) - Reactor System as applied to a Space Base Program. Potential terrestrial nuclear hazards involving the zirconium hydride reactor-Brayton power module are identified for all phases of the Space Base program. The accidents/events that give rise to the hazards are defined, and abort sequence trees are developed to determine the sequence of events leading to each hazard and the associated probabilities of occurrence. Source terms are calculated to determine the magnitude of the hazards. These data are used in the mission accident analysis to determine the most probable and significant accidents/events in each mission phase. The only significant hazards during the prelaunch and launch ascent phases of the mission are those which arise from criticality accidents. Fission product inventories during this time period were found to be very low due to very limited low-power acceptance testing.

  4. SIMA: Python software for analysis of dynamic fluorescence imaging data.

    PubMed

    Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila

    2014-01-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.

  5. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    NASA Astrophysics Data System (ADS)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 Standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and on how these methodologies could be used together with the ESA PSS-05-0 Standard. Our outcomes may, in general, be used by teams that need to build small satellites; in particular, they will be used when we build the onboard software applications for the SATEX-II.

  6. Geographic analysis of road accident severity index in Nigeria.

    PubMed

    Iyanda, Ayodeji E

    2018-05-28

    Before 2030, deaths from road traffic accidents (RTAs) are projected to surpass those from cerebrovascular disease, tuberculosis, and HIV/AIDS. Yet there is little knowledge about the geographic distribution of RTA severity in Nigeria. The Accident Severity Index is the proportion of deaths that result from a road accident. This study analysed the geographic pattern of RTA severity based on data retrieved from the Federal Road Safety Corps (FRSC). The study predicted two years of data from historic road accident data using an exponential smoothing technique. To determine spatial autocorrelation, global and local indicators of spatial association were implemented in a geographic information system. Results show significant clusters of high RTA severity among states in the northeast and the northwest of Nigeria. The findings are therefore discussed from two perspectives: road traffic law compliance and poor emergency response. In conclusion, the severity of RTAs is high in the northern states of Nigeria; hence, RTAs remain a public health concern.
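
    For illustration, a minimal computation of a global spatial autocorrelation statistic (Moran's I) of the kind used in such analyses, with a made-up severity index and a toy binary contiguity matrix for five hypothetical states.

      import numpy as np

      severity = np.array([0.9, 0.8, 0.7, 0.2, 0.1])     # accident severity index (fake)
      W = np.array([                                      # 1 = states share a border (assumed)
          [0, 1, 1, 0, 0],
          [1, 0, 1, 0, 0],
          [1, 1, 0, 1, 0],
          [0, 0, 1, 0, 1],
          [0, 0, 0, 1, 0],
      ], dtype=float)

      z = severity - severity.mean()
      n, s0 = severity.size, W.sum()
      morans_i = (n / s0) * (z @ W @ z) / (z @ z)
      print("Moran's I:", round(morans_i, 3))   # > 0 suggests clustering of similar values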

  7. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between each of the disciplines become stronger as systems are designed lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  8. Combining Static Analysis and Model Checking for Software Analysis

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2003-01-01

    We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial-order information, which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer, which can then refine its analysis. The result of this refined analysis is then fed back to the model checker, which updates its partial-order reduction. At each step of this iterative process, the static analysis computes optimistic information which results in an unsafe reduction of the state space. However, we show that the process converges to a fixed point, at which time the partial-order information is safe and the whole state space is explored.

  9. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. To compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks e.g. peak picking.

  10. Rail-highway crossing accident prediction analysis

    DOT National Transportation Integrated Search

    1987-04-01

    This report contains technical results that have been produced in a study : to revise and update the DOT rail-highway crossing resource allocation : procedure. This work has resulted in new accident prediction and severity : formulas, a modified and ...

  11. Severe Accident Scoping Simulations of Accident Tolerant Fuel Concepts for BWRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robb, Kevin R.

    2015-08-01

    FeCrAl would tend to generate heat and hydrogen from oxidation at a slower rate compared to the zirconium-based alloys in use today. The previous study, [2], of the FeCrAl ATF concept during station blackout (SBO) severe accident scenarios in BWRs was based on simulating short-term SBO (STSBO), long-term SBO (LTSBO), and modified SBO scenarios occurring in a BWR-4 reactor with MARK-I containment. The analysis indicated that FeCrAl had the potential to delay the onset of fuel failure by a few hours depending on the scenario, and it could delay lower head failure by several hours. The analysis demonstrated reduced in-vessel hydrogen production. However, the work was preliminary and was based on limited knowledge of material properties for FeCrAl. Limitations of the MELCOR code were identified for direct use in modeling ATF concepts. This effort used an older version of MELCOR (1.8.5). Since these analyses, the BWR model has been updated for use in MELCOR 1.8.6 [10], and more representative material properties for FeCrAl have been modeled. Sections 2-4 present updated analyses of the FeCrAl ATF concept response during severe accidents in a BWR. The purpose of the study is to estimate the potential gains afforded by the FeCrAl ATF concept during BWR SBO scenarios.

  12. Capi text V.1--data analysis software for nailfold skin capillaroscopy.

    PubMed

    Dobrev, Hristo P

    2007-01-01

    Nailfold skin capillaroscopy is a simple non-invasive method used to assess conditions of disturbed microcirculation such as Raynaud's phenomenon, acrocyanosis, perniones, connective tissue diseases, psoriasis, diabetes mellitus, neuropathy and vibration disease. The aim was to develop data analysis software to assist the documentation and analysis of a capillaroscopic investigation. SOFTWARE DESCRIPTION: The programme is based on a modular principle. The module "Nomenclatures" includes menus for the patients' data. The module "Examinations" includes menus for all general and specific aspects of the medical examination and capillaroscopic investigations. The modules "Settings" and "Information" include customization menus for the programme. The results of nailfold capillaroscopy can be printed in a short or expanded form. This software allows physicians to perform quick searches using various specified criteria and to prepare analyses and reports. The software will assist any practitioner who performs nailfold skin capillaroscopy.

  13. Gesture Analysis for Astronomy Presentation Software

    NASA Astrophysics Data System (ADS)

    Robinson, Marc A.

    Astronomy presentation software in a planetarium setting provides a visually stimulating way to introduce varied scientific concepts, including computer science concepts, to a wide audience. However, the underlying computational complexity and opportunities for discussion are often overshadowed by the brilliance of the presentation itself. To bring this discussion back out into the open, a method needs to be developed to make the computer science applications more visible. This thesis introduces the GAAPS system, which endeavors to implement free-hand gesture-based control of astronomy presentation software, with the goal of providing that talking point to begin the discussion of computer science concepts in a planetarium setting. The GAAPS system incorporates gesture capture and analysis in a unique environment presenting unique challenges, and introduces a novel algorithm called a Bounding Box Tree to create and select features for this particular gesture data. This thesis also analyzes several different machine learning techniques to determine a well-suited technique for the classification of this particular data set, with an artificial neural network being chosen as the implemented algorithm. The results of this work will allow for the desired introduction of computer science discussion into the specific setting used, as well as provide for future work pertaining to gesture recognition with astronomy presentation software.

  14. An Evaluation of the Hazard Prediction and Assessment Capability (HPAC) Software’s Ability to Model the Chornobyl Accident

    DTIC Science & Technology

    2002-03-01

    source term. Several publications provided a thorough accounting of the accident, including "Chernobyl Record" [Mould] and the NRC technical report "...Report on the Accident at the Chernobyl Nuclear Power Station" [NUREG-1250]. The most comprehensive study of transport models to predict the ... from the Chernobyl Accident: The ATMES Report" [Klug, et al.]. The Atmospheric Transport Model Evaluation Study (ATMES) report used data

  15. Digital PIV (DPIV) Software Analysis System

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitutes a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95 using a 'quickwin' format that allows simple user interface and output capabilities to the windows environment.
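
    A short sketch of the core DPIV correlation step (cross-correlate two interrogation windows and take the correlation peak as the displacement estimate); the synthetic frames and window size are assumptions, not the NASA LaRC implementation.

      import numpy as np
      from scipy.signal import fftconvolve

      rng = np.random.default_rng(4)
      win = 64
      frame_a = rng.random((win, win))
      frame_b = np.roll(frame_a, shift=(3, -2), axis=(0, 1))    # known shift (3, -2)

      a = frame_a - frame_a.mean()
      b = frame_b - frame_b.mean()
      corr = fftconvolve(b, a[::-1, ::-1], mode="same")          # cross-correlation
      peak = np.unravel_index(np.argmax(corr), corr.shape)
      dy, dx = peak[0] - win // 2, peak[1] - win // 2
      print("estimated displacement (dy, dx):", dy, dx)           # close to (3, -2)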

  16. BNL severe-accident sequence experiments and analysis program. [PWR; BWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greene, G.A.; Ginsberg, T.; Tutu, N.K.

    1983-01-01

    In the analysis of degraded core accidents, the two major sources of pressure loading on light water reactor containments are: steam generation from core debris-water thermal interactions; and molten core-concrete interactions. Experiments are in progress at BNL in support of analytical model development related to aspects of the above containment loading mechanisms. The work supports development and evaluation of the CORCON (Muir, 1981) and MARCH (Wooton, 1980) computer codes. Progress in the two programs is described.

  17. Description of the GMAO OSSE for Weather Analysis Software Package: Version 3

    NASA Technical Reports Server (NTRS)

    Koster, Randal D. (Editor); Errico, Ronald M.; Prive, Nikki C.; Carvalho, David; Sienkiewicz, Meta; El Akkraoui, Amal; Guo, Jing; Todling, Ricardo; McCarty, Will; Putman, William M.

    2017-01-01

    The Global Modeling and Assimilation Office (GMAO) at the NASA Goddard Space Flight Center has developed software and products for conducting observing system simulation experiments (OSSEs) for weather analysis applications. Such applications include estimations of potential effects of new observing instruments or data assimilation techniques on improving weather analysis and forecasts. The GMAO software creates simulated observations from nature run (NR) data sets and adds simulated errors to those observations. The algorithms employed are much more sophisticated, adding a much greater degree of realism, compared with OSSE systems currently available elsewhere. The algorithms employed, software designs, and validation procedures are described in this document. Instructions for using the software are also provided.

  18. Occupational accidents among mototaxi drivers.

    PubMed

    Amorim, Camila Rego; de Araújo, Edna Maria; de Araújo, Tânia Maria; de Oliveira, Nelson Fernandes

    2012-03-01

    The use of motorcycles as a means of work has contributed to the increase in traffic accidents, in particular, mototaxi accidents. The aim of this study was to estimate and characterize the incidence of occupational accidents among the mototaxis registered in Feira de Santana, BA. This is a cross-sectional study with descriptive and census data. Of the 300 professionals registered at the Municipal Transportation Service, 267 professionals were interviewed through a structured questionnaire. Then, a descriptive analysis was conducted and the incidence of accidents was estimated based on the variables studied. Relative risks were calculated and statistical significance was determined using the chi-square test and Fisher's exact test, considering p < 0.05. Logistic regression was used in order to perform simultaneous adjustment of variables. Occupational accidents were observed in 10.5% of mototaxis. There were mainly minor injuries (48.7%), 27% of them requiring leaves of absence from work. There was an association between the days of work per week, fatigue in lower limbs and musculoskeletal complaints, and accidents. Knowledge of the working conditions and accidents involved in this activity can be of great importance for the adoption of traffic education policies, and to help prevent accidents by improving the working conditions and lives of these professionals.

  19. STAMPS: development and verification of swallowing kinematic analysis software.

    PubMed

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop a novel swallowing kinematic analysis software, called spatio-temporal analyzer for motion and physiologic study (STAMPS), and verify its validity and reliability. STAMPS was developed in MATLAB, which is one of the most popular platforms for biomedical analysis. This software was constructed to acquire, process, and analyze the data of swallowing motion. The target of swallowing structures includes bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and using the instrumental swallowing model which was designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00 (P < 0.001) for displacement and velocity. The Bland-Altman plots showed good agreement between the measurements and the reference values. STAMPS provides precise and reliable kinematic measurements and multiple practical functionalities for spatiotemporal analysis. The software is expected to be useful for researchers who are interested in the swallowing motion analysis.

  20. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  1. Software ion scan functions in analysis of glycomic and lipidomic MS/MS datasets.

    PubMed

    Haramija, Marko

    2018-03-01

    Hardware ion scan functions unique to tandem mass spectrometry (MS/MS) mode of data acquisition, such as precursor ion scan (PIS) and neutral loss scan (NLS), are important for selective extraction of key structural data from complex MS/MS spectra. However, their software counterparts, software ion scan (SIS) functions, are still not regularly available. Software ion scan functions can be easily coded for additional functionalities, such as software multiple precursor ion scan, software no ion scan, and software variable ion scan functions. These are often necessary, since they allow more efficient analysis of complex MS/MS datasets, often encountered in glycomics and lipidomics. Software ion scan functions can be easily coded by using modern script languages and can be independent of instrument manufacturer. Here we demonstrate the utility of SIS functions on a medium-size glycomic MS/MS dataset. Knowledge of sample properties, as well as of diagnostic and conditional diagnostic ions crucial for data analysis, was needed. Based on the tables constructed with the output data from the SIS functions performed, a detailed analysis of a complex MS/MS glycomic dataset could be carried out in a quick, accurate, and efficient manner. Glycomic research is progressing slowly, and with respect to the MS experiments, one of the key obstacles for moving forward is the lack of appropriate bioinformatic tools necessary for fast analysis of glycomic MS/MS datasets. Adding novel SIS functionalities to the glycomic MS/MS toolbox has a potential to significantly speed up the glycomic data analysis process. Similar tools are useful for analysis of lipidomic MS/MS datasets as well, as will be discussed briefly. Copyright © 2017 John Wiley & Sons, Ltd.
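
    A minimal Python sketch of a "software neutral loss scan" of the kind discussed above: retain only MS/MS spectra containing a fragment at the precursor mass minus a chosen neutral loss. The spectra, the 162.05 Da loss, and the tolerance are illustrative assumptions.

      # Toy spectra: precursor m/z plus fragment m/z lists (values made up).
      spectra = [
          {"precursor": 933.4, "fragments": [771.3, 528.2, 325.1]},
          {"precursor": 812.3, "fragments": [640.2, 407.1]},
          {"precursor": 755.2, "fragments": [593.2, 431.1, 269.0]},
      ]

      def neutral_loss_scan(spectra, loss, tol=0.2):
          # Keep spectra with a fragment within tol of (precursor - loss).
          hits = []
          for s in spectra:
              target = s["precursor"] - loss
              if any(abs(f - target) <= tol for f in s["fragments"]):
                  hits.append(s)
          return hits

      # Example: loss of an anhydro-hexose residue (about 162.05 Da).
      for s in neutral_loss_scan(spectra, loss=162.05):
          print("precursor retained:", s["precursor"])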

  2. A parallel and sensitive software tool for methylation analysis on multicore platforms.

    PubMed

    Tárraga, Joaquín; Pérez, Mariano; Orduña, Juan M; Duato, José; Medina, Ignacio; Dopazo, Joaquín

    2015-10-01

    DNA methylation analysis suffers from very long processing times, as the advent of Next-Generation Sequencers has shifted the bottleneck of genomic studies from the sequencers that obtain the DNA samples to the software that performs the analysis of these samples. The existing software for methylation analysis does not seem to scale efficiently with either the size of the dataset or the length of the reads to be analyzed. As sequencers are expected to provide longer and longer reads in the near future, efficient and scalable methylation software should be developed. We present a new software tool, called HPG-Methyl, which efficiently maps bisulphite sequencing reads onto DNA and analyzes DNA methylation. The strategy used by this software consists of leveraging the speed of the Burrows-Wheeler Transform to map a large number of DNA fragments (reads) rapidly, as well as the accuracy of the Smith-Waterman algorithm, which is employed exclusively to deal with the most ambiguous and shortest reads. Experimental results on platforms with Intel multicore processors show that HPG-Methyl significantly outperforms state-of-the-art software such as Bismark, BS-Seeker or BSMAP in both execution time and sensitivity, particularly for long bisulphite reads. The software is distributed as C libraries and functions, together with instructions to compile and execute it, and is available by sftp to anonymous@clariano.uv.es (password 'anonymous'). juan.orduna@uv.es or jdopazo@cipf.es. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
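
    HPG-Methyl's actual implementation is distributed as C libraries; as a language-agnostic illustration of the exact-but-slow routine it reserves for short, ambiguous reads, the sketch below computes a plain Smith-Waterman local-alignment score against a C-to-T converted reference, with illustrative sequences and scoring parameters.

    ```python
    # Minimal Smith-Waterman local-alignment score; not HPG-Methyl's actual C code.
    def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
        rows, cols = len(a) + 1, len(b) + 1
        H = [[0] * cols for _ in range(rows)]
        best = 0
        for i in range(1, rows):
            for j in range(1, cols):
                diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
                H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
                best = max(best, H[i][j])
        return best

    # In bisulphite data, unmethylated C reads as T, so reads are commonly compared
    # against a C->T converted reference before alignment.
    reference = "ACGTCGATCGT".replace("C", "T")
    read = "ATGTTGAT"
    print(smith_waterman(read, reference))
    ```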

  3. The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.

    PubMed

    Zamawe, F C

    2015-03-01

    For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) packages are increasingly being developed. Although CAQDAS has been available for decades, very few qualitative health researchers report using it. This may be due to the difficulties that one has to go through to master the software and the misconceptions that are associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with it. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS packages (NVivo) in order to provide evidence-based implications of using the software. The key message is that, unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must equally know that no software can analyse qualitative data. CAQDAS packages are basically data management tools, which support the researcher during analysis.

  4. Epidemiology of Accidents and Traumas in Qom Province in 2010

    PubMed Central

    Karami Joushin, Moharram; Saghafipour, Abedin; Noroozi, Mehdi; Soori, Hamid; Khedmati Morasae, Esmaeil

    2013-01-01

    Background Accidents are among the most important public health challenges in our society. To prevent accidents, the identification of their epidemiological features seems necessary. Objectives This study was conducted to reveal the epidemiological features of accidents and their casualties in Qom province in 2010. Patients and Methods A cross-sectional study was conducted on 29426 injured people referred to Qom province hospitals in 2010. Information about place, time, type of accidents and traumas and demographic variables had been collected in a veteran hospital. Data were analyzed by SPSS (version 16) software, using the chi-square test and logistic regression. Results The incidence of accidents was about 27/1000 per year. The incidences of traffic accidents, motorcycle accidents, violence, burns, poisoning and suicides were 3, 1.6, 1.2, 0.3, 0.8 and 0.37 cases per 1000 people, respectively. Strikes (65%) and falls (12%) were the main causes of traumas. Forty-six percent of all injuries occurred in the 16 - 30 years age group. The most frequent accidents by age group were as follows: falls (97%) and strikes (50%) in those under 12 years, violence (46%) in the 20 - 29 group, suicide (71%) in the 15 - 29 group, and poisoning (34%) and burns (20%) among children under 5 years old. Pedestrian and motorcycle accidents among people over 60 years old were significantly more frequent than in other age groups (P = 0.000). The odds ratio for suicide among females was about 3.36, and the odds of suicide in the 16 - 30 age group were 15.7 times those of the over-60 group (P = 0.000). Conclusions Most traumas in Qom province occurred among younger age groups, and strikes and falls were the main causes of such traumas. Therefore, safety measures to prevent falls and traffic regulations to reduce strikes can be effective strategies. PMID:24693520

  5. Epidemiology of accidents and traumas in qom province in 2010.

    PubMed

    Karami Joushin, Moharram; Saghafipour, Abedin; Noroozi, Mehdi; Soori, Hamid; Khedmati Morasae, Esmaeil

    2013-12-01

    Accidents are among the most important public health challenges in our society. To prevent accidents, the identification of their epidemiological features seems necessary. This study was conducted to reveal the epidemiological features of accidents and their casualties in Qom province in 2010. A cross-sectional study was conducted on 29426 injured people referred to Qom province hospitals in 2010. Information about place, time, type of accidents and traumas and demographic variables had been collected in a veteran hospital. Data were analyzed by SPSS (version 16) software, using the chi-square test and logistic regression. The incidence of accidents was about 27/1000 per year. The incidences of traffic accidents, motorcycle accidents, violence, burns, poisoning and suicides were 3, 1.6, 1.2, 0.3, 0.8 and 0.37 cases per 1000 people, respectively. Strikes (65%) and falls (12%) were the main causes of traumas. Forty-six percent of all injuries occurred in the 16 - 30 years age group. The most frequent accidents by age group were as follows: falls (97%) and strikes (50%) in those under 12 years, violence (46%) in the 20 - 29 group, suicide (71%) in the 15 - 29 group, and poisoning (34%) and burns (20%) among children under 5 years old. Pedestrian and motorcycle accidents among people over 60 years old were significantly more frequent than in other age groups (P = 0.000). The odds ratio for suicide among females was about 3.36, and the odds of suicide in the 16 - 30 age group were 15.7 times those of the over-60 group (P = 0.000). Most traumas in Qom province occurred among younger age groups, and strikes and falls were the main causes of such traumas. Therefore, safety measures to prevent falls and traffic regulations to reduce strikes can be effective strategies.
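
    As an illustration of the kind of odds-ratio estimate reported above, the sketch below computes an odds ratio with a Woolf 95% confidence interval from a 2x2 table; the counts are hypothetical and not taken from the Qom dataset.

    ```python
    import math

    def odds_ratio(a, b, c, d):
        """2x2 table: a=exposed cases, b=exposed non-cases, c=unexposed cases, d=unexposed non-cases."""
        or_ = (a * d) / (b * c)
        se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)      # Woolf's method
        lo = math.exp(math.log(or_) - 1.96 * se_log)
        hi = math.exp(math.log(or_) + 1.96 * se_log)
        return or_, lo, hi

    # Hypothetical counts, not taken from the study
    print(odds_ratio(40, 260, 15, 310))
    ```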

  6. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  7. Impact of traffic congestion on road accidents: a spatial analysis of the M25 motorway in England.

    PubMed

    Wang, Chao; Quddus, Mohammed A; Ison, Stephen G

    2009-07-01

    Traffic congestion and road accidents are two external costs of transport, and the reduction of their impacts is often one of the primary objectives for transport policy makers. The relationship between traffic congestion and road accidents, however, is less apparent and less studied. It is speculated that there may be an inverse relationship between traffic congestion and road accidents, and as such this poses a potential dilemma for transport policy makers. This study aims to explore the impact of traffic congestion on the frequency of road accidents using a spatial analysis approach, while controlling for other relevant factors that may affect road accidents. The M25 London orbital motorway, divided into 70 segments, was chosen for this study, and relevant data on road accidents, traffic and road characteristics were collected. A robust technique has been developed to map M25 accidents onto its segments. Since existing studies have often used a proxy to measure the level of congestion, this study has employed a precise congestion measurement. A series of Poisson-based non-spatial (such as Poisson-lognormal and Poisson-gamma) and spatial (Poisson-lognormal with conditional autoregressive priors) models have been used to account for the effects of both heterogeneity and spatial correlation. The results suggest that traffic congestion has little or no impact on the frequency of road accidents on the M25 motorway. All other relevant factors have provided results consistent with existing studies.
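
    The study's spatial Poisson-lognormal models with conditional autoregressive priors require dedicated Bayesian tooling; as a simplified, non-spatial illustration of accident-frequency modelling, the sketch below fits a plain Poisson GLM to synthetic segment data (generated here with no congestion effect, echoing the paper's finding).

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n_segments = 70
    congestion = rng.uniform(0, 1, n_segments)       # synthetic congestion index per segment
    aadt = rng.uniform(50, 200, n_segments)          # synthetic traffic volume (thousands of vehicles)
    # Synthetic counts generated with no congestion effect
    counts = rng.poisson(np.exp(-3 + 0.02 * aadt))

    X = sm.add_constant(np.column_stack([congestion, aadt]))
    model = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
    print(model.summary())
    ```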

  8. Complaints against doctors in an accident and emergency department: a 10-year analysis.

    PubMed Central

    Kadzombe, E A; Coals, J

    1992-01-01

    We carried out an analysis of complaints against doctors in our Accident and Emergency Department received from 1 January 1979 to 31 December 1988. There were 66 complainants in all, comprising 37 relatives, 21 patients and eight persons acting in a professional capacity. The majority of complaints (80 out of 125) were about poor communication and dissatisfaction with diagnosis and treatment. A small number of complainants had unrealistic expectations of the Accident and Emergency service. A total of 83.3% of complaints were against Senior House Officers who saw 61.3% of all patients. We concluded that an improvement in the communicative, diagnostic and therapeutic skills of doctors would minimize justified complaints. PMID:1388487

  9. XMM-Newton Remote Interface to Science Analysis Software: First Public Version

    NASA Astrophysics Data System (ADS)

    Ibarra, A.; Gabriel, C.

    2011-07-01

    We present the first public beta release of the XMM-Newton Remote Interface to Science Analysis (RISA) software, available through the official XMM-Newton web pages. In a nutshell, RISA is a web-based application that encapsulates the XMM-Newton data analysis software. The client identifies observations and creates XMM-Newton workflows. The server processes the client request, creates job templates and sends the jobs to a computer. RISA has been designed to help both non-expert and professional XMM-Newton users. Thanks to the predefined threads, non-expert users can easily produce light curves and spectra. Expert users, on the other hand, can use the full parameter interface to tune their own analysis. In both cases, the VO-compliant client/server design frees the users from having to install any specific software to analyze XMM-Newton data.

  10. Analysis of 320 coal mine accidents using structural equation modeling with unsafe conditions of the rules and regulations as exogenous variables.

    PubMed

    Zhang, Yingyu; Shao, Wei; Zhang, Mengjia; Li, Hejun; Yin, Shijiu; Xu, Yingjun

    2016-07-01

    Mining has historically been considered a naturally high-risk industry worldwide. In China, deaths caused by coal mine accidents exceed those caused by all other types of accidents combined. Statistics of 320 coal mine accidents in Shandong province show that all accidents contain indicators of "unsafe conditions of the rules and regulations," with a frequency of 1590, accounting for 74.3% of the total frequency of 2140. "Unsafe behaviors of the operator" is another important contributory factor, which mainly includes "operator error" and "venturing into dangerous places." A systems analysis approach was applied by using structural equation modeling (SEM) to examine the interactions between the contributory factors of coal mine accidents. The analysis of results leads to three conclusions. (i) "Unsafe conditions of the rules and regulations" affect the "unsafe behaviors of the operator," "unsafe conditions of the equipment," and "unsafe conditions of the environment." (ii) The three influencing factors of coal mine accidents (with the frequency of effect relation in descending order) are "lack of safety education and training," "rules and regulations of safety production responsibility," and "rules and regulations of supervision and inspection." (iii) The three influenced factors (with the frequency in descending order) of coal mine accidents are "venturing into dangerous places," "poor workplace environment," and "operator error." Copyright © 2016 Elsevier Ltd. All rights reserved.
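
    Fitting the full structural equation model requires a dedicated SEM package; the sketch below only illustrates the preparatory step of tabulating contributory-factor frequencies and their pairwise associations from hypothetical accident codings.

    ```python
    import pandas as pd

    # Hypothetical accident coding: one row per accident, 1 = factor present
    records = pd.DataFrame({
        "unsafe_rules":       [1, 1, 0, 1, 1],
        "unsafe_behaviour":   [1, 0, 1, 1, 0],
        "unsafe_equipment":   [0, 1, 0, 1, 0],
        "unsafe_environment": [0, 0, 1, 1, 1],
    })

    print(records.sum())      # frequency of each contributory factor
    print(records.corr())     # pairwise association, a crude precursor to the SEM paths
    ```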

  11. Analysis of fatal accidents with tractors in the Centre of Portugal: Ten years analysis.

    PubMed

    Antunes, Soraia M; Cordeiro, Cristina; Teixeira, Helena M

    2018-06-01

    Tractors have been described as one of the deadliest implements in agricultural activity. In Portugal, scientific investigations of this problem are practically non-existent; the only statistical studies have been performed by entities related to road traffic safety and are not comparable with the study now performed, which points to the possibility of underreporting of these accidents. This work aims to characterize the fatal tractor accidents in Portugal autopsied at the Forensic Pathology Department of the Centre Branch of the National Institute of Legal Medicine and Forensic Sciences of Portugal, analysing several variables: gender, age, occupation, survival time, the victim's position on the tractor, cause of death, toxicological and histological exams, year/month/day of the week, type of agricultural machine, existence of rollover protective structures (ROPS), type of accident, ground conditions, circumstantial information and geographic distribution of the accidents. All the autopsies between 2005 and 2014 were analysed. The victim profile corresponded to a man (89.5%), between 61 and 70 years old (33.3%), retired (43.9%), driving the tractor (45.6%). In most of the cases, death occurred less than 24 h after the accident. These fatalities arose mainly in May and October. Rollover on sloping land was the most common type of accident, and cranioencephalic, thoracic and abdominal traumatic injuries were the main causes of death. In 16.2% of the cases, the blood alcohol concentration was above the limit established in the Portuguese road traffic law (0.5 g/L). Information about the use of ROPS was lacking in 95.9% of cases, and even when ROPS existed, they were not used or were used incorrectly. This is the first national study involving the description of the forensic findings in each autopsy related to tractor accidents, and the corresponding circumstances that contributed to the death. Many barriers remain about this matter, but the Portuguese

  12. Analysis of the FeCrAl Accident Tolerant Fuel Concept Benefits during BWR Station Blackout Accidents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robb, Kevin R

    2015-01-01

    Iron-chromium-aluminum (FeCrAl) alloys are being considered for fuel concepts with enhanced accident tolerance. FeCrAl alloys have very slow oxidation kinetics and good strength at high temperatures. FeCrAl could be used for fuel cladding in light water reactors and/or as channel box material in boiling water reactors (BWRs). To estimate the potential safety gains afforded by the FeCrAl concept, the MELCOR code was used to analyze a range of postulated station blackout severe accident scenarios in a BWR/4 reactor employing FeCrAl. The simulations utilize the most recently known thermophysical properties and oxidation kinetics for FeCrAl. Overall, when compared to the traditional Zircaloy-based cladding and channel box, the FeCrAl concept provides a few extra hours of time for operators to take mitigating actions and/or for evacuations to take place. A coolable core geometry is retained longer, enhancing the ability to stabilize an accident. Finally, due to the slower oxidation kinetics, substantially less hydrogen is generated, and the generation is delayed in time. This decreases the amount of non-condensable gases in containment and the potential for deflagrations to inhibit the accident response.
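
    The MELCOR analysis itself uses detailed oxidation-kinetics correlations that are not reproduced in this record; the sketch below only illustrates the general form of a parabolic oxidation law with an Arrhenius rate constant, using purely illustrative constants rather than measured FeCrAl or Zircaloy data.

    ```python
    import numpy as np

    def parabolic_mass_gain(t_s, T_K, A, Q):
        """Delta m^2 = kp * t, with an Arrhenius rate constant kp = A * exp(-Q / (R * T))."""
        R = 8.314                      # J/(mol K)
        kp = A * np.exp(-Q / (R * T_K))
        return np.sqrt(kp * t_s)

    t = np.linspace(0, 4 * 3600, 50)   # 4 hours
    # Illustrative constants only; NOT measured FeCrAl or Zircaloy kinetics
    gain_slow = parabolic_mass_gain(t, 1500, A=1e2, Q=3.0e5)
    gain_fast = parabolic_mass_gain(t, 1500, A=1e2, Q=2.0e5)
    print(gain_fast[-1] / gain_slow[-1])    # ratio of mass gains after 4 h
    ```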

  13. The ESA's Space Trajectory Analysis software suite

    NASA Astrophysics Data System (ADS)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called the "Space Trajectory Analysis" or STA. This article describes the birth of STA and its present configuration. One of the STA aims is to promote the exchange of technical ideas, and raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at University level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories. These include among others ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, rendezvous trajectories, etc. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at University level. As research and education software applicable to academia, the suite is supported by a number of universities that have joined ESA in leading the development. The partnership between ESA and the universities is expressed in the STA Steering Board. Together with ESA, each university has a chair on the board whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of Solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft where orbital propagators are included. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and
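
    STA's own propagators are far more capable than this; as a minimal illustration of the kind of orbit propagation described, the sketch below integrates the two-body equations of motion for a roughly circular low Earth orbit (the initial state is illustrative).

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    MU_EARTH = 3.986004418e14          # m^3/s^2

    def two_body(t, y):
        r = y[:3]
        a = -MU_EARTH * r / np.linalg.norm(r) ** 3
        return np.concatenate([y[3:], a])

    # Roughly circular orbit at 7000 km radius; v = sqrt(mu/r) ~ 7546 m/s
    y0 = np.array([7.0e6, 0.0, 0.0, 0.0, 7546.0, 0.0])
    period = 5828.6                    # ~ 2*pi*sqrt(r^3/mu), seconds
    sol = solve_ivp(two_body, (0.0, period), y0, rtol=1e-9, atol=1e-9,
                    t_eval=np.linspace(0.0, period, 10))
    print(sol.y[:3, -1])               # position after approximately one orbital period
    ```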

  14. Freud: a software suite for high-throughput simulation analysis

    NASA Astrophysics Data System (ADS)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too do the size of the data produced and the difficulty of analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
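
    Freud's own routines are parallel C++ reached through a Python interface; the sketch below is an unrelated, minimal NumPy implementation of the radial distribution function for a periodic cubic box, evaluated on an ideal-gas-like random configuration, just to illustrate the quantity being computed.

    ```python
    import numpy as np

    def rdf(positions, box, r_max, n_bins=50):
        """Radial distribution function g(r) for particles in a cubic periodic box."""
        n = len(positions)
        diffs = positions[:, None, :] - positions[None, :, :]
        diffs -= box * np.round(diffs / box)                 # minimum-image convention
        dists = np.linalg.norm(diffs, axis=-1)[np.triu_indices(n, k=1)]
        hist, edges = np.histogram(dists, bins=n_bins, range=(0, r_max))
        shell_vol = 4 / 3 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
        density = n / box ** 3
        ideal = density * shell_vol * n / 2                  # expected pair counts for an ideal gas
        return 0.5 * (edges[1:] + edges[:-1]), hist / ideal

    rng = np.random.default_rng(1)
    pos = rng.uniform(0, 10.0, size=(500, 3))                # ideal-gas-like configuration
    r, g = rdf(pos, box=10.0, r_max=4.0)
    print(g.round(2))                                        # fluctuates around 1 for random positions
    ```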

  15. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and to predict failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, thereby eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every desired shape change. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  16. What can the drivers' own description from combined sources provide in an analysis of driver distraction and low vigilance in accident situations?

    PubMed

    Tivesten, Emma; Wiberg, Henrik

    2013-03-01

    Accident data play an important role in vehicle safety development. Accident data sources are generally limited in terms of how much information is provided on driver states and behaviour prior to an accident. However, the precise limitations vary between databases, due to differences in analysis focus and data collection procedures between organisations. If information about a specific accident can be retrieved from more than one data source, it should be possible to combine the available information sets so that data from one source compensate for limitations in the other(s). To investigate the viability of such compensation, this study identified a set of accidents recorded in two different data sources. The first data source investigated was an accident mail survey, and the second was insurance claims documents consisting predominantly of claims completed by the involved road users. An analysis of survey variables was compared to a case analysis including word data derived from the same survey and filed insurance claims documents. For each accident, the added value of having access to more than one source of information was assessed. To limit the scope of this study, three particular topics were investigated: available information on low vigilance (e.g., being drowsy, ill); secondary task distraction (e.g., talking with passengers, mobile phone use); and distraction related to the driving task (e.g., looking for approaching vehicles). The results suggest that for low vigilance and secondary task distraction, a combination of the mail survey and insurance claims documents provides more reliable and detailed pre-crash information than survey variables alone. However, driving-related distraction appears to be more difficult to capture. In order to gain a better understanding of the above issues and how frequently they occur in accidents, the data sources and analysis methods suggested here may be combined with other investigation methods such

  17. CAMBerVis: visualization software to support comparative analysis of multiple bacterial strains.

    PubMed

    Woźniak, Michał; Wong, Limsoon; Tiuryn, Jerzy

    2011-12-01

    A number of inconsistencies in genome annotations are documented among bacterial strains. Visualization of the differences may help biologists to make correct decisions in spurious cases. We have developed a visualization tool, CAMBerVis, to support comparative analysis of multiple bacterial strains. The software manages simultaneous visualization of multiple bacterial genomes, enabling visual analysis focused on genome structure annotations. The CAMBerVis software is freely available at the project website: http://bioputer.mimuw.edu.pl/camber. Input datasets for Mycobacterium tuberculosis and Staphylococcus aureus are integrated with the software as examples. m.wozniak@mimuw.edu.pl Supplementary data are available at Bioinformatics online.

  18. New Results in Software Model Checking and Analysis

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.

    2010-01-01

    This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.

  19. Validating the Effectiveness of Switching the Vancomycin TDM Analysis Software Based on the Predictive Accuracy.

    PubMed

    Imai, Shungo; Yamada, Takehiro; Ishiguro, Nobuhisa; Miyamoto, Takenori; Kagami, Keisuke; Tomiyama, Naoki; Niinuma, Yusuke; Nagasaki, Daisuke; Suzuki, Koji; Yamagami, Akira; Kasashi, Kumiko; Kobayashi, Masaki; Iseki, Ken

    2017-01-01

    Based on the predictive performance in our previous study, we switched the therapeutic drug monitoring (TDM) analysis software for dose setting of vancomycin (VCM) from "Vancomycin MEEK TDM analysis software Ver2.0" (MEEK) to "SHIONOGI-VCM-TDM ver.2009" (VCM-TDM) in January 2015. In the present study, our aim was to validate the effectiveness of changing the VCM TDM analysis software for the initial dose setting of VCM. The enrolled patients were divided into two groups of 162 patients each, who received VCM with the initial dose set using MEEK (MEEK group) or VCM-TDM (VCM-TDM group). We compared the rates of attaining the therapeutic range (trough value; 10-20 μg/mL) of serum VCM concentration between the groups. Multivariate logistic regression analysis was performed to confirm that changing the VCM TDM analysis software was an independent factor related to attaining the therapeutic range. Switching the VCM TDM analysis software from MEEK to VCM-TDM improved the rate of attaining the therapeutic range by 21.6% (MEEK group: 42.6% vs. VCM-TDM group: 64.2%, p<0.01). Patient age ≥65 years, concomitant medication (furosemide) and use of the VCM-TDM analysis software were considered to be independent factors for attaining the therapeutic range. These results demonstrate the effectiveness of switching the VCM TDM analysis software from MEEK to VCM-TDM for the initial dose setting of VCM.

  20. Squeal Those Tires! Automobile-Accident Reconstruction.

    ERIC Educational Resources Information Center

    Caples, Linda Griffin

    1992-01-01

    Methods used to reconstruct traffic accidents provide settings for real-life applications for students in precalculus, mathematical analysis, or trigonometry. Described is the investigation of an accident, in conjunction with the local Highway Patrol Academy, integrating physics, vectors, and trigonometry. Class findings were compared with those of…
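
    A typical classroom calculation of this kind estimates pre-braking speed from skid-mark length via v = sqrt(2 * mu * g * d); the sketch below applies that formula with an illustrative friction coefficient (the specific numbers are not from the described investigation).

    ```python
    import math

    def speed_from_skid(skid_length_m, friction_coeff):
        """Pre-braking speed estimate from skid-mark length: v = sqrt(2 * mu * g * d)."""
        g = 9.81
        return math.sqrt(2 * friction_coeff * g * skid_length_m)

    v = speed_from_skid(30.0, 0.7)        # 30 m of skid marks on dry asphalt (mu ~ 0.7)
    print(f"{v:.1f} m/s = {v * 3.6:.1f} km/h")
    ```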

  1. Review of U.S. Army Unmanned Aerial Systems Accident Reports: Analysis of Human Error Contributions

    DTIC Science & Technology

    2018-03-20

    USAARL Report No. 2018-08, Review of U.S. Army Unmanned Aerial Systems Accident Reports: Analysis of Human Error Contributions, by Kathryn A... The success of unmanned aerial systems (UAS) operations relies upon a variety of factors, including, but not limited to

  2. Volumetric neuroimage analysis extensions for the MIPAV software package.

    PubMed

    Bazin, Pierre-Louis; Cuzzocreo, Jennifer L; Yassa, Michael A; Gandler, William; McAuliffe, Matthew J; Bassett, Susan S; Pham, Dzung L

    2007-09-15

    We describe a new collection of publicly available software tools for performing quantitative neuroimage analysis. The tools perform semi-automatic brain extraction, tissue classification, Talairach alignment, and atlas-based measurements within a user-friendly graphical environment. They are implemented as plug-ins for MIPAV, a freely available medical image processing software package from the National Institutes of Health. Because the plug-ins and MIPAV are implemented in Java, both can be utilized on nearly any operating system platform. In addition to the software plug-ins, we have also released a digital version of the Talairach atlas that can be used to perform regional volumetric analyses. Several studies are conducted applying the new tools to simulated and real neuroimaging data sets.

  3. [Forensic Analysis of 498 Road Traffic Accident Deaths in Haikou City].

    PubMed

    Bai, R; Chen, M

    2017-12-01

    This study analysed the characteristics of road traffic accident deaths in Haikou city to provide a reference for the identification of causes of death and for preventive measures. A total of 498 road traffic accident deaths handled by the Traffic Police Branch of the Haikou City Public Security Bureau in 2014-2016 were collected, and related parameters such as sex, age, time of the accidents, travel mode of the victims, types of vehicle involved and cause of death were analysed. Most victims were aged 21-40 years, with a sex ratio of 3:1, and the accidents mainly happened in March, April, May and October, peaking at 6:01-8:00 and 20:01-22:00 each day. Riding a motorbike or electric bicycle was the travel mode with the highest accident incidence (30.9%). The vast majority of the vehicles involved were motorbikes and electric bicycles (57.4%). The most common cause of death was craniocerebral injury, followed by chest and abdominal injury. The autopsy of road traffic accident deaths plays an important role in identifying the manner of death and confirming responsibility. Copyright© by the Editorial Department of Journal of Forensic Medicine

  4. Data and Analysis Center for Software: An IAC in Transition.

    DTIC Science & Technology

    1983-06-01

    Report documentation fragments: RADC Project Engineer: John Palaimo (COEE). Approved by Colonel John J. Marciniak, USAF, Chief, Command and Control Division. Keywords: Software Engineering, Software Technology, Information Analysis Center, Database, Scientific and Technical Information.

  5. Thermodynamic analysis of cesium and iodine behavior in severe light water reactor accidents

    NASA Astrophysics Data System (ADS)

    Minato, Kazuo

    1991-11-01

    In order to understand the release and transport behavior of cesium (Cs) and iodine (I) in severe light water reactor accidents, chemical forms of Cs and I in steam-hydrogen mixtures were analyzed thermodynamically. In the calculations reactions of boron (B) with Cs were taken into consideration. The analysis showed that B plays an important role in determining chemical forms of Cs. The main Cs-containing species are CsBO 2(g) and CsBO 2(l), depending on temperature. The contribution of CsOH(g) is minor. The main I-containing species are HI(g) and CsI(g) over the wide ranges of the parameters considered. Calculations were also carried out under the conditions of the Three Mile Island Unit 2 accident.

  6. The Role of Data Analysis Software in Graduate Programs in Education and Post-Graduate Research

    ERIC Educational Resources Information Center

    Harwell, Michael

    2018-01-01

    The importance of data analysis software in graduate programs in education and post-graduate educational research is self-evident. However the role of this software in facilitating supererogated statistical practice versus "cookbookery" is unclear. The need to rigorously document the role of data analysis software in students' graduate…

  7. Risk assessment of maintenance operations: the analysis of performing task and accident mechanism.

    PubMed

    Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos; Guadix, Jose; Onieva, Luis

    2015-01-01

    Maintenance operations cover a great number of occupations. Most small and medium-sized enterprises lack the appropriate information to conduct risk assessments of maintenance operations. The objective of this research is to provide a method based on the concepts of task and accident mechanisms for an initial risk assessment by taking into consideration the prevalence and severity of the maintenance accidents reported. Data were gathered from 11,190 reported accidents in maintenance operations in the manufacturing sector of Andalusia from 2003 to 2012. By using a semi-quantitative methodology, likelihood and severity were evaluated based on the actual distribution of accident mechanisms in each of the tasks. Accident mechanisms and tasks were identified by using those variables included in the European Statistics of Accidents at Work methodology. As main results, the estimated risk of the most frequent accident mechanisms identified for each of the analysed tasks is low and the only accident mechanisms with medium risk are accidents when lifting or pushing with physical stress on the musculoskeletal system in tasks involving carrying, and impacts against objects after slipping or stumbling for tasks involving movements. The prioritisation of public preventive actions for the accident mechanisms with a higher estimated risk is highly recommended.
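
    As a minimal illustration of a semi-quantitative risk estimate of the kind described (a likelihood level combined with a severity level), the sketch below multiplies ordinal scores and maps them to risk levels; the scales, thresholds, and examples are illustrative and not those of the study.

    ```python
    # Illustrative ordinal scales for a semi-quantitative risk matrix
    LIKELIHOOD = {"rare": 1, "occasional": 2, "frequent": 3}
    SEVERITY = {"minor": 1, "serious": 2, "fatal": 3}

    def risk_level(likelihood, severity):
        score = LIKELIHOOD[likelihood] * SEVERITY[severity]
        return "low" if score <= 2 else "medium" if score <= 4 else "high"

    print(risk_level("frequent", "minor"))     # e.g., overexertion while lifting -> medium
    print(risk_level("rare", "serious"))       # -> low
    ```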

  8. OASIS - ORBIT ANALYSIS AND SIMULATION SOFTWARE

    NASA Technical Reports Server (NTRS)

    Wu, S. C.

    1994-01-01

    The Orbit Analysis and Simulation Software, OASIS, is a software system developed for covariance and simulation analyses of problems involving earth satellites, especially the Global Positioning System (GPS). It provides a flexible, versatile and efficient accuracy analysis tool for earth satellite navigation and GPS-based geodetic studies. To make future modifications and enhancements easy, the system is modular, with five major modules: PATH/VARY, REGRES, PMOD, FILTER/SMOOTHER, and OUTPUT PROCESSOR. PATH/VARY generates satellite trajectories. Among the factors taken into consideration are: 1) the gravitational effects of the planets, moon and sun; 2) space vehicle orientation and shapes; 3) solar pressure; 4) solar radiation reflected from the surface of the earth; 5) atmospheric drag; and 6) space vehicle gas leaks. The REGRES module reads the user's input, then determines if a measurement should be made based on geometry and time. PMOD modifies a previously generated REGRES file to facilitate various analysis needs. FILTER/SMOOTHER is especially suited to a multi-satellite precise orbit determination and geodetic-type problems. It can be used for any situation where parameters are simultaneously estimated from measurements and a priori information. Examples of nonspacecraft areas of potential application might be Very Long Baseline Interferometry (VLBI) geodesy and radio source catalogue studies. OUTPUT PROCESSOR translates covariance analysis results generated by FILTER/SMOOTHER into user-desired easy-to-read quantities, performs mapping of orbit covariances and simulated solutions, transforms results into different coordinate systems, and computes post-fit residuals. The OASIS program was developed in 1986. It is designed to be implemented on a DEC VAX 11/780 computer using VAX VMS 3.7 or higher. It can also be implemented on a Micro VAX II provided sufficient disk space is available.

  9. Accident Precursor Analysis and Management: Reducing Technological Risk Through Diligence

    NASA Technical Reports Server (NTRS)

    Phimister, James R. (Editor); Bier, Vicki M. (Editor); Kunreuther, Howard C. (Editor)

    2004-01-01

    Almost every year there is at least one technological disaster that highlights the challenge of managing technological risk. On February 1, 2003, the space shuttle Columbia and her crew were lost during reentry into the atmosphere. In the summer of 2003, there was a blackout that left millions of people in the northeast United States without electricity. Forensic analyses, congressional hearings, investigations by scientific boards and panels, and journalistic and academic research have yielded a wealth of information about the events that led up to each disaster, and questions have arisen. Why were the events that led to the accident not recognized as harbingers? Why were risk-reducing steps not taken? This line of questioning is based on the assumption that signals before an accident can and should be recognized. To examine the validity of this assumption, the National Academy of Engineering (NAE) undertook the Accident Precursors Project in February 2003. The project was overseen by a committee of experts from the safety and risk-sciences communities. Rather than examining a single accident or incident, the committee decided to investigate how different organizations anticipate and assess the likelihood of accidents from accident precursors. The project culminated in a workshop held in Washington, D.C., in July 2003. This report includes the papers presented at the workshop, as well as findings and recommendations based on the workshop results and committee discussions. The papers describe precursor strategies in aviation, the chemical industry, health care, nuclear power and security operations. In addition to current practices, they also address some areas for future research.

  10. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    ERIC Educational Resources Information Center

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…
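
    As a simple point of comparison for such package checks, the sketch below runs a one-way fixed-effects ANOVA with SciPy on synthetic groups; the mixed-effects, ANCOVA, and nested designs examined in the article would require fuller model specifications.

    ```python
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(2)
    # Three synthetic treatment groups with slightly different means
    g1, g2, g3 = rng.normal(10, 2, 20), rng.normal(11, 2, 20), rng.normal(12, 2, 20)
    F, p = f_oneway(g1, g2, g3)
    print(f"F = {F:.2f}, p = {p:.4f}")
    ```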

  11. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation while the original specification and perhaps high-level design are non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems analysis and design logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
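
    A toy illustration of the second approach, an object whose time-behaviour is defined purely by states and state-transition rules, might look as follows; the Valve class and its events are hypothetical, not taken from the methodology itself.

    ```python
    # A toy "real-time systems-analysis object": behaviour defined by states and transition rules.
    class Valve:
        TRANSITIONS = {("closed", "open_cmd"): "opening",
                       ("opening", "fully_open"): "open",
                       ("open", "close_cmd"): "closing",
                       ("closing", "fully_closed"): "closed"}

        def __init__(self):
            self.state = "closed"

        def handle(self, event):
            # Unknown (state, event) pairs leave the state unchanged
            self.state = self.TRANSITIONS.get((self.state, event), self.state)
            return self.state

    v = Valve()
    for e in ["open_cmd", "fully_open", "close_cmd", "fully_closed"]:
        print(e, "->", v.handle(e))
    ```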

  12. Software selection based on analysis and forecasting methods, practised in 1C

    NASA Astrophysics Data System (ADS)

    Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.

    2015-09-01

    The research focuses on the built-in mechanisms of the "1C: Enterprise 8" platform for data analysis and forecasting. It is important to evaluate and select proper software to develop effective strategies for customer relationship management in terms of sales, as well as for the implementation and further maintenance of software. The research data allow the creation of new forecast models to schedule further software distribution.

  13. Comprehensive analysis of atmospheric radionuclides just after the Fukushima accident

    NASA Astrophysics Data System (ADS)

    Tsuruta, Haruo; Oura, Yasuji; Ebihara, Mitsuru; Ohara, Toshimasa; Moriguchi, Yuichi; Nakajima, Teruyuki

    2017-04-01

    Even six years after the Fukushima Daiichi Nuclear Power Plant (FD1NPP) accident, we still have large uncertainties in atmospheric transport and deposition models, in the estimated release rates of the source terms, and in the internal exposure from inhalation. To improve our understanding and reduce these uncertainties, we thoroughly analyzed all the published data on radionuclides such as Cs-137, I-131 and Xe-133, and on radiation dose rates, at many monitoring sites in eastern Japan. We also retrieved the spatio-temporal distributions of Cs-137 just after the accident by using the unique dataset of hourly radionuclides in atmospheric aerosols collected on the used filter tapes installed in the suspended particulate matter (SPM) monitors operated at more than 100 stations in the air pollution monitoring network of Japan. The most important findings are summarized as follows. Analyzing the hourly Cs-137 concentrations at two SPM stations located within 20 km of the FD1NPP, we revealed the complicated behavior of plumes and atmospheric radionuclides near the FD1NPP just after the accident. The transport pathways to the northwestern and northern areas from the FD1NPP are clarified, especially for March 12-21, 2011. Analysis of the published data clearly shows that the atmospheric ratio of I-131/Cs-137 (=R) was mainly divided into two groups, one (R≦10) for the plumes before March 21, 2011, and the other (R>100) for those after that day. These two groups are consistent across all the measured sites, whether the sites are in Fukushima prefecture or in the Tokyo Metropolitan area. These results are expected to partially identify the source term for each plume.

  14. The software application and classification algorithms for welds radiograms analysis

    NASA Astrophysics Data System (ADS)

    Sikora, R.; Chady, T.; Baniukiewicz, P.; Grzywacz, B.; Lopato, P.; Misztal, L.; Napierała, L.; Piekarczyk, B.; Pietrusewicz, T.; Psuj, G.

    2013-01-01

    The paper presents a software implementation of an Intelligent System for Radiogram Analysis (ISAR). The system has to support radiologists in weld quality inspection. The image processing part of the software, with a graphical user interface, and the weld classification part are described together with selected classification results. Classification was based on several algorithms: an artificial neural network, k-means clustering, a simplified k-means, and rough sets theory.
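
    As a minimal illustration of one of the listed classifiers, the sketch below clusters hypothetical two-feature weld-indication measurements with k-means; the real ISAR features and training data are not reproduced here.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    # Hypothetical per-indication features extracted from weld radiograms,
    # e.g., [area, mean grey level]; not the actual ISAR feature set.
    defects = np.vstack([rng.normal([20, 0.3], [5, 0.05], (30, 2)),
                         rng.normal([80, 0.7], [10, 0.05], (30, 2))])
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(defects)
    print(np.bincount(labels))      # cluster sizes
    ```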

  15. Development of a bespoke human factors taxonomy for gliding accident analysis and its revelations about highly inexperienced UK glider pilots.

    PubMed

    Jarvis, Steve; Harris, Don

    2010-02-01

    Low-hours solo glider pilots have a high risk of accidents compared to more experienced pilots. Numerous taxonomies for causal accident analysis have been produced for powered aviation but none of these is suitable for gliding, so a new taxonomy was required. A human factors taxonomy specifically for glider operations was developed and used to analyse all UK gliding accidents from 2002 to 2006 for their overall causes as well as factors specific to low hours pilots. Fifty-nine categories of pilot-related accident causation emerged, which were formed into progressively larger categories until four overall human factors groups were arrived at: 'judgement'; 'handling'; 'strategy'; 'attention'. 'Handling' accounted for a significantly higher proportion of injuries than other categories. Inexperienced pilots had considerably more accidents in all categories except 'strategy'. Approach control (path judgement, airbrake and speed handling) as well as landing flare misjudgement were chiefly responsible for the high accident rate in early solo glider pilots. Statement of Relevance: This paper uses extant accident data to produce a taxonomy of underlying human factors causes to analyse gliding accidents and identify the specific causes associated with low hours pilots. From this specific, well-targeted remedial measures can be identified.

  16. [Spatial analysis of road traffic accidents with fatalities in Spain, 2008-2011].

    PubMed

    Gómez-Barroso, Diana; López-Cuadrado, Teresa; Llácer, Alicia; Palmera Suárez, Rocío; Fernández-Cuenca, Rafael

    2015-09-01

    To estimate the areas of greatest density of road traffic accidents with fatalities within 24 hours, per km(2)/year, in Spain from 2008 to 2011, using a geographic information system. Accidents were geocoded using the road and kilometer points where they occurred. The average nearest neighbor distance was calculated to detect possible clusters and to obtain the bandwidth for kernel density estimation. A total of 4775 accidents were analyzed, of which 73.3% occurred on conventional roads. The estimated average distance between accidents was 1,242 meters, and the average expected distance was 10,738 meters. The nearest neighbor index was 0.11, indicating that there were spatial aggregations of accidents. A map showing the kernel density was obtained with a resolution of 1 km(2), which identified the areas of highest density. This methodology allowed a better approximation to locating accident risks by taking kilometer points into account. The map shows areas where there was a greater density of accidents. This could be an advantage in decision-making by the relevant authorities. Copyright © 2014 SESPAS. Published by Elsevier Espana. All rights reserved.
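
    As a simplified illustration of the density-estimation step, the sketch below applies a Gaussian kernel density estimate to hypothetical geocoded accident coordinates; note that SciPy's default bandwidth rule differs from the nearest-neighbor-derived bandwidth used in the study.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(4)
    # Hypothetical geocoded accident coordinates (km), clustered around two hot spots
    pts = np.vstack([rng.normal([10, 10], 1.0, (200, 2)),
                     rng.normal([40, 25], 2.0, (100, 2))]).T          # shape (2, n)

    kde = gaussian_kde(pts)                     # Gaussian kernel density estimate
    grid_x, grid_y = np.mgrid[0:50:100j, 0:40:80j]
    density = kde(np.vstack([grid_x.ravel(), grid_y.ravel()])).reshape(grid_x.shape)
    print(np.unravel_index(density.argmax(), density.shape))   # grid cell of highest density
    ```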

  17. Analysis of Workplace Accidents in Automotive Repair Workshops in Spain.

    PubMed

    López-Arquillos, Antonio; Rubio-Romero, Juan Carlos

    2016-09-01

    To analyze the effects of the factors associated with different types of injury (superficial wounds, dislocations and sprains, bone fractures, concussion and internal injuries, burns scalding and freezing) caused by occupational accidents in automotive repair workshops. Study of a sample consisting of 89,954 industry accidents reported from 2003 to 2008. Odds ratios were calculated with a 95% confidence interval. Belonging to a small company is a risk factor for suffering three of the five types of injury studied. Women are less likely to suffer burns and superficial wounds, and more likely to suffer dislocations or sprains. Foreign workers are more likely to suffer concussion and internal injuries. Health and safety strategies and accident prevention measures should be individualized and adapted to the type of worker most likely to be injured in each type of accident. Occupational health and safety training courses designed according to worker profile, and improving the participation of the workers in small firms creating regional or roving safety representatives would improve working conditions.

  18. An Accident Precursor Analysis Process Tailored for NASA Space Systems

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

  19. Software Programs Derive Measurements from Photographs

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Even under the most unfortunate circumstances, NASA continues on a path of innovation. After the Space Shuttle Columbia reentered the atmosphere on February 1, 2003, it experienced a catastrophic failure, and the entire crew and vehicle were lost. For the two weeks prior to the accident, Columbia STS-107 was on a mission to perform physical, life, and space sciences research in the unique environment of microgravity. Following the accident, the remaining shuttles (Endeavour, Atlantis, and Discovery) were grounded, and an intense investigation ensued. The Columbia Accident Investigation Board spent nearly 7 months examining the cause of the accident and determining what would ensure a safe return to flight. To this end, investigators performed an extensive review down five analytic paths: aerodynamic, thermodynamic, sensor data timeline, debris reconstruction, and imaging. As part of the evaluation of all the available imagery from Columbia's ascent, orbit, and entry, investigators needed a new method for analyzing still video images to determine the size of the material that fell from Columbia, as well as the distance that the material traveled. John Lane, a scientist at Kennedy Space Center, devised a software program to calculate the unknown dimensions of the material in the images, and soon after the investigation was complete, continued to enhance the technology. Eventually, the program that assisted in the Columbia investigation became available for licensing.
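
    The actual program and its calibration are not described in this record; a first-order, pinhole-camera version of a size-from-imagery calculation might look like the sketch below, with purely illustrative numbers that are not from the Columbia analysis.

    ```python
    def object_size_from_image(pixel_extent, pixel_pitch_m, distance_m, focal_length_m):
        """Pinhole-camera estimate: real size = image size * distance / focal length."""
        image_size = pixel_extent * pixel_pitch_m
        return image_size * distance_m / focal_length_m

    # Illustrative numbers only
    print(object_size_from_image(pixel_extent=12, pixel_pitch_m=9e-6,
                                 distance_m=500.0, focal_length_m=0.1))
    ```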

  20. Development of Data Processing Software for NBI Spectroscopic Analysis System

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodan; Hu, Chundong; Sheng, Peng; Zhao, Yuanzhe; Wu, Deyun; Cui, Qinglong

    2015-04-01

    A set of data processing software is presented in this paper for processing NBI spectroscopic data. For better and more systematic management and querying of these data, they are managed uniformly by the NBI data server. The data processing software offers the functions of uploading beam spectral original and analytic data to the data server manually and automatically, querying and downloading all the NBI data, as well as dealing with local LZO data. The software is composed of a server program and a client program. The server software is programmed in C/C++ under a CentOS development environment. The client software is developed on a VC 6.0 platform, which offers convenient operational human interfaces. The network communications between the server and the client are based on TCP. With the help of this software, the NBI spectroscopic analysis system realizes unattended automatic operation, and the clear interface also makes it much more convenient to provide beam intensity distribution data and beam power data to operators for operational decision-making. supported by National Natural Science Foundation of China (No. 11075183), the Chinese Academy of Sciences Knowledge Innovation

  1. Accidents in the school environment: perspectives of staff concerned with data collection and reporting procedures.

    PubMed

    Williams, W R; Latif, A H; Cater, L

    2003-05-01

    School-accident reports document incidents that have resulted in children requiring assistance from staff in the education and healthcare sectors. This study was undertaken to investigate the collection and use of data by agencies concerned with the school-accident problem. Our aim was to determine whether the annual collection and use of such a large body of data might be improved through better management procedures. Interviews were conducted with primary and secondary school staff in one education authority. Interviewees completed a questionnaire on accident activity and accident reporting in their school. In the healthcare sector, staff from the Schools' Office and the ambulance unit servicing the schools provided information on their collection and use of data. Our survey found that accident activity is usually a private matter for individual schools, shared to varying degrees with the education authority. Playgrounds, children's behaviour and footwear carried much of the blame for the injuries sustained. Staff generally accepted the current accident rates. The compilation of accident data by the Schools' Office, the accident and emergency department, and the ambulance service was compromised by deficiencies in computerization and computer software. The management and utilization of school-accident data could be improved by better collaboration within and between the education and healthcare agencies.

  2. IMMAN: free software for information theory-based chemometric analysis.

    PubMed

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis, denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis), are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
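
    As a minimal illustration of one of the listed measures, the sketch below computes Shannon entropy and information gain for a discretized feature; the data are hypothetical and IMMAN's own Java implementation is not reproduced.

    ```python
    import numpy as np

    def entropy(labels):
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def information_gain(feature, labels):
        """IG(Y; X) = H(Y) - sum_x p(x) * H(Y | X = x), for a discretized feature."""
        h_y = entropy(labels)
        values, counts = np.unique(feature, return_counts=True)
        h_cond = sum((c / len(feature)) * entropy(labels[feature == v])
                     for v, c in zip(values, counts))
        return h_y - h_cond

    # Hypothetical discretized descriptor vs. class labels
    x = np.array([0, 0, 1, 1, 2, 2, 2, 0])
    y = np.array([0, 0, 1, 1, 1, 1, 0, 0])
    print(round(information_gain(x, y), 3))
    ```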

  3. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    NASA Technical Reports Server (NTRS)

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
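
    A hedged sketch of the same kind of automatic tree generation, using scikit-learn rather than the study's own tool. The module metrics, the labelling rule and the tree parameters below are invented placeholders, not the NASA data described above.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(1)
      n = 500
      # Hypothetical module metrics: size (SLOC), cyclomatic complexity, changes.
      X = np.column_stack([
          rng.integers(50, 5000, n),   # source lines
          rng.integers(1, 60, n),      # cyclomatic complexity
          rng.integers(0, 40, n),      # number of changes
      ])
      # Toy rule: a module is "high effort" when size and churn are both large.
      y = ((X[:, 0] > 2000) & (X[:, 2] > 15)).astype(int)

      tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20, random_state=0)
      tree.fit(X, y)
      print(export_text(tree, feature_names=["sloc", "complexity", "changes"]))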

  4. 1997 Oregon state highway accident rate tables

    DOT National Transportation Integrated Search

    1998-08-01

    The three parts of this report are: I, Results of Analysis, containing comparative tables and the Signed Route on Highway list; II, Five-year accident rate data by highway sections; III, A summary of this year's fatal traffic accidents. The first...

  5. Methods for nuclear air-cleaning-system accident-consequence assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrae, R.W.; Bolstad, J.W.; Gregory, W.S.

    1982-01-01

    This paper describes a multilaboratory research program that is directed toward addressing many questions that analysts face when performing air cleaning accident consequence assessments. The program involves developing analytical tools and supportive experimental data that will be useful in making more realistic assessments of accident source terms within and up to the atmospheric boundaries of nuclear fuel cycle facilities. The types of accidents considered in this study include fires, explosions, spills, tornadoes, criticalities, and equipment failures. The main focus of the program is developing an accident analysis handbook (AAH). We will describe the contents of the AAH, which include descriptions of selected nuclear fuel cycle facilities, process unit operations, source-term development, and accident consequence analyses. Three computer codes designed to predict gas and material propagation through facility air cleaning systems are described. These computer codes address accidents involving fires (FIRAC), explosions (EXPAC), and tornadoes (TORAC). The handbook relies on many illustrative examples to show the analyst how to approach accident consequence assessments. We will use the FIRAC code and a hypothetical fire scenario to illustrate the accident analysis capability.

  6. Primary school accident reporting in one education authority.

    PubMed

    Latif, A H A; Williams, W R; Sibert, J

    2002-02-01

    Studies have shown a correlation between increased accident rates and levels of deprivation in the community. School accident reporting is one area where an association might be expected. To investigate differences in primary school accident rates in deprived and more affluent wards, in an area managed by one education authority. Statistical analysis of accident form returns for 100 primary schools in one education authority in Wales over a two year period, in conjunction with visits to over one third of school sites. Accident report rates from schools in deprived wards were three times higher than those from schools in more affluent wards. School visits showed that this discrepancy was attributable primarily to differences in reporting procedures. One third of schools did not report accidents and approximately half did not keep records of minor accidents. The association between school accident report rates and deprivation in the community is complex. School accident data from local education authorities may be unreliable for most purposes of collection.

  7. Development of a bespoke human factors taxonomy for gliding accident analysis and its revelations about highly inexperienced UK glider pilots.

    PubMed

    Jarvis, Steve; Harris, Don

    2009-08-01

    Low-hours solo glider pilots have a high risk of accidents compared to more experienced pilots. Numerous taxonomies for causal accident analysis have been produced for powered aviation but none of these is suitable for gliding, so a new taxonomy was required. A human factors taxonomy specifically for glider operations was developed and used to analyse all UK gliding accidents from 2002 to 2006 for their overall causes as well as factors specific to low hours pilots. Fifty-nine categories of pilot-related accident causation emerged, which were formed into progressively larger categories until four overall human factors groups were arrived at: 'judgement'; 'handling'; 'strategy'; 'attention'. 'Handling' accounted for a significantly higher proportion of injuries than other categories. Inexperienced pilots had considerably more accidents in all categories except 'strategy'. Approach control (path judgement, airbrake and speed handling) as well as landing flare misjudgement were chiefly responsible for the high accident rate in early solo glider pilots.

  8. A comparison of the hazard perception ability of accident-involved and accident-free motorcycle riders.

    PubMed

    Cheng, Andy S K; Ng, Terry C K; Lee, Hoe C

    2011-07-01

    Hazard perception is the ability to read the road and is closely related to involvement in traffic accidents. It consists of both cognitive and behavioral components. Within the cognitive component, visual attention is an important function of driving whereas driving behavior, which represents the behavioral component, can affect the hazard perception of the driver. Motorcycle riders are the most vulnerable type of road user. The primary purpose of this study was to deepen our understanding of the correlation of different subtypes of visual attention and driving violation behaviors and their effect on hazard perception between accident-free and accident-involved motorcycle riders. Sixty-three accident-free and 46 accident-involved motorcycle riders undertook four neuropsychological tests of attention (Digit Vigilance Test, Color Trails Test-1, Color Trails Test-2, and Symbol Digit Modalities Test), filled out the Chinese Motorcycle Rider Driving Violation (CMRDV) Questionnaire, and viewed a road-user-based hazard situation with an eye-tracking system to record the response latencies to potentially dangerous traffic situations. The results showed that both the divided and selective attention of accident-involved motorcycle riders were significantly inferior to those of accident-free motorcycle riders, and that accident-involved riders exhibited significantly higher driving violation behaviors and took longer to identify hazardous situations compared to their accident-free counterparts. However, the results of the regression analysis showed that the aggressive driving violation CMRDV score significantly predicted hazard perception and accident involvement of motorcycle riders. Given that all participants were mature and experienced motorcycle riders, the most plausible explanation for the differences between them is their driving style (influenced by an undesirable driving attitude), rather than skill deficits per se. The present study points to the importance of

  9. GammaLib and ctools. A software framework for the analysis of astronomical gamma-ray data

    NASA Astrophysics Data System (ADS)

    Knödlseder, J.; Mayer, M.; Deil, C.; Cayrou, J.-B.; Owen, E.; Kelley-Hoskins, N.; Lu, C.-C.; Buehler, R.; Forest, F.; Louge, T.; Siejkowski, H.; Kosack, K.; Gerard, L.; Schulz, A.; Martin, P.; Sanchez, D.; Ohm, S.; Hassan, T.; Brau-Nogué, S.

    2016-08-01

    The field of gamma-ray astronomy has seen important progress during the last decade, yet to date no common software framework has been developed for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib was written in C++ and all functionality is available in Python through an extension module. Based on this framework we have developed the ctools software package, a suite of software tools that enables flexible workflows to be built for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools were written in Python and C++, and can be either used from the command line via shell scripts or directly from Python. In this paper we present the GammaLib and ctools software versions 1.0 that were released at the end of 2015. GammaLib and ctools are ready for the science analysis of Imaging Air Cherenkov Telescope event data, and also support the analysis of Fermi-LAT data and the exploitation of the COMPTEL legacy data archive. We propose using ctools as the science tools software for the Cherenkov Telescope Array Observatory.
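
    A minimal sketch of a ctools-style Python workflow (simulate an event list, then fit a model with a maximum-likelihood tool). The ctobssim/ctlike classes, the bracketed parameter access and the execute() call follow the ctools user interface described above, but the specific parameter names, file names and values shown here are assumptions for illustration; consult the ctools reference manual for the exact interface.

      import ctools

      # Simulate an event list for an assumed model definition file.
      sim = ctools.ctobssim()
      sim["inmodel"]   = "crab.xml"      # assumed model definition file
      sim["outevents"] = "events.fits"
      sim["caldb"]     = "prod2"         # assumed calibration database
      sim["irf"]       = "South_0.5h"    # assumed instrument response function
      sim["ra"], sim["dec"], sim["rad"] = 83.63, 22.01, 5.0
      sim["tmin"], sim["tmax"] = 0.0, 1800.0
      sim["emin"], sim["emax"] = 0.1, 100.0   # TeV
      sim.execute()

      # Fit the same model to the simulated events.
      like = ctools.ctlike()
      like["inobs"]    = "events.fits"
      like["inmodel"]  = "crab.xml"
      like["caldb"]    = "prod2"
      like["irf"]      = "South_0.5h"
      like["outmodel"] = "crab_fit.xml"
      like.execute()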

  10. Physician flight accidents.

    DOT National Transportation Integrated Search

    1966-09-01

    An analysis of physician flight accidents during the period 1964-1965 is presented. More than thirty physicians sustained fatal injuries while piloting light aircraft: a fatality record four times the ratio of physician pilots in the general aviation...

  11. Development of automation software for neutron activation analysis process in Malaysian nuclear agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Rahman, N. A. A.; Ibrahim, M. M.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.

    2017-01-01

    The Neutron Activation Analysis (NAA) process has been established in the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, especially from sample registration to sample analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient. Hence, software to support system automation was developed to provide an effective method to replace redundant manual data entries and produce a faster sample analysis and calculation process. This paper describes the design and development of automation software for the NAA process, which consists of three sub-programs: sample registration; hardware control and data acquisition; and sample analysis. The data flow and connection between the sub-programs are explained. The software was developed using the National Instruments LabVIEW development package.

  12. [Violence and accidents among older and younger adults: evidence from the Surveillance System for Violence and Accidents (VIVA), Brazil].

    PubMed

    Luz, Tatiana Chama Borges; Malta, Deborah Carvalho; Sá, Naíza Nayla Bandeira de; Silva, Marta Maria Alves da; Lima-Costa, Maria Fernanda

    2011-11-01

    Data from the Brazilian Surveillance System for Violence and Accidents (VIVA) in 2009 were used to examine socio-demographic characteristics, outcomes, and types of accidents and violence treated at 74 sentinel emergency services in 23 Brazilian State capitals and the Federal District. The analysis included 25,201 individuals aged > 20 years (10.1% > 60 years); 89.3% were victims of accidents and 11.9% victims of violence. Hospitalization was the outcome in 11.1% of cases. Compared to the general population, there were more men and non-white individuals among victims of accidents, and especially among victims of violence. As compared to younger adults (20-59 years), accidents and violence against elderly victims showed less association with alcohol, a higher proportion of domestic incidents, more falls and pedestrian accidents, and aggression by family members. Policies for the prevention of accidents and violence should consider the characteristics of these events in the older population.

  13. Secondary school accident reporting in one education authority.

    PubMed

    Williams, W R; Latif, A H A; Sibert, J

    2002-01-01

    Secondary schools appear to have very different accident rates when they are compared on the basis of accident report returns. The variation may be as a result of real differences in accident rates or different reporting procedures. This study investigates accident reporting from secondary schools and, in particular, the role of the school nurse. Accident form returns covering a 2-year period were collected for statistical analysis from 13 comprehensive schools in one local education authority in Wales. School sites were visited in the following school year to obtain information about accident records held on site and accident reporting procedures. The main factors determining the number of school accident reports submitted to the education authority relate to differences in recording and reporting procedures, such as the employment of a nurse and the policy of the head teacher/safety officer on submitting accident returns. Accident and emergency department referrals from similar schools may show significant differences in specific injuries and their causes. The level of school accident activity cannot be gauged from reports submitted to the education authority. Lack of incentives for collecting good accident data, in conjunction with the degree of complacency in the current system, suggest that future accident rates and reporting activity are unlikely to change.

  14. An analysis of functional shoulder movements during task performance using Dartfish movement analysis software.

    PubMed

    Khadilkar, Leenesh; MacDermid, Joy C; Sinden, Kathryn E; Jenkyn, Thomas R; Birmingham, Trevor B; Athwal, George S

    2014-01-01

    Video-based movement analysis software (Dartfish) has potential for clinical applications for understanding shoulder motion if functional measures can be reliably obtained. The primary purpose of this study was to describe the functional range of motion (ROM) of the shoulder used to perform a subset of functional tasks. A second purpose was to assess the reliability of functional ROM measurements obtained by different raters using Dartfish software. Ten healthy participants, mean age 29 ± 5 years, were videotaped while performing five tasks selected from the Disabilities of the Arm, Shoulder and Hand (DASH). Video cameras and markers were used to obtain video images suitable for analysis in Dartfish software. Three repetitions of each task were performed. Shoulder movements from all three repetitions were analyzed using Dartfish software. The tracking tool of the Dartfish software was used to obtain shoulder joint angles and arcs of motion. Test-retest and inter-rater reliability of the measurements were evaluated using intraclass correlation coefficients (ICC). Maximum (coronal plane) abduction (118° ± 16°) and (sagittal plane) flexion (111° ± 15°) was observed during 'washing one's hair;' maximum extension (-68° ± 9°) was identified during 'washing one's own back.' Minimum shoulder ROM was observed during 'opening a tight jar' (33° ± 13° abduction and 13° ± 19° flexion). Test-retest reliability (ICC = 0.45 to 0.94) suggests high inter-individual task variability, and inter-rater reliability (ICC = 0.68 to 1.00) showed moderate to excellent agreement. KEY FINDINGS INCLUDE: 1) functional shoulder ROM identified in this study was comparable to that reported in similar studies; 2) healthy individuals require less than full ROM when performing five common ADL tasks; 3) high participant variability was observed during performance of the five ADL tasks; and 4) Dartfish software provides a clinically relevant tool to analyze shoulder function.
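
    For readers unfamiliar with ICCs, here is a hedged sketch of one common form, the two-way random-effects, absolute-agreement ICC(2,1); the abstract does not state which ICC model the authors used, so this is an assumption, and the rating data below are invented.

      import numpy as np

      def icc_2_1(ratings):
          """ratings: (n_subjects, k_raters) array; returns ICC(2,1)."""
          ratings = np.asarray(ratings, float)
          n, k = ratings.shape
          grand = ratings.mean()
          row_means = ratings.mean(axis=1)
          col_means = ratings.mean(axis=0)
          # Mean squares from a two-way ANOVA decomposition.
          ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)
          ms_cols = n * np.sum((col_means - grand) ** 2) / (k - 1)
          resid = ratings - row_means[:, None] - col_means[None, :] + grand
          ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))
          return (ms_rows - ms_err) / (
              ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
          )

      # Two raters' shoulder-abduction angles (degrees) for five subjects (made up).
      angles = [[118, 120], [111, 108], [95, 97], [130, 126], [102, 104]]
      print(round(icc_2_1(angles), 3))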

  15. Analysis of Performance of Stereoscopic-Vision Software

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Steele, Robert; Steinke, Robert

    2007-01-01

    A team of JPL researchers has analyzed stereoscopic vision software and produced a document describing its performance. This software is of the type used in maneuvering exploratory robotic vehicles on Martian terrain. The software in question utilizes correlations between portions of the images recorded by two electronic cameras to compute stereoscopic disparities, which, in conjunction with camera models, are used in computing distances to terrain points to be included in constructing a three-dimensional model of the terrain. The analysis included effects of correlation-window size, a pyramidal image down-sampling scheme, vertical misalignment, focus, maximum disparity, stereo baseline, and range ripples. Contributions of sub-pixel interpolation, vertical misalignment, and foreshortening to stereo correlation error were examined theoretically and experimentally. It was found that camera-calibration inaccuracy contributes to both down-range and cross-range error but stereo correlation error affects only the down-range error. Experimental data for quantifying the stereo disparity error were obtained by use of reflective metrological targets taped to corners of bricks placed at known positions relative to the cameras. For the particular 1,024-by-768-pixel cameras of the system analyzed, the standard deviation of the down-range disparity error was found to be 0.32 pixel.
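
    A short sketch of the relationships discussed above: range from stereo disparity, and the down-range error produced by a sub-pixel disparity error. The camera parameters below are invented for illustration; only the 0.32-pixel figure comes from the abstract.

      focal_px = 1000.0      # focal length in pixels (assumed)
      baseline_m = 0.30      # stereo baseline in metres (assumed)
      disparity_px = 12.0    # measured disparity for a terrain point (assumed)
      sigma_disp_px = 0.32   # disparity error, e.g. the 0.32-pixel std. dev. above

      # Range from disparity: Z = f * B / d
      z = focal_px * baseline_m / disparity_px

      # First-order propagation: dZ/dd = -f*B/d^2, so sigma_Z ~ Z^2/(f*B) * sigma_d
      sigma_z = (z ** 2) / (focal_px * baseline_m) * sigma_disp_px
      print(f"range = {z:.2f} m, down-range uncertainty ~ {sigma_z:.3f} m")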

  16. Power Analysis Tutorial for Experimental Design Software

    DTIC Science & Technology

    2014-11-01

    IDA Document D-5205, November 2014: Power Analysis Tutorial for Experimental Design Software. The Test and Evaluation (T&E) community is increasing its employment of Design of Experiments (DOE), a rigorous methodology for planning and evaluating ...
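
    A hedged example of the kind of power calculation such DOE tutorials typically walk through (not taken from the IDA document): solving for the sample size of a two-sample t-test at an assumed effect size, significance level and power.

      from statsmodels.stats.power import TTestIndPower

      analysis = TTestIndPower()
      n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
      print(f"~{n_per_group:.0f} runs per group")   # roughly 64 per group for d = 0.5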

  17. Risk of road accident associated with the use of drugs: a systematic review and meta-analysis of evidence from epidemiological studies.

    PubMed

    Elvik, Rune

    2013-11-01

    This paper is a corrigendum to a previously published paper where errors were detected. The errors have been corrected in this paper. The paper is otherwise identical to the previously published paper. A systematic review and meta-analysis of studies that have assessed the risk of accident associated with the use of drugs when driving is presented. The meta-analysis included 66 studies containing a total of 264 estimates of the effects on accident risk of using illicit or prescribed drugs when driving. Summary estimates of the odds ratio of accident involvement are presented for amphetamines, analgesics, anti-asthmatics, anti-depressives, anti-histamines, benzodiazepines, cannabis, cocaine, opiates, penicillin and zopiclone (a sleeping pill). For most of the drugs, small or moderate increases in accident risk associated with the use of the drugs were found. Information about whether the drugs were actually used while driving and about the doses used was often imprecise. Most studies that have evaluated the presence of a dose-response relationship between the dose of drugs taken and the effects on accident risk confirm the existence of a dose-response relationship. Use of drugs while driving tends to have a larger effect on the risk of fatal and serious injury accidents than on the risk of less serious accidents (usually property-damage-only accidents). The quality of the studies that have assessed risk varied greatly. There was a tendency for the estimated effects of drug use on accident risk to be smaller in well-controlled studies than in poorly controlled studies. Evidence of publication bias was found for some drugs. The associations found cannot be interpreted as causal relationships, principally because most studies do not control very well for potentially confounding factors. Copyright © 2012 Elsevier Ltd. All rights reserved.
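
    A hedged sketch of the basic pooling step behind such a meta-analysis: a fixed-effect, inverse-variance summary of log odds ratios. The individual study estimates below are invented; the paper's own analysis also handles random effects, dose-response and publication bias, which this sketch does not.

      import numpy as np

      odds_ratios = np.array([1.3, 1.1, 1.8, 0.9, 1.4])   # per-study ORs (made up)
      ci_upper    = np.array([2.0, 1.6, 3.1, 1.5, 2.2])   # upper 95% limits (made up)

      log_or = np.log(odds_ratios)
      se = (np.log(ci_upper) - log_or) / 1.96             # SE recovered from the CI width
      w = 1.0 / se**2                                      # inverse-variance weights

      pooled = np.sum(w * log_or) / np.sum(w)
      pooled_se = np.sqrt(1.0 / np.sum(w))
      print(f"summary OR = {np.exp(pooled):.2f} "
            f"(95% CI {np.exp(pooled - 1.96*pooled_se):.2f}-"
            f"{np.exp(pooled + 1.96*pooled_se):.2f})")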

  18. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  19. School sports accidents: analysis of causes, modes, and frequencies.

    PubMed

    Kelm, J; Ahlhelm, F; Pape, D; Pitsch, W; Engel, C

    2001-01-01

    About 5% of all school children are seriously injured during physical education every year. Because of its influence on children's attitude toward sports and the economic aspects, an evaluation of causes and medical consequences is necessary. In this study, 213 school sports accidents were investigated. Besides diagnosis, the localization of injuries, as well as the duration of the sick leave were documented. Average age of injured students was 13 years. Most of the injured students blamed themselves for the accident. The most common injuries were sprains, contusions, and fractures. Main reasons for the accidents were faults in basic motion training. Playing soccer and basketball were the most frequent reasons for injuries. The upper extremity was more frequently involved than the lower extremity. Sports physicians and teachers should work out a program outlining the individual needs and capabilities of the injured students to reintegrate them into physical education.

  20. Linguistic methodology for the analysis of aviation accidents

    NASA Technical Reports Server (NTRS)

    Goguen, J. A.; Linde, C.

    1983-01-01

    A linguistic method for the analysis of small group discourse was developed, and the use of this method on transcripts of commercial air transport accidents is demonstrated. The method identifies the discourse types that occur and determines their linguistic structure; it identifies significant linguistic variables based upon these structures or other linguistic concepts such as speech act and topic; it tests hypotheses that support the significance and reliability of these variables; and it indicates the implications of the validated hypotheses. These implications fall into three categories: (1) to train crews to use more nearly optimal communication patterns; (2) to use linguistic variables as indices for aspects of crew performance such as attention; and (3) to provide guidelines for the design of aviation procedures and equipment, especially those that involve speech.

  1. Reliability of skeletal maturity analysis using the cervical vertebrae maturation method on dedicated software.

    PubMed

    Padalino, Saverio; Sfondrini, Maria Francesca; Chenuil, Laura; Scudeller, Luigia; Gandini, Paola

    2014-12-01

    The aim of this study was to assess the feasibility of skeletal maturation analysis using the Cervical Vertebrae Maturation (CVM) method by means of dedicated software, developed in collaboration with Outside Format (Paullo-Milan), as compared with manual analysis. From a sample of patients aged 7-21 years, we gathered 100 lateral cephalograms, 20 for each of the five CVM stages. For each cephalogram, we traced cervical vertebrae C2, C3 and C4 both by hand, using a lead pencil and an acetate sheet, and with the dedicated software. All the tracings were made by an experienced operator (a dentofacial orthopedics resident) and by an inexperienced operator (a student in dental surgery). Each operator recorded the time needed to make each tracing in order to demonstrate differences in the times taken. Concordance between the manual analysis and the analysis performed using the dedicated software was 94% for the resident and 93% for the student. Interobserver concordance was 99%. Hand-tracing was quicker than tracing with the software, which took 28 seconds longer on average. The cervical vertebrae analysis software offers excellent clinical performance, even if the method takes longer than the manual technique. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  2. Spectral Analysis of the Effects of Daylight Saving Time on Motor Vehicle Fatal Traffic Accidents

    DOT National Transportation Integrated Search

    1977-04-01

    This report shows that Daylight Saving Time (DST) reduces the number of persons killed in motor vehicle fatal traffic accidents by about one percent. This estimate is based on a spectral (Fourier) analysis of these fatalities which utilizes a filteri...

  3. A Look at Aircraft Accident Analysis in the Early Days: Do Early 20th Century Accident Investigation Techniques Have Any Lessons for Today?

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, C. W.

    2007-01-01

    In the early years of powered flight, the National Advisory Committee on Aeronautics in the United States produced three reports describing a method of analysis of aircraft accidents. The first report was published in 1928; the second, which was a revision of the first, was published in 1930; and the third, which was a revision and update of the second, was published in 1936. This paper describes the contents of these reports, and compares the method of analysis proposed therein to the methods used today.

  4. Knickpoint finder: A software tool that improves neotectonic analysis

    NASA Astrophysics Data System (ADS)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
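
    A simplified, hedged sketch of the relief-breakpoint idea (not the Knickpoint Finder code): flag abrupt changes in channel gradient along a single drainage profile extracted from a DEM. The profile, window size and threshold below are invented.

      import numpy as np

      def find_knickpoints(distance_m, elevation_m, window=5, threshold=0.02):
          """Return indices where the local channel gradient changes abruptly."""
          d = np.asarray(distance_m, float)
          z = np.asarray(elevation_m, float)
          slope = np.gradient(z, d)                    # local gradient dz/dx
          knicks = []
          for i in range(window, len(d) - window):
              up = slope[i - window:i].mean()          # mean slope upstream
              down = slope[i:i + window].mean()        # mean slope downstream
              if abs(down - up) > threshold:           # abrupt gradient change
                  knicks.append(i)
          return knicks

      # Synthetic profile: a gentle reach, then a steeper reach (one break expected).
      x = np.linspace(0, 5000, 200)
      z = np.where(x < 2500, 800 - 0.01 * x, 775 - 0.05 * (x - 2500))
      print(find_knickpoints(x, z)[:5])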

  5. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shear, Trevor Allan

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  6. IUE Data Analysis Software for Personal Computers

    NASA Technical Reports Server (NTRS)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  7. Development of new vibration energy flow analysis software and its applications to vehicle systems

    NASA Astrophysics Data System (ADS)

    Kim, D.-J.; Hong, S.-Y.; Park, Y.-H.

    2005-09-01

    The Energy flow analysis (EFA) offers very promising results in predicting the noise and vibration responses of system structures in medium-to-high frequency ranges. We have developed the Energy flow finite element method (EFFEM) based software, EFADSC++ R4, for vibration analysis. The software can analyze system structures composed of beam, plate, spring-damper, rigid body and many other elements, and has many useful analysis functions. For convenient use of the software, the main functions are modularized into translator, model-converter, and solver modules. The translator module makes it possible to use a finite element (FE) model for the vibration analysis. The model-converter module changes the FE model into an energy flow finite element (EFFE) model, generates joint elements to cover the vibrational attenuation in complex structures composed of various elements, and can solve the joint element equations very quickly by using the wave transmission approach. The solver module supports various direct and iterative solvers for multi-DOF structures. Vibration predictions for real vehicles were performed successfully using the developed software.

  8. Analysis on tank truck accidents involved in road hazardous materials transportation in China.

    PubMed

    Shen, Xiaoyan; Yan, Ying; Li, Xiaonan; Xie, Chenjiang; Wang, Lihua

    2014-01-01

    Due to the sheer size and capacity of the tanker and the properties of cargo transported in the tank, hazmat tanker accidents are more disastrous than other types of vehicle accidents. The aim of this study was to provide a current survey on the situation of accidents involving tankers transporting hazardous materials in China. Detailed descriptions of 708 tanker accidents associated with hazmat transportation in China from 2004 to 2011 were analyzed to identify causes, location, types, time of occurrence, hazard class for materials involved, consequences, and the corresponding probability. Hazmat tanker accidents mainly occurred in eastern (38.1%) and southwest China (12.3%). The most frequent hazmat tanker accidents involved classes 2, 3, and 8. The predominant accident types were rollover (29.10%), run-off-the-road (16.67%), and rear-end collisions (13.28%), with a high likelihood of a large spill occurring. About 55.93% of the accidents occurred on freeways and class 1 roads, with the spill percentage reaching 75.00% and the proportion of spills that occurred in the total accidents amounting to 77.82%, of which 61.72% are considered large spills. The month with the highest accident probability was July (12.29%), and most crashes occurred during the early morning (4:00-6:00 a.m.) and midday (10:00 a.m.-12:00 p.m.) hours, 19.63% versus 16.10%. Human-related errors (73.8%) and vehicle-related defects (19.6%) were the primary reasons for hazmat tanker crashes. The most common outcome of a hazmat tanker accident was a spill without further events (55.51%), followed by a release with fire (7.77%), and release with an explosion (2.54%). The safety situation of China's hazmat tanker transportation is grim. Such accidents not only have high spill percentages and consistently large spills but they can also cause serious consequences, such as fires and explosions. Improving the training of drivers and the quality of vehicles, deploying roll stability aids, enhancing

  9. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  10. Software attribute visualization for high integrity software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  11. Primary school accident reporting in one education authority

    PubMed Central

    Latif, A; Williams, W; Sibert, J

    2002-01-01

    Background: Studies have shown a correlation between increased accident rates and levels of deprivation in the community. School accident reporting is one area where an association might be expected. Aims: To investigate differences in primary school accident rates in deprived and more affluent wards, in an area managed by one education authority. Methods: Statistical analysis of accident form returns for 100 primary schools in one education authority in Wales over a two year period, in conjunction with visits to over one third of school sites. Results: Accident report rates from schools in deprived wards were three times higher than those from schools in more affluent wards. School visits showed that this discrepancy was attributable primarily to differences in reporting procedures. One third of schools did not report accidents and approximately half did not keep records of minor accidents. Conclusions: The association between school accident report rates and deprivation in the community is complex. School accident data from local education authorities may be unreliable for most purposes of collection. PMID:11827900

  12. Circuit board accident--organizational dimension hidden by prescribed safety.

    PubMed

    de Almeida, Ildeberto Muniz; Buoso, Eduardo; do Amaral Dias, Maria Dionísia; Vilela, Rodolfo Andrade Gouveia

    2012-01-01

    This study analyzes an accident in which two maintenance workers suffered severe burns while replacing a circuit breaker panel in a steel mill, following the model of analysis and prevention of accidents (MAPA), developed with the objective of enlarging the perimeter of interventions and contributing to the deconstruction of blame-attribution practices. The study was based on materials produced by a health service team in an in-depth analysis of the accident. The analysis shows that decisions related to system modernization were taken without considering their implications for maintenance scheduling, creating conflicts of priorities and of interests between production and safety; it also reveals that the lack of a systemic perspective in safety management was its principal failure. To explain the accident as merely non-fulfillment of idealized formal safety rules feeds practices of blame attribution supported by alibi norms and inhibits possible prevention. In contrast, accident analyses undertaken in worker health surveillance services show potential to reveal the origins of these events, incubated in the history of the system and ignored in practices guided by the traditional paradigm.

  13. Personality, Driving Behavior and Mental Disorders Factors as Predictors of Road Traffic Accidents Based on Logistic Regression.

    PubMed

    Alavi, Seyyed Salman; Mohammadi, Mohammad Reza; Souri, Hamid; Mohammadi Kalhori, Soroush; Jannatifard, Fereshteh; Sepahbodi, Ghazal

    2017-01-01

    The aim of this study was to evaluate the effect of variables such as personality traits, driving behavior and mental illness on road traffic accidents among the drivers with accidents and those without road crash. In this cohort study, 800 bus and truck drivers were recruited. Participants were selected among drivers who referred to Imam Sajjad Hospital (Tehran, Iran) during 2013-2015. The Manchester driving behavior questionnaire (MDBQ), big five personality test (NEO personality inventory) and semi-structured interview (schizophrenia and affective disorders scale) were used. After two years, we surveyed all accidents due to human factors that involved the recruited drivers. The data were analyzed using the SPSS software by performing the descriptive statistics, t-test, and multiple logistic regression analysis methods. P values less than 0.05 were considered statistically significant. In terms of controlling the effective and demographic variables, the findings revealed significant differences between the two groups of drivers that were and were not involved in road accidents. In addition, it was found that depression and anxiety could increase the odds ratio (OR) of road accidents by 2.4- and 2.7-folds, respectively (P=0.04, P=0.004). It is noteworthy to mention that neuroticism alone can increase the odds of road accidents by 1.1-fold (P=0.009), but other personality factors did not have a significant effect on the equation. The results revealed that some mental disorders affect the incidence of road collisions. Considering the importance and sensitivity of driving behavior, it is necessary to evaluate multiple psychological factors influencing drivers before and after receiving or renewing their driver's license.
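
    A hedged sketch of the kind of logistic-regression output the study reports (the authors used SPSS; this is not their analysis). The predictors, data and coefficients below are invented; the point is only that odds ratios are obtained as exp(coefficient).

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(42)
      n = 800
      df = pd.DataFrame({
          "depression":  rng.integers(0, 2, n),
          "anxiety":     rng.integers(0, 2, n),
          "neuroticism": rng.normal(0, 1, n),
      })
      # Toy outcome generated so the predictors carry some signal.
      logit_p = -1.0 + 0.9 * df.depression + 1.0 * df.anxiety + 0.1 * df.neuroticism
      df["accident"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

      X = sm.add_constant(df[["depression", "anxiety", "neuroticism"]])
      model = sm.Logit(df["accident"], X).fit(disp=False)
      odds_ratios = np.exp(model.params)        # OR = exp(coefficient)
      print(pd.concat([odds_ratios.rename("OR"), model.pvalues.rename("p")], axis=1))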

  14. Personality, Driving Behavior and Mental Disorders Factors as Predictors of Road Traffic Accidents Based on Logistic Regression

    PubMed Central

    Alavi, Seyyed Salman; Mohammadi, Mohammad Reza; Souri, Hamid; Mohammadi Kalhori, Soroush; Jannatifard, Fereshteh; Sepahbodi, Ghazal

    2017-01-01

    Background: The aim of this study was to evaluate the effect of variables such as personality traits, driving behavior and mental illness on road traffic accidents among the drivers with accidents and those without road crash. Methods: In this cohort study, 800 bus and truck drivers were recruited. Participants were selected among drivers who referred to Imam Sajjad Hospital (Tehran, Iran) during 2013-2015. The Manchester driving behavior questionnaire (MDBQ), big five personality test (NEO personality inventory) and semi-structured interview (schizophrenia and affective disorders scale) were used. After two years, we surveyed all accidents due to human factors that involved the recruited drivers. The data were analyzed using the SPSS software by performing the descriptive statistics, t-test, and multiple logistic regression analysis methods. P values less than 0.05 were considered statistically significant. Results: In terms of controlling the effective and demographic variables, the findings revealed significant differences between the two groups of drivers that were and were not involved in road accidents. In addition, it was found that depression and anxiety could increase the odds ratio (OR) of road accidents by 2.4- and 2.7-folds, respectively (P=0.04, P=0.004). It is noteworthy to mention that neuroticism alone can increase the odds of road accidents by 1.1-fold (P=0.009), but other personality factors did not have a significant effect on the equation. Conclusion: The results revealed that some mental disorders affect the incidence of road collisions. Considering the importance and sensitivity of driving behavior, it is necessary to evaluate multiple psychological factors influencing drivers before and after receiving or renewing their driver’s license. PMID:28293047

  15. ATWS at Browns Ferry Unit One - accident sequence analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrington, R.M.; Hodge, S.A.

    1984-07-01

    This study describes the predicted response of Unit One at the Browns Ferry Nuclear Plant to a postulated complete failure to scram following a transient occurrence that has caused closure of all Main Steam Isolation Valves (MSIVs). This hypothetical event constitutes the most severe example of the type of accident classified as Anticipated Transient Without Scram (ATWS). Without the automatic control rod insertion provided by scram, the void coefficient of reactivity and the mechanisms by which voids are formed in the moderator/coolant play a dominant role in the progression of the accident. Actions taken by the operator greatly influence the quantity of voids in the coolant and the effect is analyzed in this report. The progression of the accident sequence under existing and under recommended procedures is discussed. For the extremely unlikely cases in which equipment failure and wrongful operator actions might lead to severe core damage, the sequence of emergency action levels and the associated timing of events are presented.

  16. The Fukushima Daiichi Accident Study Information Portal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shawn St. Germain; Curtis Smith; David Schwieder

    This paper presents a description of the Fukushima Daiichi Accident Study Information Portal. The Information Portal was created by the Idaho National Laboratory as part of a joint NRC and DOE project to assess the severe accident modeling capability of the MELCOR analysis code. The Fukushima Daiichi Accident Study Information Portal was created to collect, store, retrieve and validate information and data for use in reconstructing the Fukushima Daiichi accident. In addition to supporting the MELCOR simulations, the Portal will be the main DOE repository for all data, studies and reports related to the accident at the Fukushima Daiichi nuclear power station. The data are stored in a secured (password protected and encrypted) repository that is searchable and accessible to researchers at diverse locations.

  17. Methodological guidelines for developing accident modification functions.

    PubMed

    Elvik, Rune

    2015-07-01

    This paper proposes methodological guidelines for developing accident modification functions. An accident modification function is a mathematical function describing systematic variation in the effects of road safety measures. The paper describes ten guidelines. An example is given of how to use the guidelines. The importance of exploratory analysis and an iterative approach in developing accident modification functions is stressed. The example shows that strict compliance with all the guidelines may be difficult, but represents a level of stringency that should be strived for. Currently the main limitations in developing accident modification functions are the small number of good evaluation studies and the often huge variation in estimates of effect. It is therefore still not possible to develop accident modification functions for very many road safety measures. Copyright © 2015 Elsevier Ltd. All rights reserved.
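
    To make the idea concrete, here is a hedged sketch of fitting one common accident modification function form, AMF(x) = exp(beta * x), to invented (dose, effect) pairs. The functional form and data are assumptions for illustration, not the paper's own guideline procedure.

      import numpy as np
      from scipy.optimize import curve_fit

      def amf(x, beta):
          # Exponential accident modification function: effect as a function of "dose".
          return np.exp(beta * x)

      dose = np.array([0.5, 1.0, 2.0, 3.0, 4.0])          # measure "dose" (made up)
      effect = np.array([0.95, 0.90, 0.82, 0.74, 0.68])   # observed crash ratios (made up)

      beta_hat, cov = curve_fit(amf, dose, effect, p0=[-0.1])
      print(f"beta = {beta_hat[0]:.3f}; predicted effect at dose 5: {amf(5, *beta_hat):.2f}")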

  18. VALIDATION OF ANSYS FINITE ELEMENT ANALYSIS SOFTWARE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HAMM, E.R.

    2003-06-27

    This document provides a record of the verification and Validation of the ANSYS Version 7.0 software that is installed on selected CH2M HILL computers. The issues addressed include: Software verification, installation, validation, configuration management and error reporting. The ANSYS{reg_sign} computer program is a large scale multi-purpose finite element program which may be used for solving several classes of engineering analysis. The analysis capabilities of ANSYS Full Mechanical Version 7.0 installed on selected CH2M Hill Hanford Group (CH2M HILL) Intel processor based computers include the ability to solve static and dynamic structural analyses, steady-state and transient heat transfer problems, mode-frequency andmore » buckling eigenvalue problems, static or time-varying magnetic analyses and various types of field and coupled-field applications. The program contains many special features which allow nonlinearities or secondary effects to be included in the solution, such as plasticity, large strain, hyperelasticity, creep, swelling, large deflections, contact, stress stiffening, temperature dependency, material anisotropy, and thermal radiation. The ANSYS program has been in commercial use since 1970, and has been used extensively in the aerospace, automotive, construction, electronic, energy services, manufacturing, nuclear, plastics, oil and steel industries.« less

  19. Rosetta CONSERT operations and data analysis preparation: simulation software tools.

    NASA Astrophysics Data System (ADS)

    Rogez, Yves; Hérique, Alain; Cardiet, Maël; Zine, Sonia; Westphal, Mathieu; Micallef, Mickael; Berquin, Yann; Kofman, Wlodek

    2014-05-01

    The CONSERT experiment onboard Rosetta and Philae will perform the tomography of the 67P/CG comet nucleus by measuring radio wave transmission from the Rosetta S/C to the Philae Lander. The accurate analysis of travel time measurements will deliver unique knowledge of the nucleus interior dielectric properties. The challenging complexity of CONSERT operations requirements, combining both Rosetta and Philae, allows only a limited set of opportunities to acquire data. Thus, we need a fine analysis of the impact of Rosetta trajectory, Philae position and comet shape on CONSERT measurements, in order to take optimal decisions in a short time. The integration of simulation results and mission parameters provides synthetic information to evaluate performances and risks for each opportunity. The preparation of CONSERT measurements before space operations is key to achieving the best science return of the experiment. In addition, during Rosetta space operations, these software tools will allow a "real-time" first analysis of the latest measurements to improve the next acquisition sequences. The software tools themselves are built around a 3D electromagnetic radio wave simulation, taking into account the signal polarization. It is based on ray-tracing algorithms specifically designed for quick orbit analysis and radar signal generation. This allows computation on big domains relative to the wavelength. The extensive use of 3D visualization tools provides comprehensive and synthetic views of the results. The software suite is designed to be extended, after Rosetta operations, to the full 3D measurement data analysis using inversion methods.

  20. Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling

    NASA Astrophysics Data System (ADS)

    Wada, Yoshihisa; Tsuji, Hiroshi

    In order to analyze the success/failure factors in offshore software development services using structural equation modeling, this paper proposes to follow two approaches together: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former serves to generate and verify hypotheses about factors and causalities. The latter serves to verify factors introduced by theory, building the model without heuristics. Applying the proposed combined approach to questionnaire responses from skilled project managers, the paper found that vendor properties have a stronger causal influence on success than software properties or project properties.

  1. Paralysis from sport and diving accidents.

    PubMed

    Schmitt, H; Gerner, H J

    2001-01-01

    To examine the causes of sport-related spinal cord injuries that developed into paraplegia or tetraplegia, and to compare data from different sports with previous studies in the same geographical region. A retrospective epidemiological study and comparison with previous studies. The Orthopedic Department, specializing in the treatment and rehabilitation of paralyzed patients, at the University of Heidelberg, Germany. Between 1985 and 1997, 1,016 cases of traumatic spinal cord injury presented at the Orthopedic Department at the University of Heidelberg: 6.8% were caused by sport and 7.7% by diving accidents. Sport-related spinal cord injuries with paralysis. A total of 1,016 cases of traumatic spinal cord injury were reviewed. Of these, 14.5% were caused by sport accidents (n = 69) or diving accidents (n = 78). Age of patients ranged from 9 to 52 years. 83% were male. 77% of the patients developed tetraplegia, and 23%, paraplegia. 16 of the sport accidents resulted from downhill skiing, 9 resulted from horseback riding, 7 from modern air sports, 6 from gymnastics, 5 from trampolining, and 26 from other sports. Previous analyses had revealed that paraplegia had mainly occurred from gymnastics, trampolining, or high diving accidents. More recently, however, the number of serious spinal injuries caused by risk-filled sports such as hang gliding and paragliding has significantly increased (p = 0.095), as it has for horseback riding and skiing. Examinations have shown that all patients who were involved in diving accidents developed tetraplegia. An analysis of injury from specific sports is still under way. Analysis of accidents resulting in damage to the spinal cord in respect to different sports shows that sports that have become popular during the last 10 years show an increasing risk of injury. Modern air sports hold the most injuries. Injury-preventing strategies also are presented.

  2. Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.

    PubMed

    Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D

    2018-05-01

    To study the effect of homeopathic medicines (in higher potencies) in normal subjects, a Peripheral Pulse Analyzer (PPA) has been used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data have been acquired in seven rounds; placebo was administered in rounds 1 and 2 and medicine in potencies 6, 30, 200, 1 M, and 10 M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to a group of around 40 subjects each. Although processing of the data required human intervention, a software application has been developed to analyze the processed data and detect the response, eliminating undue delay as well as human bias in subjective analysis. This utility, named Automatic Analysis of Intervention in the Field of Homeopathy, is run on the processed PPA data and the outcome has been compared with the manual analysis. The application software uses an adaptive threshold based on statistics for detecting responses, in contrast to the fixed threshold used in manual analysis. The automatic analysis detected 12.96% more responses than the subjective analysis. The higher response rates have been manually verified to be true positives, which indicates robustness of the application software. The automatic analysis software was run on another set of pulse harmonic parameters derived from the same data set to study cardiovascular susceptibility, and 385 responses were detected in contrast to 272 for the variability parameters. It was observed that 65% of the subjects eliciting a response were common. This not only validates the software utility for giving a consistent yield but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).
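
    A hedged sketch of an adaptive, statistics-based threshold of the kind the abstract contrasts with a fixed threshold: a post-intervention change is flagged as a "response" when it exceeds the baseline mean by k baseline standard deviations. The data, parameter names and the value of k are invented; the abstract does not specify the actual rule.

      import numpy as np

      def detect_response(baseline, post, k=2.0):
          """Return (responded, threshold) using an adaptive baseline-derived threshold."""
          baseline = np.asarray(baseline, float)
          post = np.asarray(post, float)
          threshold = baseline.mean() + k * baseline.std(ddof=1)
          return post.mean() > threshold, threshold

      rng = np.random.default_rng(7)
      baseline = rng.normal(60.0, 3.0, 30)   # e.g. a pulse-variability parameter
      post = rng.normal(68.0, 3.0, 30)       # values after the intervention
      responded, thr = detect_response(baseline, post)
      print(responded, round(thr, 2))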

  3. Type A Accident Investigation Board report on the January 17, 1996, electrical accident with injury in Technical Area 21 Tritium Science and Fabrication Facility Los Alamos National Laboratory. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-04-01

    An electrical accident was investigated in which a crafts person received serious injuries as a result of coming into contact with a 13.2 kilovolt (kV) electrical cable in the basement of Building 209 in Technical Area 21 (TA-21-209) in the Tritium Science and Fabrication Facility (TSFF) at Los Alamos National Laboratory (LANL). In conducting its investigation, the Accident Investigation Board used various analytical techniques, including events and causal factor analysis, barrier analysis, change analysis, fault tree analysis, materials analysis, and root cause analysis. The board inspected the accident site, reviewed events surrounding the accident, conducted extensive interviews and document reviews, and performed causation analyses to determine the factors that contributed to the accident, including any management system deficiencies. Relevant management systems and factors that could have contributed to the accident were evaluated in accordance with the guiding principles of safety management identified by the Secretary of Energy in an October 1994 letter to the Defense Nuclear Facilities Safety Board and subsequently to Congress.

  4. Accident models for two-lane rural roads : segments and intersections

    DOT National Transportation Integrated Search

    1998-10-01

    This report is a direct step for the implementation of the Accident Analysis Module in the Interactive Highway Safety Design Model (IHSDM). The Accident Analysis Module is expected to estimate the safety of two-lane rural highway characteristics for ...

  5. FASEA: A FPGA Acquisition System and Software Event Analysis for liquid scintillation counting

    NASA Astrophysics Data System (ADS)

    Steele, T.; Mo, L.; Bignell, L.; Smith, M.; Alexiev, D.

    2009-10-01

    The FASEA (FPGA based Acquisition and Software Event Analysis) system has been developed to replace the MAC3 for coincidence pulse processing. The system uses a National Instruments Virtex 5 FPGA card (PXI-7842R) for data acquisition and purpose-developed software for event analysis. Initial comparisons to the MAC3 unit, based on measurements of 89Sr and 3H, are included, confirming that the system is able to accurately emulate the behaviour of the MAC3 unit.

  6. Ability and efficiency of an automatic analysis software to measure microvascular parameters.

    PubMed

    Carsetti, Andrea; Aya, Hollmann D; Pierantozzi, Silvia; Bazurro, Simone; Donati, Abele; Rhodes, Andrew; Cecconi, Maurizio

    2017-08-01

    Analysis of the microcirculation is currently performed offline, is time consuming and operator dependent. The aim of this study was to assess the ability and efficiency of the automatic analysis software CytoCamTools 1.7.12 (CC) to measure microvascular parameters in comparison with Automated Vascular Analysis (AVA) software 3.2. 22 patients admitted to the cardiothoracic intensive care unit following cardiac surgery were prospectively enrolled. Sublingual microcirculatory videos were analysed using AVA and CC software. The total vessel density (TVD) for small vessels, perfused vessel density (PVD) and proportion of perfused vessels (PPV) were calculated. Blood flow was assessed using the microvascular flow index (MFI) for AVA software and the averaged perfused speed indicator (APSI) for the CC software. The duration of the analysis was also recorded. Eighty-four videos from 22 patients were analysed. The bias between TVD-CC and TVD-AVA was 2.20 mm/mm 2 (95 % CI 1.37-3.03) with limits of agreement (LOA) of -4.39 (95 % CI -5.66 to -3.16) and 8.79 (95 % CI 7.50-10.01) mm/mm 2 . The percentage error (PE) for TVD was ±32.2 %. TVD was positively correlated between CC and AVA (r = 0.74, p < 0.001). The bias between PVD-CC and PVD-AVA was 6.54 mm/mm 2 (95 % CI 5.60-7.48) with LOA of -4.25 (95 % CI -8.48 to -0.02) and 17.34 (95 % CI 13.11-21.57) mm/mm 2 . The PE for PVD was ±61.2 %. PVD was positively correlated between CC and AVA (r = 0.66, p < 0.001). The median PPV-AVA was significantly higher than the median PPV-CC [97.39 % (95.25, 100 %) vs. 81.65 % (61.97, 88.99), p < 0.0001]. MFI categories cannot estimate or predict APSI values (p = 0.45). The time required for the analysis was shorter with CC than with AVA system [2'42″ (2'12″, 3'31″) vs. 16'12″ (13'38″, 17'57″), p < 0.001]. TVD is comparable between the two softwares, although faster with CC software. The values for PVD and PPV are not interchangeable given the
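
    A hedged sketch of the agreement statistics quoted above: Bland-Altman bias, limits of agreement (bias ± 1.96 SD) and percentage error for paired TVD measurements from two analysis packages. The paired values below are invented; only the method mirrors the abstract.

      import numpy as np

      tvd_cc  = np.array([20.1, 18.7, 22.4, 19.9, 21.3, 17.8])   # mm/mm^2 (made up)
      tvd_ava = np.array([17.5, 16.9, 20.0, 18.2, 19.0, 16.1])   # mm/mm^2 (made up)

      diff = tvd_cc - tvd_ava
      bias = diff.mean()
      sd = diff.std(ddof=1)
      loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
      pct_error = 1.96 * sd / np.mean((tvd_cc + tvd_ava) / 2) * 100

      print(f"bias = {bias:.2f}, LOA = [{loa_low:.2f}, {loa_high:.2f}], "
            f"percentage error = {pct_error:.1f}%")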

  7. BAM/DASS: Data Analysis Software for Sub-Microarcsecond Astrometry Device

    NASA Astrophysics Data System (ADS)

    Gardiol, D.; Bonino, D.; Lattanzi, M. G.; Riva, A.; Russo, F.

    2010-12-01

    The INAF - Osservatorio Astronomico di Torino is part of the Data Processing and Analysis Consortium (DPAC) for Gaia, a cornerstone mission of the European Space Agency. Gaia will perform global astrometry by means of two telescopes looking at the sky along two different lines of sight oriented at a fixed angle, also called the basic angle. Knowledge of the basic angle fluctuations at the sub-microarcsecond level over periods on the order of one minute is crucial for reaching the mission goals. A specific device, the Basic Angle Monitoring, will be dedicated to this purpose. We present here the software system we are developing to analyze the BAM data and recover the basic angle variations. This tool is integrated into the whole DPAC data analysis software.

  8. Final report on the analyses of traffic accidents : Fast-Trac--phase 3, deliverable. Semi-annual reports on total accidents : trends, types and analysis of before and after studies

    DOT National Transportation Integrated Search

    1996-12-01

    This report contains the results of an analysis of traffic accidents in the City of Troy, Michigan, where the Sydney Coordinated Adaptive Traffic System (SCATS) was deployed as part of a federal demonstration program. The analyses include a ...

  9. [The analysis of threshold effect using Empower Stats software].

    PubMed

    Lin, Lin; Chen, Chang-zhong; Yu, Xiao-dan

    2013-11-01

    In many biomedical studies, a factor has no influence, or has a positive effect, on the outcome variable only within a certain range; beyond a certain threshold value, the size and/or direction of the effect changes, which is called a threshold effect. Whether a threshold effect exists in the relationship between a factor (x) and the outcome variable (y) can be examined by smooth curve fitting to see whether the relationship is piecewise linear, and then analyzed using a segmented regression model, a likelihood ratio test (LRT), and bootstrap resampling. Empower Stats software, developed by X & Y Solutions Inc. (USA), has a threshold effect analysis module. The user may either specify the threshold value, in which case the data are fitted with the given segmentation, or leave it unspecified, in which case the software determines the optimal threshold automatically and calculates its confidence interval.
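
    The segmented-regression idea behind the threshold module can be sketched outside Empower Stats as well. The minimal sketch below, under the assumption of a single unknown breakpoint and purely synthetic data, scans candidate thresholds and keeps the two-segment linear fit with the smallest residual sum of squares; all names and values are illustrative:

    ```python
    import numpy as np

    def fit_segmented(x, y, knot):
        """Least-squares fit of y = b0 + b1*x + b2*max(x - knot, 0); returns (coef, rss)."""
        X = np.column_stack([np.ones_like(x), x, np.clip(x - knot, 0.0, None)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ coef
        return coef, float(resid @ resid)

    def find_threshold(x, y, n_grid=81):
        """Grid-search the breakpoint minimising the residual sum of squares."""
        best = None
        for knot in np.quantile(x, np.linspace(0.1, 0.9, n_grid)):
            coef, rss = fit_segmented(x, y, knot)
            if best is None or rss < best[0]:
                best = (rss, knot, coef)
        return best[1], best[2]

    # Hypothetical exposure/outcome data with a change of slope near x = 5.
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, 300)
    y = 1.0 + 0.1 * x + 0.8 * np.clip(x - 5.0, 0.0, None) + rng.normal(0, 0.2, 300)
    knot, coef = find_threshold(x, y)
    print("estimated threshold:", round(float(knot), 2),
          "slopes below/above:", round(coef[1], 2), round(coef[1] + coef[2], 2))
    ```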

  10. Software for Data Analysis with Graphical Models

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.; Roy, H. Scott

    1994-01-01

    Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  11. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  12. Development of multivariate exposure and fatal accident involvement rates for 1977

    DOT National Transportation Integrated Search

    1985-10-01

    The need for multivariate accident involvement rates is often encountered in accident analysis. The FARS (Fatal Accident Reporting System) files contain records of fatal involvements characterized by many variables while NPTS (National Personal T...

  13. [Automobile Traffic Accident Death Case Analysis of Characteristics of Driver Injury].

    PubMed

    Du, Y L; Zhang, W L

    2017-02-01

    To distinguish the injury characteristics of drivers from those of passengers in traffic accidents, and to provide scientific evidence for confirming the identity of the driver. Data from 126 automobile traffic accident death cases in the reclamation areas of Heilongjiang province from 2006-2014 were retrospectively studied. The injury characteristics of the drivers in these fatal accidents were analyzed, and the forensic identification problems in distinguishing driver injuries from passenger injuries were discussed. Injuries were frequently observed on the driver's neck, chest and abdomen. Characteristic injuries caused by vehicle components were also found where the passenger's head, face and limbs contacted the automobile; such characteristic injuries were not found at other locations. The location and type of injury are associated with the identity of the deceased. Copyright© by the Editorial Department of Journal of Forensic Medicine

  14. Review the number of accidents in Tehran over a two-year period and prediction of the number of events based on a time-series model

    PubMed Central

    Teymuri, Ghulam Heidar; Sadeghian, Marzieh; Kangavari, Mehdi; Asghari, Mehdi; Madrese, Elham; Abbasinia, Marzieh; Ahmadnezhad, Iman; Gholizadeh, Yavar

    2013-01-01

    Background: One of the significant dangers that threaten people’s lives is the increased risk of accidents. Annually, more than 1.3 million people die around the world as a result of accidents, and it has been estimated that approximately 300 deaths occur daily due to traffic accidents in the world, with more than 50% of that number being people who were not even passengers in the cars. The aim of this study was to examine traffic accidents in Tehran and forecast the number of future accidents using a time-series model. Methods: The study was a cross-sectional study that was conducted in 2011. The sample population was all traffic accidents that caused death and physical injuries in Tehran in 2010 and 2011, as registered in the Tehran Emergency ward. The present study used Minitab 15 software to provide a description of accidents in Tehran for the specified time period as well as to predict those expected during April 2012. Results: The results indicated that the average number of daily traffic accidents in Tehran in 2010 was 187 with a standard deviation of 83.6. In 2011, there was an average of 180 daily traffic accidents with a standard deviation of 39.5. One-way analysis of variance indicated that the average number of accidents in the city was different for different months of the year (P < 0.05). Most of the accidents occurred in March, July, August, and September. Thus, more accidents occurred in the summer than in the other seasons. The number of accidents for April 2012 was predicted using an autoregressive moving average (ARMA) model. The number of accidents displayed a seasonal trend. The prediction of the number of accidents in the city during April of 2012 indicated that a total of 4,459 accidents would occur, with a mean of 149 accidents per day during that month. Conclusion: The number of accidents in Tehran displayed a seasonal trend, and the number of accidents was different for different seasons of the year. PMID:26120405
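
    As a rough illustration of the ARMA forecasting step described above (the study itself used Minitab; the series below is simulated, not the Tehran data), a comparable fit can be sketched with statsmodels:

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # Simulated daily accident counts with a mild seasonal swing (illustrative only).
    rng = np.random.default_rng(1)
    days = pd.date_range("2010-01-01", periods=730, freq="D")
    counts = 180 + 30 * np.sin(2 * np.pi * days.dayofyear / 365.25) + rng.normal(0, 20, 730)
    series = pd.Series(counts, index=days)

    # An ARMA(p, q) model is ARIMA(p, 0, q); the order here is chosen arbitrarily.
    fit = ARIMA(series, order=(2, 0, 2)).fit()
    forecast = fit.forecast(steps=30)          # predicted counts for the next 30 days
    print(forecast.head())
    print("predicted monthly total:", int(forecast.sum()))
    ```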

  15. Sleepiness and sleep-disordered breathing in truck drivers : risk analysis of road accidents.

    PubMed

    Catarino, Rosa; Spratley, Jorge; Catarino, Isabel; Lunet, Nuno; Pais-Clemente, Manuel

    2014-03-01

    Portugal has one of the highest road traffic fatality rates in Europe. A clear association between sleep-disordered breathing (SDB) and traffic accidents has been previously demonstrated. This study aimed to determine prevalence of excessive daytime sleepiness (EDS) and other sleep disorder symptoms among truck drivers and to identify which individual traits and work habits are associated to increased sleepiness and accident risk. We evaluated a sample of 714 truck drivers using a questionnaire (244 face-to-face interviews, 470 self-administered) that included sociodemographic data, personal habits, previous accidents, Epworth Sleepiness Scale (ESS), and the Berlin questionnaire (BQ). Twenty percent of drivers had EDS and 29 % were at high risk for having obstructive sleep apnea syndrome (OSAS). Two hundred sixty-one drivers (36.6 %) reported near-miss accidents (42.5 % sleep related) and 264 (37.0 %), a driving accident (16.3 % sleep related). ESS score ≥ 11 was a risk factor for both near-miss accidents (odds ratio (OR)=3.84, p<0.01) and accidents (OR=2.25, p<0.01). Antidepressant use was related to accidents (OR=3.30, p=0.03). We found an association between high Mallampati score (III-IV) and near misses (OR=1.89, p=0.04). In this sample of Portuguese truck drivers, we observed a high prevalence of EDS and other sleep disorder symptoms. Accident risk was related to sleepiness and antidepressant use. Identifying drivers at risk for OSAS should be a major priority of medical assessment centers, as a public safety policy.

  16. FTOOLS: A FITS Data Processing and Analysis Software Package

    NASA Astrophysics Data System (ADS)

    Blackburn, J. K.

    FTOOLS, a highly modular collection of over 110 utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Science Archive Research Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task such as presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. The collection of utilities provides both generic processing and analysis utilities and utilities specific to high energy astrophysics data sets used for the ASCA, ROSAT, GRO, and XTE missions. A core set of FTOOLS providing support for generic FITS data processing, FITS image analysis and timing analysis can easily be split out of the full software package for users not needing the high energy astrophysics mission utilities. The FTOOLS software package is designed to be both compatible with IRAF and completely stand alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self documenting through the IRAF help facility and a stand alone help task. Software is written in ANSI C and Fortran to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.

  17. Traffic accident reconstruction and an approach for prediction of fault rates using artificial neural networks: A case study in Turkey.

    PubMed

    Can Yilmaz, Ali; Aci, Cigdem; Aydin, Kadir

    2016-08-17

    Currently, in Turkey, fault rates in traffic accidents are determined according to the initiative of accident experts (no speed analyses of vehicles just considering accident type) and there are no specific quantitative instructions on fault rates related to procession of accidents which just represents the type of collision (side impact, head to head, rear end, etc.) in No. 2918 Turkish Highway Traffic Act (THTA 1983). The aim of this study is to introduce a scientific and systematic approach for determination of fault rates in most frequent property damage-only (PDO) traffic accidents in Turkey. In this study, data (police reports, skid marks, deformation, crush depth, etc.) collected from the most frequent and controversial accident types (4 sample vehicle-vehicle scenarios) that consist of PDO were inserted into a reconstruction software called vCrash. Sample real-world scenarios were simulated on the software to generate different vehicle deformations that also correspond to energy-equivalent speed data just before the crash. These values were used to train a multilayer feedforward artificial neural network (MFANN), function fitting neural network (FITNET, a specialized version of MFANN), and generalized regression neural network (GRNN) models within 10-fold cross-validation to predict fault rates without using software. The performance of the artificial neural network (ANN) prediction models was evaluated using mean square error (MSE) and multiple correlation coefficient (R). It was shown that the MFANN model performed better for predicting fault rates (i.e., lower MSE and higher R) than FITNET and GRNN models for accident scenarios 1, 2, and 3, whereas FITNET performed the best for scenario 4. The FITNET model showed the second best results for prediction for the first 3 scenarios. Because there is no training phase in GRNN, the GRNN model produced results much faster than MFANN and FITNET models. However, the GRNN model had the worst prediction results. The
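
    A hedged sketch of the evaluation loop described above: a small feedforward regressor (scikit-learn's MLPRegressor standing in for the MFANN) assessed with 10-fold cross-validation using MSE and the correlation coefficient R. The features and targets are invented placeholders, not the vCrash deformation or energy-equivalent-speed data:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import KFold
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(42)

    # Placeholder features (e.g. crush depth and speeds of the two vehicles) and a fault-rate target.
    X = rng.uniform(0, 1, size=(200, 3))
    y = 100 * (0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2]) + rng.normal(0, 2, 200)

    mse_scores, r_scores = [], []
    for train, test in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
        model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
        model.fit(X[train], y[train])
        pred = model.predict(X[test])
        mse_scores.append(mean_squared_error(y[test], pred))
        r_scores.append(np.corrcoef(y[test], pred)[0, 1])

    print("mean MSE:", np.mean(mse_scores), " mean R:", np.mean(r_scores))
    ```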

  18. Investigations on optimization of accident management measures following a station blackout accident in a VVER-1000 pressurized water reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tusheva, P.; Schaefer, F.; Kliem, S.

    2012-07-01

    Reactor safety issues are of primary importance for preserving the health of the population and ensuring no release of radioactivity and fission products into the environment. Part of nuclear research focuses on improving the safety of existing nuclear power plants. Studies, research and development are a continuing effort to improve the safety and reliability of existing and newly developed nuclear power plants and to prevent core melt accidents. Station blackout (loss of AC power supply) is one of the dominant accidents taken into consideration when performing accident analysis. In case of multiple failures of safety systems it leads to a severe accident. To prevent an accident from turning into a severe one, or to mitigate the consequences, accident management measures must be performed. The present paper outlines possibilities for application and optimization of accident management measures following a station blackout accident. Assessed is the behaviour of the nuclear power plant during a station blackout accident without accident management measures and with application of primary/secondary side oriented accident management measures. Discussed are the possibilities for operators' intervention and the influence of the performed accident management measures on the course of the accident. Special attention has been paid to the effectiveness of the passive feeding and the physical phenomena influencing the system behaviour. The performed simulations show that the effectiveness of the secondary side feeding procedure can be limited due to early evaporation or flashing effects in the feed water system. The analyzed cases show that the effectiveness of the accident management measures strongly depends on the initiation criteria applied for depressurization of the reactor coolant system. (authors)

  19. Offsite Radiological Consequence Analysis for the Bounding Flammable Gas Accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CARRO, C.A.

    2003-07-30

    This document quantifies the offsite radiological consequences of the bounding flammable gas accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. The bounding flammable gas accident is a detonation in a single-shell tank. The calculation applies reasonably conservative input parameters in accordance with DOE-STD-3009, Appendix A, guidance. Revision 1 incorporates comments received from the Office of River Protection.

  20. SUBSONIC WIND TUNNEL PERFORMANCE ANALYSIS SOFTWARE

    NASA Technical Reports Server (NTRS)

    Eckert, W. T.

    1994-01-01

    This program was developed as an aid in the design and analysis of subsonic wind tunnels. It brings together and refines previously scattered and over-simplified techniques used for the design and loss prediction of the components of subsonic wind tunnels. It implements a system of equations for determining the total pressure losses and provides general guidelines for the design of diffusers, contractions, corners and the inlets and exits of non-return tunnels. The algorithms used in the program are applicable to compressible flow through most closed- or open-throated, single-, double- or non-return wind tunnels or ducts. A comparison between calculated performance and that actually achieved by several existing facilities produced generally good agreement. Any system through which air is flowing which involves turns, fans, contractions etc. (e.g., an HVAC system) may benefit from analysis using this software. This program is an update of ARC-11138 which includes PC compatibility and an improved user interface. The method of loss analysis used by the program is a synthesis of theoretical and empirical techniques. Generally, the algorithms used are those which have been substantiated by experimental test. The basic flow-state parameters used by the program are determined from input information about the reference control section and the test section. These parameters were derived from standard relationships for compressible flow. The local flow conditions, including Mach number, Reynolds number and friction coefficient are determined for each end of each component or section. The loss in total pressure caused by each section is calculated in a form non-dimensionalized by local dynamic pressure. The individual losses are based on the nature of the section, local flow conditions and input geometry and parameter information. The loss forms for typical wind tunnel sections considered by the program include: constant area ducts, open throat ducts, contractions, constant
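
    For one of the simplest component types mentioned above, a constant-area duct, the loss form (total-pressure loss non-dimensionalized by the local dynamic pressure) reduces to a friction term. The sketch below uses the incompressible Darcy relation with the Haaland friction-factor approximation; this is only an analogy to the program's compressible method, and all input values are illustrative:

    ```python
    import math

    def duct_loss(length_m, hyd_diam_m, velocity_ms,
                  rho=1.225, mu=1.81e-5, roughness_m=1.5e-4):
        """Loss coefficient K = delta_p0 / q and total-pressure loss of a constant-area duct."""
        re = rho * velocity_ms * hyd_diam_m / mu                     # Reynolds number
        # Haaland explicit approximation to the Colebrook friction factor.
        f = (-1.8 * math.log10((roughness_m / hyd_diam_m / 3.7) ** 1.11 + 6.9 / re)) ** -2
        k = f * length_m / hyd_diam_m                                # delta_p0 / q
        q = 0.5 * rho * velocity_ms ** 2                             # local dynamic pressure, Pa
        return k, k * q

    k, dp0 = duct_loss(length_m=10.0, hyd_diam_m=2.0, velocity_ms=40.0)
    print(f"K = {k:.4f}, total-pressure loss = {dp0:.1f} Pa")
    ```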

  1. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  2. Software applications for flux balance analysis.

    PubMed

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

    Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user-interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and inter-operability. Overall, most of the applications are able to perform basic features of model creation and FBA simulation. COBRA toolbox, OptFlux and FASIMU are versatile to support advanced in silico algorithms to identify environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are the distinct tools which provide the user friendly interfaces in model handling. In terms of software architecture, FBA-SimVis and OptFlux have the flexible environments as they enable the plug-in/add-on feature to aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services such as central model repository and assistance to collaborative efforts was observed among the web-based applications with the help of advanced web-technologies. Furthermore, most recent applications such as the Model SEED, FAME, MetaFlux and MicrobesFlux have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, a brief discussion on the future directions of FBA applications was made for the benefit of potential tool developers.
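
    At its core, every FBA tool listed above solves the same linear program: maximize an objective flux subject to the steady-state constraint S·v = 0 and flux bounds. A toy sketch with scipy on a three-reaction network invented for illustration (real tools operate on genome-scale SBML models):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network: uptake (R1) -> conversion (R2) -> biomass (R3).
    # Rows are metabolites A and B; columns are the three reactions.
    S = np.array([[1.0, -1.0,  0.0],    # metabolite A
                  [0.0,  1.0, -1.0]])   # metabolite B

    bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 units
    c = np.array([0.0, 0.0, -1.0])             # maximise R3 (linprog minimises)

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("optimal flux distribution:", res.x)  # expected: [10, 10, 10]
    ```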

  3. Acceptable Risk Analysis for Abrupt Environmental Pollution Accidents in Zhangjiakou City, China.

    PubMed

    Du, Xi; Zhang, Zhijiao; Dong, Lei; Liu, Jing; Borthwick, Alistair G L; Liu, Renzhi

    2017-04-20

    Abrupt environmental pollution accidents cause considerable damage worldwide to the ecological environment, human health, and property. The concept of acceptable risk aims to answer whether or not a given environmental pollution risk exceeds a societally determined criterion. This paper presents a case study on acceptable environmental pollution risk conducted through a questionnaire survey carried out between August and October 2014 in five representative districts and two counties of Zhangjiakou City, Hebei Province, China. Here, environmental risk primarily arises from accidental water pollution, accidental air pollution, and tailings dam failure. Based on 870 valid questionnaires, demographic and regional differences in public attitudes towards abrupt environmental pollution risks were analyzed, and risk acceptance impact factors determined. The results showed females, people between 21-40 years of age, people with higher levels of education, public servants, and people with higher income had lower risk tolerance. People with lower perceived risk, low-level risk knowledge, high-level familiarity and satisfaction with environmental management, and without experience of environmental accidents had higher risk tolerance. Multiple logistic regression analysis indicated that public satisfaction with environmental management was the most significant factor in risk acceptance, followed by perceived risk of abrupt air pollution, occupation, perceived risk of tailings dam failure, and sex. These findings should be helpful to local decision-makers concerned with environmental risk management (e.g., selecting target groups for effective risk communication) in the context of abrupt environmental accidents.
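
    The multiple logistic regression step mentioned above can be illustrated in miniature. In the sketch below, acceptance is coded as a binary outcome and two simulated predictors are used; the variable names, coding and coefficients are assumptions for illustration, not the survey's actual items or results:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 870   # same size as the number of valid questionnaires

    # Hypothetical 1-5 ratings: satisfaction with environmental management and perceived air-pollution risk.
    satisfaction = rng.integers(1, 6, n)
    perceived_risk = rng.integers(1, 6, n)

    # Hypothetical binary outcome: 1 = risk judged acceptable.
    logit_p = -1.0 + 0.6 * satisfaction - 0.4 * perceived_risk
    accept = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

    X = sm.add_constant(np.column_stack([satisfaction, perceived_risk]))
    fit = sm.Logit(accept, X).fit(disp=False)
    print(fit.params)              # intercept and coefficients on the log-odds scale
    print(np.exp(fit.params[1:]))  # odds ratios for the two predictors
    ```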

  4. Acceptable Risk Analysis for Abrupt Environmental Pollution Accidents in Zhangjiakou City, China

    PubMed Central

    Du, Xi; Zhang, Zhijiao; Dong, Lei; Liu, Jing; Borthwick, Alistair G. L.; Liu, Renzhi

    2017-01-01

    Abrupt environmental pollution accidents cause considerable damage worldwide to the ecological environment, human health, and property. The concept of acceptable risk aims to answer whether or not a given environmental pollution risk exceeds a societally determined criterion. This paper presents a case study on acceptable environmental pollution risk conducted through a questionnaire survey carried out between August and October 2014 in five representative districts and two counties of Zhangjiakou City, Hebei Province, China. Here, environmental risk primarily arises from accidental water pollution, accidental air pollution, and tailings dam failure. Based on 870 valid questionnaires, demographic and regional differences in public attitudes towards abrupt environmental pollution risks were analyzed, and risk acceptance impact factors determined. The results showed females, people between 21–40 years of age, people with higher levels of education, public servants, and people with higher income had lower risk tolerance. People with lower perceived risk, low-level risk knowledge, high-level familiarity and satisfaction with environmental management, and without experience of environmental accidents had higher risk tolerance. Multiple logistic regression analysis indicated that public satisfaction with environmental management was the most significant factor in risk acceptance, followed by perceived risk of abrupt air pollution, occupation, perceived risk of tailings dam failure, and sex. These findings should be helpful to local decision-makers concerned with environmental risk management (e.g., selecting target groups for effective risk communication) in the context of abrupt environmental accidents. PMID:28425956

  5. Analysis of Incident and Accident Reports and Risk Management in Spine Surgery.

    PubMed

    Kobayashi, Kazuyoshi; Imagama, Shiro; Ando, Kei; Hida, Tetsuro; Ito, Kenyu; Tsushima, Mikito; Ishikawa, Yoshimoto; Matsumoto, Akiyuki; Morozumi, Masayoshi; Nishida, Yoshihiro; Nagao, Yoshimasa; Ishiguro, Naoki

    2017-08-01

    A review of accident and incident reports. To analyze prevalence, characteristics, and details of perioperative incidents and accidents in patients receiving spine surgery. In our institution, a clinical error that potentially results in an adverse event is usually submitted as an incident or accident report through a web database, to ensure anonymous and blame-free reporting. All reports are analyzed by a medical safety management group. These reports contain valuable data for management of medical safety, but there have been no studies evaluating such data for spine surgery. A total of 320 incidents and accidents that occurred perioperatively in 172 of 415 spine surgeries were included in the study. Incidents were defined as events that were "problematic, but with no damage to the patient," and accidents as events "with damage to the patient." The details of these events were analyzed. There were 278 incidents in 137 surgeries and 42 accidents in 35 surgeries, giving prevalences of 33% (137/415) and 8% (35/415), respectively. The proportion of accidents among all events was significantly higher for doctors than non-doctors [68.0% (17/25) vs. 8.5% (25/295), P < 0.01] and in the operating room compared with outside the operating room [40.5% (15/37) vs. 9.5% (27/283), P < 0.01]. There was no significant difference in years of experience among personnel involved in all events. The major types of events were medication-related, line and tube problems, and falls and slips. Accidents also occurred because of prolonged prone positioning, with complications such as laryngeal edema, ulnar nerve palsy, and tooth damage. Surgery and procedures in the operating room always have a risk of complications. Therefore, a particular effort is needed to establish safe management of this environment and to provide advice on risk to the doctor and medical care team. Level of evidence: 4.

  6. Comparative analysis of the countermeasures taken to mitigate exposure of the public to radioiodine following the Chernobyl and Fukushima accidents: lessons from both accidents.

    PubMed

    Uyba, Vladimir; Samoylov, Alexander; Shinkarev, Sergey

    2018-04-01

    In the case of a severe radiation accident at a nuclear power station, the most important radiation hazard for the public is internal exposure of the thyroid to radioiodine. The purposes of this paper were (i) to compare countermeasures conducted (following the Chernobyl and Fukushima accidents) aimed at mitigation of exposure to the thyroid for the public, (ii) to present comparative estimates of doses to the thyroid and (iii) to derive lessons from the two accidents. The scale and time of countermeasures applied in the early phase of the accidents (sheltering, evacuation, and intake of stable iodine to block the thyroid) and at a later time (control of 131I concentration in foodstuffs) have been described. After the Chernobyl accident, the estimation of the thyroid doses for the public was mainly based on direct thyroid measurements of ~400 000 residents carried out within the first 2 months. The highest estimates of thyroid doses to children reached 50 Gy. After the Fukushima accident, the estimation of thyroid doses was based on radioecological models due to a lack of direct thyroid measurements (only slightly more than 1000 residents were measured). The highest estimates of thyroid doses to children were a few hundred mGy. Following the Chernobyl accident, ingestion of 131I through cows' milk was the dominant pathway. Following the Fukushima accident, it appears that inhalation of contaminated air was the dominant pathway. Some lessons learned following the Chernobyl and Fukushima accidents have been presented in this paper.

  7. CatReg Software for Categorical Regression Analysis (May 2016)

    EPA Science Inventory

    CatReg 3.0 is a Microsoft Windows enhanced version of the Agency’s categorical regression analysis (CatReg) program. CatReg complements EPA’s existing Benchmark Dose Software (BMDS) by greatly enhancing a risk assessor’s ability to determine whether data from separate toxicologic...

  8. Study of Benefits of Passenger Protective Breathing Equipment from Analysis of Past Accidents

    DTIC Science & Technology

    1988-03-01

    [OCR fragments only: references to the Los Rodeos (Tenerife) accident of 27 March 1977 and ICAO Aircraft Accident Digest No. 23; accident profiles for a 5 March 1967 Varig DC-8 and an 8 April 1968 British Overseas Airways flight; C-133 test article trials under postcrash fire conditions; and a note that a delay in donning passenger protective breathing equipment (PBE) may have resulted in a net disbenefit.]

  9. [Application of Stata software to test heterogeneity in meta-analysis method].

    PubMed

    Wang, Dan; Mou, Zhen-yun; Zhai, Jun-xia; Zong, Hong-xia; Zhao, Xiao-dong

    2008-07-01

    To introduce the application of Stata software to heterogeneity testing in meta-analysis. A data set was set up according to the example in the study, and the corresponding commands in Stata 9 were applied to test the example. The methods used were the Q-test and the I2 statistic attached to the fixed-effect forest plot, the H statistic, and the Galbraith plot. The existence of heterogeneity among studies could be detected with the Q-test and the H statistic, and its degree with the I2 statistic. The outliers that were the sources of the heterogeneity could be spotted in the Galbraith plot. Heterogeneity testing in meta-analysis can be completed simply and quickly with these four methods in Stata. Among the four methods, the H and I2 statistics are more robust, and the outliers contributing to heterogeneity can be seen clearly in the Galbraith plot.
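
    The Q, H, and I2 statistics named above are simple functions of the study effect sizes and their variances; Stata computes them internally, but a brief sketch of the arithmetic (with made-up effect estimates) is:

    ```python
    import numpy as np
    from scipy.stats import chi2

    def heterogeneity(effects, variances):
        """Cochran's Q (with p-value), H and I^2 for a fixed-effect meta-analysis."""
        effects, variances = np.asarray(effects, float), np.asarray(variances, float)
        w = 1.0 / variances                          # inverse-variance weights
        pooled = np.sum(w * effects) / np.sum(w)     # fixed-effect pooled estimate
        q = float(np.sum(w * (effects - pooled) ** 2))
        df = len(effects) - 1
        h = np.sqrt(q / df)
        i2 = max(0.0, (q - df) / q) * 100.0          # percent of variation due to heterogeneity
        return q, chi2.sf(q, df), h, i2

    # Illustrative log odds ratios and variances from five hypothetical studies.
    print(heterogeneity([0.20, 0.35, 0.10, 0.60, 0.05],
                        [0.02, 0.05, 0.03, 0.04, 0.02]))
    ```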

  10. PolyPhred analysis software for mutation detection from fluorescence-based sequence data.

    PubMed

    Montgomery, Kate T; Iartchouck, Oleg; Li, Li; Loomis, Stephanie; Obourn, Vanessa; Kucherlapati, Raju

    2008-10-01

    The ability to search for genetic variants that may be related to human disease is one of the most exciting consequences of the availability of the sequence of the human genome. Large cohorts of individuals exhibiting certain phenotypes can be studied and candidate genes resequenced. However, the challenge of analyzing sequence data from many individuals with accuracy, speed, and economy is great. This unit describes one set of software tools: Phred, Phrap, PolyPhred, and Consed. Coverage includes the advantages and disadvantages of these analysis tools, details for obtaining and using the software, and the results one may expect. The software is being continually updated to permit further automation of mutation analysis. Currently, however, at least some manual review is required if one wishes to identify 100% of the variants in a sample set.

  11. The Fukushima radiation accident: consequences for radiation accident medical management.

    PubMed

    Meineke, Viktor; Dörr, Harald

    2012-08-01

    The March 2011 radiation accident in Fukushima, Japan, is a textbook example of a radiation accident of global significance. In view of the global dimensions of the accident, it is important to consider the lessons learned. In this context, emphasis must be placed on consequences for planning appropriate medical management for radiation accidents including, for example, estimates of necessary human and material resources. The specific characteristics of the radiation accident in Fukushima are thematically divided into five groups: the exceptional environmental influences on the Fukushima radiation accident, particular circumstances of the accident, differences in risk perception, changed psychosocial factors in the age of the Internet and globalization, and the ignorance of the effects of ionizing radiation both among the general public and health care professionals. Conclusions like the need for reviewing international communication, interfacing, and interface definitions will be drawn from the Fukushima radiation accident.

  12. The effects of aircraft certification rules on general aviation accidents

    NASA Astrophysics Data System (ADS)

    Anderson, Carolina Lenz

    The purpose of this study was to analyze the frequency of general aviation airplane accidents and accident rates on the basis of aircraft certification to determine whether or not differences in aircraft certification rules had an influence on accidents. In addition, the narrative cause descriptions contained within the accident reports were analyzed to determine whether there were differences in the qualitative data for the different certification categories. The certification categories examined were: Federal Aviation Regulations Part 23, Civil Air Regulations 3, Light Sport Aircraft, and Experimental-Amateur Built. The accident causes examined were those classified as: Loss of Control, Controlled Flight into Terrain, Engine Failure, and Structural Failure. Airworthiness certification categories represent a wide diversity of government oversight. Part 23 rules have evolved from the initial set of simpler design standards and have progressed into a comprehensive and strict set of rules to address the safety issues of the more complex airplanes within the category. Experimental-Amateur Built airplanes have the least amount of government oversight and are the fastest growing segment. The Light Sport Aircraft category is a more recent certification category that utilizes consensus standards in the approval process. Civil Air Regulations 3 airplanes were designed and manufactured under simpler rules but modifying these airplanes has become lengthy and expensive. The study was conducted using a mixed methods methodology which involves both quantitative and qualitative elements. A Chi-Square test was used for a quantitative analysis of the accident frequency among aircraft certification categories. Accident rate analysis of the accidents among aircraft certification categories involved an ANCOVA test. The qualitative component involved the use of text mining techniques for the analysis of the narrative cause descriptions contained within the accident reports. The Chi
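
    The Chi-Square comparison of accident frequency across certification categories is a standard contingency-table test; a minimal sketch with invented counts (not the study's data):

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    # Rows: certification category (Part 23, CAR 3, LSA, E-AB).
    # Columns: accident cause (loss of control, CFIT, engine failure, structural failure).
    counts = np.array([[120, 15,  60,  5],
                       [200, 25,  90, 10],
                       [ 80,  8,  40,  6],
                       [260, 20, 150, 14]])

    chi2_stat, p_value, dof, expected = chi2_contingency(counts)
    print(f"chi2 = {chi2_stat:.1f}, dof = {dof}, p = {p_value:.3g}")
    ```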

  13. Automated ultrasound edge-tracking software comparable to established semi-automated reference software for carotid intima-media thickness analysis.

    PubMed

    Shenouda, Ninette; Proudfoot, Nicole A; Currie, Katharine D; Timmons, Brian W; MacDonald, Maureen J

    2018-05-01

    Many commercial ultrasound systems are now including automated analysis packages for the determination of carotid intima-media thickness (cIMT); however, details regarding their algorithms and methodology are not published. Few studies have compared their accuracy and reliability with previously established automated software, and those that have were in asymptomatic adults. Therefore, this study compared cIMT measures from a fully automated ultrasound edge-tracking software (EchoPAC PC, Version 110.0.2; GE Medical Systems, Horten, Norway) to an established semi-automated reference software (Artery Measurement System (AMS) II, Version 1.141; Gothenburg, Sweden) in 30 healthy preschool children (ages 3-5 years) and 27 adults with coronary artery disease (CAD; ages 48-81 years). For both groups, Bland-Altman plots revealed good agreement with a negligible mean cIMT difference of -0·03 mm. Software differences were statistically, but not clinically, significant for preschool images (P = 0·001) and were not significant for CAD images (P = 0·09). Intra- and interoperator repeatability was high and comparable between software for preschool images (ICC, 0·90-0·96; CV, 1·3-2·5%), but slightly higher with the automated ultrasound than the semi-automated reference software for CAD images (ICC, 0·98-0·99; CV, 1·4-2·0% versus ICC, 0·84-0·89; CV, 5·6-6·8%). These findings suggest that the automated ultrasound software produces valid cIMT values in healthy preschool children and adults with CAD. Automated ultrasound software may be useful for ensuring consistency among multisite research initiatives or large cohort studies involving repeated cIMT measures, particularly in adults with documented CAD. © 2017 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.

  14. Accidents of Electrical and Mechanical Works for Public Sector Projects in Hong Kong.

    PubMed

    Wong, Francis K W; Chan, Albert P C; Wong, Andy K D; Hon, Carol K H; Choi, Tracy N Y

    2018-03-10

    A study on electrical and mechanical (E&M) works-related accidents for public sector projects provided the opportunity to gain a better understanding of the causes of accidents by analyzing the circumstances of all E&M works accidents. The research aims to examine accidents of E&M works which happened in public sector projects. A total of 421 E&M works-related accidents in the "Public Works Programme Construction Site Safety and Environmental Statistics" (PCSES) system were extracted for analysis. Two-step cluster analysis was conducted to classify the E&M accidents into different groups. The results identified three E&M accidents groups: (1) electricians with over 15 years of experience were prone to 'fall of person from height'; (2) electricians with zero to five years of experience were prone to 'slip, trip or fall on same level'; (3) air-conditioning workers with zero to five years of experience were prone to multiple types of accidents. Practical measures were recommended for each specific cluster group to avoid recurrence of similar accidents. The accident analysis would be vital for industry practitioners to enhance the safety performance of public sector projects. This study contributes to filling the knowledge gap of how and why E&M accidents occur and promulgating preventive measures for E&M accidents which have been under researched.

  15. Accidents of Electrical and Mechanical Works for Public Sector Projects in Hong Kong

    PubMed Central

    Wong, Francis K. W.; Chan, Albert P. C.; Wong, Andy K. D.; Choi, Tracy N. Y.

    2018-01-01

    A study on electrical and mechanical (E&M) works-related accidents for public sector projects provided the opportunity to gain a better understanding of the causes of accidents by analyzing the circumstances of all E&M works accidents. The research aims to examine accidents of E&M works which happened in public sector projects. A total of 421 E&M works-related accidents in the “Public Works Programme Construction Site Safety and Environmental Statistics” (PCSES) system were extracted for analysis. Two-step cluster analysis was conducted to classify the E&M accidents into different groups. The results identified three E&M accidents groups: (1) electricians with over 15 years of experience were prone to ‘fall of person from height’; (2) electricians with zero to five years of experience were prone to ‘slip, trip or fall on same level’; (3) air-conditioning workers with zero to five years of experience were prone to multiple types of accidents. Practical measures were recommended for each specific cluster group to avoid recurrence of similar accidents. The accident analysis would be vital for industry practitioners to enhance the safety performance of public sector projects. This study contributes to filling the knowledge gap of how and why E&M accidents occur and promulgating preventive measures for E&M accidents which have been under researched. PMID:29534429

  16. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing one or a few spectra at a time, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
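
    The batch-integration idea described above, applying the same integration regions to many spectra and exporting the results for a spreadsheet or Matlab, can be sketched briefly. ImatraNMR itself is a Java program; the sketch below is only an illustration of the workflow, with synthetic arrays standing in for processed spectra and arbitrary region limits:

    ```python
    import csv
    import numpy as np

    def integrate_regions(ppm, intensity, regions):
        """Simple rectangular integration of each (low, high) ppm region of one spectrum."""
        dx = ppm[1] - ppm[0]                 # uniform ppm axis assumed
        return [float(np.sum(intensity[(ppm >= lo) & (ppm <= hi)]) * dx)
                for lo, hi in regions]

    # Synthetic 1D spectra sharing a common ppm axis, with Gaussian "signals".
    ppm = np.linspace(0.0, 10.0, 4000)

    def peak(center, height, width=0.05):
        return height * np.exp(-((ppm - center) / width) ** 2)

    spectra = {"sample_A": peak(2.1, 1.0) + peak(7.3, 0.5),
               "sample_B": peak(2.1, 0.8) + peak(7.3, 0.9)}

    regions = [(1.9, 2.3), (7.1, 7.5)]       # integration regions in ppm

    with open("integrals.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["spectrum"] + [f"{lo}-{hi} ppm" for lo, hi in regions])
        for name, intensity in spectra.items():
            writer.writerow([name] + integrate_regions(ppm, intensity, regions))
    ```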

  17. Indianapolis Fire Department EMS Communications Center tracking truck, train HAZMAT cargoes with "Operation Respond" software

    DOT National Transportation Integrated Search

    1997-06-05

    When an accident involving the transportation of potentially dangerous materials occurs, local emergency response officials need accurate information about the material as quickly as possible. Using software donated to the Indianapolis Fire Departmen...

  18. Predictions of structural integrity of steam generator tubes under normal operating, accident, and severe accident conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Majumdar, S.

    1997-02-01

    Available models for predicting failure of flawed and unflawed steam generator tubes under normal operating, accident, and severe accident conditions are reviewed. Tests conducted in the past, though limited, tended to show that the earlier flow-stress model for part-through-wall axial cracks overestimated the damaging influence of deep cracks. This observation was confirmed by further tests at high temperatures, as well as by finite-element analysis. A modified correlation for deep cracks can correct this shortcoming of the model. Recent tests have shown that lateral restraint can significantly increase the failure pressure of tubes with unsymmetrical circumferential cracks. This observation was confirmed by finite-element analysis. The rate-independent flow stress models that are successful at low temperatures cannot predict the rate-sensitive failure behavior of steam generator tubes at high temperatures. Therefore, a creep rupture model for predicting failure was developed and validated by tests under various temperature and pressure loadings that can occur during postulated severe accidents.

  19. Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool.

    PubMed

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-07

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  20. ICC-CLASS: isotopically-coded cleavable crosslinking analysis software suite

    PubMed Central

    2010-01-01

    Background: Successful application of crosslinking combined with mass spectrometry for studying proteins and protein complexes requires specifically-designed crosslinking reagents, experimental techniques, and data analysis software. Using isotopically-coded ("heavy and light") versions of the crosslinker and cleavable crosslinking reagents is analytically advantageous for mass spectrometric applications and provides a "handle" that can be used to distinguish crosslinked peptides of different types, and to increase the confidence of the identification of the crosslinks. Results: Here, we describe a program suite designed for the analysis of mass spectrometric data obtained with isotopically-coded cleavable crosslinkers. The suite contains three programs called: DX, DXDX, and DXMSMS. DX searches the mass spectra for the presence of ion signal doublets resulting from the light and heavy isotopic forms of the isotopically-coded crosslinking reagent used. DXDX searches for possible mass matches between cleaved and uncleaved isotopically-coded crosslinks based on the established chemistry of the cleavage reaction for a given crosslinking reagent. DXMSMS assigns the crosslinks to the known protein sequences, based on the isotopically-coded and un-coded MS/MS fragmentation data of uncleaved and cleaved peptide crosslinks. Conclusion: The combination of these three programs, which are tailored to the analytical features of the specific isotopically-coded cleavable crosslinking reagents used, represents a powerful software tool for automated high-accuracy peptide crosslink identification. See: http://www.creativemolecules.com/CM_Software.htm PMID:20109223
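
    The doublet search performed by DX, looking for pairs of peaks separated by the fixed mass difference between the light and heavy forms of the crosslinker, can be sketched simply. The peak list and the 8 Da spacing below are invented for illustration (for multiply charged ions the spacing would be the mass difference divided by the charge):

    ```python
    def find_isotope_doublets(mz_values, delta_mass=8.0, tol=0.01):
        """Return pairs of m/z values whose spacing matches the light/heavy mass difference."""
        mz = sorted(mz_values)
        pairs = []
        for i, light in enumerate(mz):
            for heavy in mz[i + 1:]:
                if heavy - light > delta_mass + tol:
                    break                    # list is sorted; no later peak can match
                if abs((heavy - light) - delta_mass) <= tol:
                    pairs.append((light, heavy))
        return pairs

    # Invented singly charged peak list containing two ~8 Da doublets.
    peaks = [501.27, 509.27, 720.40, 728.41, 815.33]
    print(find_isotope_doublets(peaks))
    ```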

  1. A Simplified Mesh Deformation Method Using Commercial Structural Analysis Software

    NASA Technical Reports Server (NTRS)

    Hsu, Su-Yuen; Chang, Chau-Lyan; Samareh, Jamshid

    2004-01-01

    Mesh deformation in response to redefined or moving aerodynamic surface geometries is a frequently encountered task in many applications. Most existing methods are either mathematically too complex or computationally too expensive for usage in practical design and optimization. We propose a simplified mesh deformation method based on linear elastic finite element analyses that can be easily implemented by using commercially available structural analysis software. Using a prescribed displacement at the mesh boundaries, a simple structural analysis is constructed based on a spatially varying Young's modulus to move the entire mesh in accordance with the surface geometry redefinitions. A variety of surface movements, such as translation, rotation, or incremental surface reshaping that often takes place in an optimization procedure, may be handled by the present method. We describe the numerical formulation and implementation using the NASTRAN software in this paper. The use of commercial software bypasses tedious reimplementation and takes advantage of the computational efficiency offered by the vendor. A two-dimensional airfoil mesh and a three-dimensional aircraft mesh were used as test cases to demonstrate the effectiveness of the proposed method. Euler and Navier-Stokes calculations were performed for the deformed two-dimensional meshes.
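
    The underlying idea, propagating prescribed boundary displacements into the interior through a stiffness field, can be illustrated far more simply than the NASTRAN model used in the paper. The sketch below solves a 1-D chain of linear springs whose stiffness varies in space; this is only an analogy to the spatially varying Young's modulus approach, not the authors' finite-element implementation:

    ```python
    import numpy as np

    def deform_1d_mesh(x, wall_displacement, stiffness):
        """Move interior nodes of a 1-D mesh by solving a spring-chain equilibrium,
        with the first node displaced by wall_displacement and the last node fixed."""
        n = len(x)
        k = stiffness(x)                     # one stiffness per element, length n - 1
        A = np.zeros((n - 2, n - 2))         # tridiagonal system for interior nodes
        b = np.zeros(n - 2)
        for i in range(1, n - 1):
            A[i - 1, i - 1] = k[i - 1] + k[i]
            if i >= 2:
                A[i - 1, i - 2] = -k[i - 1]
            if i < n - 2:
                A[i - 1, i] = -k[i]
        b[0] = k[0] * wall_displacement      # prescribed displacement at node 0
        u = np.zeros(n)
        u[0] = wall_displacement
        u[1:-1] = np.linalg.solve(A, b)      # last node keeps u = 0
        return x + u

    def stiff(nodes):
        # Stiffer elements near x = 0 deform less, preserving the fine spacing there.
        mid = 0.5 * (nodes[:-1] + nodes[1:])
        return 1.0 / (0.05 + mid)

    x = np.linspace(0.0, 1.0, 11) ** 2       # nodes clustered near the moving "surface"
    print(deform_1d_mesh(x, wall_displacement=0.05, stiffness=stiff))
    ```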

  2. A Pedagogical Software for the Analysis of Loudspeaker Systems

    ERIC Educational Resources Information Center

    Pueo, B.; Roma, M.; Escolano, J.; Lopez, J. J.

    2009-01-01

    In this paper, pedagogical software for the design and analysis of loudspeaker systems is presented, with emphasis on training students in the interaction between system parameters. Loudspeakers are complex electromechanical systems whose behavior is neither intuitive nor easy to understand for inexperienced students. Although commercial…

  3. WinDAM C earthen embankment internal erosion analysis software

    USDA-ARS?s Scientific Manuscript database

    Two primary causes of dam failure are overtopping and internal erosion. For the purpose of evaluating dam safety for existing earthen embankment dams and proposed earthen embankment dams, Windows Dam Analysis Modules C (WinDAM C) software will simulate either internal erosion or erosion resulting f...

  4. Molecular structures and thermodynamic properties of monohydrated gaseous iodine compounds: Modelling for severe accident simulation

    NASA Astrophysics Data System (ADS)

    Sudolská, Mária; Cantrel, Laurent; Budzák, Šimon; Černušák, Ivan

    2014-03-01

    Monohydrated complexes of iodine species (I, I2, HI, and HOI) have been studied by correlated ab initio calculations. The standard enthalpies of formation, the Gibbs free energies, and the temperature dependence of the heat capacities at constant pressure were calculated. The values obtained have been implemented in the ASTEC nuclear accident simulation software to check the thermodynamic stability of hydrated iodine compounds in the reactor coolant system and in the nuclear containment building of a pressurised water reactor during a severe accident. It can be concluded that the hydrated iodine complexes are thermodynamically unstable, as indicated by positive Gibbs free energies, and would be present only at trace concentrations under severe accident conditions; it is thus well justified to consider only pure iodine species and not their hydrated forms.

  5. Major accident prevention through applying safety knowledge management approach.

    PubMed

    Kalatpour, Omid

    2016-01-01

    Many scattered knowledge resources are available for chemical accident prevention purposes. The common approach to managing process safety, which relies on databases and reference to the available knowledge, has some drawbacks. The main goal of this article was to devise a new knowledge base (KB) for the chemical accident prevention domain. The scattered sources of safety knowledge were identified and scanned, and the collected knowledge was then formalized through a computerized program. The Protégé software was used to formalize and represent the stored safety knowledge, from which domain knowledge, data, and information could be retrieved. This approach improved the safety and health knowledge management (KM) process and resolved some typical problems in it. Upgrading traditional safety databases into KBs can improve the interaction between users and the knowledge repository.

  6. System analysis with improved thermo-mechanical fuel rod models for modeling current and advanced LWR materials in accident scenarios

    NASA Astrophysics Data System (ADS)

    Porter, Ian Edward

    A nuclear reactor systems code has the ability to model the system response in an accident scenario based on known initial conditions at the onset of the transient. However, there has been a tendency for these codes to lack the detailed thermo-mechanical fuel rod response models needed for accurate prediction of fuel rod failure. This proposed work will couple today's most widely used steady-state (FRAPCON) and transient (FRAPTRAN) fuel rod models with the systems code TRACE for best-estimate modeling of system response in accident scenarios such as a loss of coolant accident (LOCA). In doing so, code modifications will be made to model gamma heating in LWRs during steady-state and accident conditions and to improve fuel rod thermal/mechanical analysis by allowing axial nodalization of burnup-dependent phenomena such as swelling, cladding creep and oxidation. With the ability to model both burnup-dependent parameters and transient fuel rod response, a fuel dispersal study will be conducted using a hypothetical accident scenario under both PWR and BWR conditions to determine the amount of fuel dispersed under varying conditions. Because fuel fragmentation size and internal rod pressure both depend on burnup, this analysis will be conducted at beginning, middle and end of cycle to examine the effect that cycle time can have on fuel rod failure and dispersal. Current fuel rod and system codes used by the Nuclear Regulatory Commission (NRC) are compilations of legacy codes with only commonly used light water reactor materials, Uranium Dioxide (UO2), Mixed Oxide (U/PuO2) and zirconium alloys. However, the accidents at Fukushima Daiichi and Three Mile Island have shown the need for exploration into advanced materials possessing improved accident tolerance. This work looks to further modify the NRC codes to include silicon carbide (SiC), an advanced cladding material proposed by current DOE funded research on accident tolerant fuels (ATF). Several

  7. Paragliding accidents in remote areas.

    PubMed

    Fasching, G; Schippinger, G; Pretscher, R

    1997-08-01

    Paragliding is an increasingly popular hobby, as people try to find new and more adventurous activities. However, there is an increased and inherent danger with this sport. For this reason, as well as the inexperience of many operators, injuries occur frequently. This retrospective study centers on the helicopter rescue of 70 individuals in paragliding accidents. All histories were examined, and 43 patients answered a questionnaire. Nineteen (42%) pilots were injured when taking off, 20 (44%) during the flight, and six (13%) when landing. Routine and experience did not affect the prevalence of accident. Analysis of the causes of accident revealed pilot errors in all but three cases. In 34 rescue operations a landing of the helicopter near the site of the accident was possible. Half of the patients had to be rescued by a cable winch or a long rope fixed to the helicopter. Seven (10%) of the pilots suffered multiple trauma, 38 (54%) had injuries of the lower extremities, and 32 (84%) of them sustained fractures. Injuries to the spine were diagnosed in 34 cases with a fracture rate of 85%. One patient had an incomplete paraplegia. Injuries to the head occurred in 17 patients. No paraglider pilot died. The average hospitalization was 22 days, and average time of working inability was 14 weeks. Fourteen (34%) patients suffered from a permanent damage to their nerves or joints. Forty-three percent of the paragliders continued their sport despite the accident; two of them had another accident. An improved training program is necessary to lower the incidence of paragliding accidents. Optimal equipment to reduce injuries in case of accidents is mandatory. The helicopter emergency physician must perform a careful examination, provide stabilization of airways and circulation, give analgesics, splint fractured extremities, and transport the victim on a vacuum mattress to the appropriate hospital.

  8. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
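
    As an illustration of the graph analysis in step 3, the sketch below enumerates simple paths from hazard sources to vulnerable entities with the networkx library. It is a minimal sketch only: the component names and edges are hypothetical and are not taken from the NASA abort-system case study or the integrated tool set itself.

        # Minimal sketch of reachability analysis from hazard sources to vulnerable
        # entities using networkx. The component names and edges are hypothetical
        # illustrations, not taken from the NASA abort-system case study.
        import networkx as nx

        g = nx.DiGraph()
        # Edges represent "can propagate a hazard to" relations between modeled parts.
        g.add_edges_from([
            ("propellant_leak", "engine_compartment"),
            ("engine_compartment", "abort_controller"),
            ("sensor_fault", "abort_controller"),
            ("abort_controller", "abort_function"),
        ])

        hazard_sources = ["propellant_leak", "sensor_fault"]
        vulnerable_entities = ["abort_function"]

        # Enumerate every simple path from a hazard source to a vulnerable entity;
        # each path is a candidate scenario for simulation and integration testing.
        for src in hazard_sources:
            for dst in vulnerable_entities:
                for path in nx.all_simple_paths(g, src, dst):
                    print(" -> ".join(path))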

  9. Comparative analysis of the countermeasures taken to mitigate exposure of the public to radioiodine following the Chernobyl and Fukushima accidents: lessons from both accidents

    PubMed Central

    Uyba, Vladimir; Samoylov, Alexander; Shinkarev, Sergey

    2018-01-01

    Abstract In the case of a severe radiation accident at a nuclear power station, the most important radiation hazard for the public is internal exposure of the thyroid to radioiodine. The purposes of this paper were (i) to compare countermeasures conducted (following the Chernobyl and Fukushima accidents) aimed at mitigation of exposure to the thyroid for the public, (ii) to present comparative estimates of doses to the thyroid and (iii) to derive lessons from the two accidents. The scale and time of countermeasures applied in the early phase of the accidents (sheltering, evacuation, and intake of stable iodine to block the thyroid) and at a later time (control of 131I concentration in foodstuffs) have been described. After the Chernobyl accident, the estimation of the thyroid doses for the public was mainly based on direct thyroid measurements of ~400 000 residents carried out within the first 2 months. The highest estimates of thyroid doses to children reached 50 Gy. After the Fukushima accident, the estimation of thyroid doses was based on radioecological models due to a lack of direct thyroid measurements (only slightly more than 1000 residents were measured). The highest estimates of thyroid doses to children were a few hundred mGy. Following the Chernobyl accident, ingestion of 131I through cows’ milk was the dominant pathway. Following the Fukushima accident, it appears that inhalation of contaminated air was the dominant pathway. Some lessons learned following the Chernobyl and Fukushima accidents have been presented in this paper. PMID:29415268

  10. Analysis on the Role of RSG-GAS Pool Cooling System during Partial Loss of Heat Sink Accident

    NASA Astrophysics Data System (ADS)

    Susyadi; Endiah, P. H.; Sukmanto, D.; Andi, S. E.; Syaiful, B.; Hendro, T.; Geni, R. S.

    2018-02-01

    RSG-GAS is a 30 MW reactor that is mostly used for radioisotope production and experimental activities. Recently, it has regularly been operated at half of its capacity for efficiency reasons. During an accident, especially a loss of heat sink, the role of its pool cooling system is very important for dumping decay heat. An analysis using a single-failure approach and partial RELAP5 modeling performed by S. Dibyo (2010) shows that there is no significant increase in the coolant temperature if this system functions properly. However, lessons learned from the Fukushima accident revealed that an accident can happen due to multiple failures. Considering the ageing of the reactor, in this research the role of the pool cooling system is investigated for a partial loss of heat sink accident in which, at the same time, the protection system fails to scram the reactor while it is being operated at 15 MW. The purpose is to clarify the transient characteristics and the final state of the coolant temperature. The method used is simulation of the system in the RELAP5 code. Calculation results show that the pool cooling system reduces the coolant temperature by about 1 K compared with not activating it. The results also reveal that when the reactor is being operated at half of its rated power, it remains in a safe condition during a partial loss of heat sink accident without scram.

  11. Software and package applicating for network meta-analysis: A usage-based comparative study.

    PubMed

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to work through a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, covering both programming and non-programming software. They were developed mainly on the basis of Bayesian or frequentist theory. Most types of software are easy to operate, easy to master, calculate accurately, or produce excellent graphics. However, no single type of software performed accurate calculations with superior graphing; this could only be achieved through the combination of two or more types of software. This study suggests that users should choose the appropriate software according to their programming background, operational habits, and budget. A combination of BUGS and R (or Stata) can then be considered for performing the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
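
    None of the reviewed packages is reproduced here, but the core indirect-comparison identity that network meta-analysis builds on can be illustrated in a few lines. The sketch below applies the Bucher adjusted indirect comparison in Python, assuming made-up effect estimates (log odds ratios) and standard errors for treatments A and C, each compared with a common comparator B.

        # Minimal sketch of the indirect-comparison identity underlying network
        # meta-analysis (Bucher method): an A-vs-C effect estimated through a common
        # comparator B. The effect sizes and standard errors are made-up numbers.
        import math

        # Direct estimates (e.g., log odds ratios) and their standard errors.
        d_AB, se_AB = 0.40, 0.15   # A vs B
        d_CB, se_CB = 0.10, 0.20   # C vs B

        # Indirect estimate of A vs C and its standard error (variances add).
        d_AC = d_AB - d_CB
        se_AC = math.sqrt(se_AB**2 + se_CB**2)

        # 95% confidence interval under a normal approximation.
        lo, hi = d_AC - 1.96 * se_AC, d_AC + 1.96 * se_AC
        print(f"A vs C: {d_AC:.2f} (95% CI {lo:.2f} to {hi:.2f})")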

  12. Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package

    NASA Astrophysics Data System (ADS)

    Cheng, L.; AghaKouchak, A.; Gilleland, E.

    2013-12-01

    Numerous studies show that climatic extremes have increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and nonstationary assumptions. The Nonstationary Extreme Value Analysis package (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC), which combines the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach, and has advantages in simplicity, speed of calculation, and convergence over conventional MCMC. NEVA also provides confidence intervals and uncertainty bounds for estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization, and visualization, and is explicitly designed to facilitate the analysis of extremes in the geosciences. The generalized input and output files of this software package make it attractive for users from across different fields. Both stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
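
    NEVA itself is a MATLAB package built around Bayesian DE-MC sampling, which is not reproduced here. As a minimal, stationary illustration of what a 'return level for a given return period' means, the sketch below fits a GEV distribution to synthetic block maxima with SciPy and evaluates return levels; the data and parameter values are made up.

        # Minimal stationary illustration of the return-level concept analyzed by NEVA.
        # This sketch only shows a maximum-likelihood GEV fit and return levels with
        # SciPy; NEVA's Bayesian DE-MC machinery is not reproduced here.
        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(42)
        annual_maxima = genextreme.rvs(c=-0.1, loc=50.0, scale=10.0,
                                       size=60, random_state=rng)  # synthetic data

        # Fit the GEV distribution to the block maxima.
        shape, loc, scale = genextreme.fit(annual_maxima)

        # The T-year return level is the value exceeded with probability 1/T per year.
        for T in (10, 50, 100):
            level = genextreme.isf(1.0 / T, shape, loc, scale)
            print(f"{T:>3}-year return level: {level:.1f}")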

  13. Report: Scientific Software.

    ERIC Educational Resources Information Center

    Borman, Stuart A.

    1985-01-01

    Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)

  14. Software For Computing Reliability Of Other Software

    NASA Technical Reports Server (NTRS)

    Nikora, Allen; Antczak, Thomas M.; Lyu, Michael

    1995-01-01

    Computer Aided Software Reliability Estimation (CASRE) computer program developed for use in measuring reliability of other software. Easier for non-specialists in reliability to use than many other currently available programs developed for same purpose. CASRE incorporates mathematical modeling capabilities of public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in Windows software environment. Provides menu-driven command interface; enabling and disabling of menu options guides user through (1) selection of set of failure data, (2) execution of mathematical model, and (3) analysis of results from model. Written in C language.
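
    The abstract does not list the individual models, so the sketch below simply illustrates the kind of reliability-growth fit such tools perform: a Goel-Okumoto NHPP mean value function m(t) = a(1 - exp(-b t)) fitted to made-up cumulative failure counts with SciPy. It is an assumption-laden example, not CASRE or SMERFS code.

        # Minimal sketch of fitting one classic software reliability growth model,
        # the Goel-Okumoto NHPP with mean value function m(t) = a * (1 - exp(-b t)),
        # to cumulative failure counts. The failure data below are made-up numbers.
        import numpy as np
        from scipy.optimize import curve_fit

        test_weeks = np.arange(1, 11, dtype=float)
        cumulative_failures = np.array([12, 21, 28, 34, 38, 41, 44, 46, 47, 48], float)

        def goel_okumoto(t, a, b):
            """Expected cumulative number of failures by time t."""
            return a * (1.0 - np.exp(-b * t))

        (a_hat, b_hat), _ = curve_fit(goel_okumoto, test_weeks, cumulative_failures,
                                      p0=(50.0, 0.3))
        print(f"estimated total faults a = {a_hat:.1f}, detection rate b = {b_hat:.3f}")

        # Expected residual faults after 10 weeks of testing.
        print(f"expected remaining faults: {a_hat - goel_okumoto(10.0, a_hat, b_hat):.1f}")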

  15. A summary of unmanned aircraft accident/incident data : human factors implications

    DOT National Transportation Integrated Search

    2004-12-01

    A review and analysis of unmanned aircraft (UA) accident data was conducted to identify important human factors issues related to their use. UA accident data were collected from the U.S. Army, Navy, and Air Force. Classification of the accident data ...

  16. Accident investigation

    NASA Technical Reports Server (NTRS)

    Laynor, William G. Bud

    1987-01-01

    The National Transportation Safety Board (NTSB) has identified wind shear as a cause or contributing factor in 15 accidents involving transport-category airplanes since 1970. Nine of these were nonfatal, but the other six accounted for 440 lives. Five of the fatal accidents and seven of the nonfatal accidents involved encounters with convective downbursts or microbursts. Of the other accidents, two nonfatal ones were encounters with frontal system shear, and one fatal accident was the result of a terrain-induced wind shear. These accidents are discussed with reference to helping the pilot avoid the wind shear or, if avoidance is impossible, to fly through it.

  17. FloWave.US: validated, open-source, and flexible software for ultrasound blood flow analysis.

    PubMed

    Coolbaugh, Crystal L; Bush, Emily C; Caskey, Charles F; Damon, Bruce M; Towse, Theodore F

    2016-10-01

    Automated software improves the accuracy and reliability of blood velocity, vessel diameter, blood flow, and shear rate ultrasound measurements, but existing software offers limited flexibility to customize and validate analyses. We developed FloWave.US, open-source software to automate ultrasound blood flow analysis, and demonstrated the validity of its blood velocity (aggregate relative error, 4.32%) and vessel diameter (0.31%) measures with a skeletal muscle ultrasound flow phantom. Compared with a commercial, manual analysis software program, FloWave.US produced equivalent in vivo cardiac cycle time-averaged mean (TAMean) velocities at rest and following a 10-s muscle contraction (mean bias <1 pixel for both conditions). Automated analysis of ultrasound blood flow data was 9.8 times faster than the manual method. Finally, a case study of a lower extremity muscle contraction experiment highlighted the ability of FloWave.US to measure small fluctuations in TAMean velocity, vessel diameter, and mean blood flow at specific time points in the cardiac cycle. In summary, the collective features of our newly designed software (accuracy, reliability, reduced processing time, cost-effectiveness, and flexibility) offer advantages over existing proprietary options. Further, public distribution of FloWave.US allows researchers to easily access and customize code to adapt ultrasound blood flow analysis to a variety of vascular physiology applications. Copyright © 2016 the American Physiological Society.
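
    The quantities named above can be illustrated with a short calculation. The sketch below computes a time-averaged mean (TAMean) velocity over one cardiac cycle and the volumetric flow implied by a measured vessel diameter; the waveform and diameter are synthetic, and this is not FloWave.US code.

        # Minimal sketch of the quantities named above: time-averaged mean (TAMean)
        # velocity over a cardiac cycle and the volumetric flow it implies for a
        # measured vessel diameter. The waveform and diameter are made up.
        import numpy as np

        t = np.linspace(0.0, 1.0, 200)                       # one cardiac cycle, s
        mean_velocity = 8.0 + 25.0 * np.clip(np.sin(2 * np.pi * t), 0, None)  # cm/s
        diameter_cm = 0.45                                    # vessel diameter, cm

        tamean = mean_velocity.mean()                         # uniform sampling, cm/s
        area_cm2 = np.pi * (diameter_cm / 2.0) ** 2           # cross-sectional area
        flow_ml_per_min = tamean * area_cm2 * 60.0            # 1 cm^3 = 1 mL

        print(f"TAMean velocity: {tamean:.1f} cm/s, blood flow: {flow_ml_per_min:.0f} mL/min")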

  18. Data analysis and software support for the Earth radiation budget experiment

    NASA Technical Reports Server (NTRS)

    Edmonds, W.; Natarajan, S.

    1987-01-01

    Computer programming and data analysis efforts were performed in support of the Earth Radiation Budget Experiment (ERBE) at NASA/Langley. A brief description of the ERBE followed by sections describing software development and data analysis for both prelaunch and postlaunch instrument data are presented.

  19. Development of the Free-space Optical Communications Analysis Software (FOCAS)

    NASA Technical Reports Server (NTRS)

    Jeganathan, M.; Mecherle, G.; Lesh, J.

    1998-01-01

    The Free-space Optical Communications Analysis Software (FOCAS) was developed at the Jet Propulsion Laboratory (JPL) to provide mission planners, systems engineers and communications engineers with an easy-to-use tool to analyze optical communications links.
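
    The abstract gives no implementation detail, so the sketch below only illustrates the kind of link analysis such a tool performs: a generic free-space optical link budget using the standard aperture-gain and space-loss approximations. All parameter values are made up and nothing here is taken from FOCAS itself.

        # Minimal sketch of a generic free-space optical link budget; the parameter
        # values are made up and the gain/space-loss expressions are the standard
        # textbook approximations, not FOCAS internals.
        import math

        wavelength = 1064e-9        # m
        tx_power_w = 1.0            # transmit laser power, W
        tx_diameter = 0.1           # transmit aperture, m
        rx_diameter = 1.0           # receive aperture, m
        range_m = 1.0e9             # link distance, m (made-up example)
        optics_efficiency = 0.5     # lumped transmit/receive optical losses

        def db(x):
            return 10.0 * math.log10(x)

        gain_tx = (math.pi * tx_diameter / wavelength) ** 2
        gain_rx = (math.pi * rx_diameter / wavelength) ** 2
        space_loss = (wavelength / (4.0 * math.pi * range_m)) ** 2

        rx_power_w = tx_power_w * optics_efficiency * gain_tx * space_loss * gain_rx
        print(f"received power: {db(rx_power_w / 1e-3):.1f} dBm")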

  20. Genitourinary injuries after traffic accidents: Analysis of a registry of 162,690 victims.

    PubMed

    Terrier, Jean-Etienne; Paparel, Philippe; Gadegbeku, Blandine; Ruffion, Alain; Jenkins, Lawrence C; N'Diaye, Amina

    2017-06-01

    Traffic accidents are the most frequent cause of genitourinary injuries (GUI). Kidney injuries after trauma have been well described. However, there exists a paucity of data on other traumatic GUI after traffic accidents. The objective of this study was to analyze the frequency and type of all GUI, by user category, after traffic accidents. Patient cases were extracted from the trauma registry of the French department of Rhone from 1996 to 2013. We assessed the urogenital injuries sustained by each category of road user. Injury severity was coded with the Abbreviated Injury Scale and the Injury Severity Score. Kidney trauma was mapped with the classification of the American Association for the Surgery of Trauma. Multivariate prediction models were used for analysis of the data. Of 162,690 victims, 963 (0.59%) presented with GUI; 47% were motorcyclists, 22% were in a car, 18% on bicycles, and 9% were pedestrians. The most common organ injury was kidney (41%) followed by testicular (23%). Among the 208 motorists with a GUI, kidney (70%), bladder (10%), and adrenal gland (9%) were the most frequent lesions. Among the 453 motorcyclist victims with GUI, kidney (35%) and testicular (38%) traumas were the most frequent, and 62% of injuries involved external genitalia. There were 175 cyclists with GUI; 70% of injuries involved external genitalia, and penile traumas (23%) were the most frequent. In total, there were 395 kidney injuries, most being low grade. According to the American Association for the Surgery of Trauma classification, kidney injuries were grade I, 59%; grade II, 11%; grade III, 16%; grade IV, 9%; grade V, 3%; and indeterminate, 2%. GUI are infrequent after traffic accidents, with the kidneys being the most commonly injured organ. Physicians must maintain a high awareness for external genitalia injuries in motorcyclists and cyclists. Prognostic and epidemiologic study, level III.

  1. Automated sequence analysis and editing software for HIV drug resistance testing.

    PubMed

    Struck, Daniel; Wallis, Carole L; Denisov, Gennady; Lambert, Christine; Servais, Jean-Yves; Viana, Raquel V; Letsoalo, Esrom; Bronze, Michelle; Aitken, Sue C; Schuurman, Rob; Stevens, Wendy; Schmit, Jean Claude; Rinke de Wit, Tobias; Perez Bercoff, Danielle

    2012-05-01

    Access to antiretroviral treatment in resource-limited settings is inevitably paralleled by the emergence of HIV drug resistance. Monitoring treatment efficacy and HIV drug resistance testing are therefore of increasing importance in resource-limited settings. Yet low-cost technologies and procedures suited to the particular context and constraints of such settings are still lacking. The ART-A (Affordable Resistance Testing for Africa) consortium brought together public and private partners to address this issue. The objective was to develop automated sequence analysis and editing software to support high-throughput automated sequencing. The ART-A Software was designed to automatically process and edit ABI chromatograms or FASTA files from HIV-1 isolates. The ART-A Software performs the basecalling, assigns quality values, aligns query sequences against a set reference, infers a consensus sequence, identifies the HIV type and subtype, translates the nucleotide sequence to amino acids, and reports insertions/deletions, premature stop codons, ambiguities and mixed calls. The results can be automatically exported to Excel to identify mutations. Automated analysis was compared to manual analysis using a panel of 1624 PR-RT sequences generated in 3 different laboratories. Discrepancies between manual and automated sequence analysis were 0.69% at the nucleotide level and 0.57% at the amino acid level (668,047 AA analyzed), and discordances at major resistance mutations were recorded in 62 cases (4.83% of differences, 0.04% of all AA) for PR and in 171 cases (6.18% of differences, 0.03% of all AA) for RT. The ART-A Software is a time-sparing tool for pre-analyzing HIV and viral quasispecies sequences in high-throughput laboratories and highlighting positions requiring attention. Copyright © 2012 Elsevier B.V. All rights reserved.
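
    ART-A itself is not reproduced here, but two of the processing steps it lists (translating the nucleotide sequence to amino acids and flagging premature stop codons) can be illustrated with Biopython. The toy sequence in the sketch below is invented for illustration.

        # Minimal sketch of two of the processing steps listed above, translation of
        # a nucleotide sequence to amino acids and flagging of premature stop codons,
        # using Biopython. The toy sequence is made up; this is not the ART-A code.
        from Bio.Seq import Seq

        nucleotides = Seq("ATGAAATGATTTGGC")     # toy reading frame with an early stop
        protein = nucleotides.translate()         # standard genetic code

        stop_positions = [i for i, aa in enumerate(protein) if aa == "*"]
        premature = [p for p in stop_positions if p < len(protein) - 1]

        print(f"protein: {protein}")
        if premature:
            print(f"premature stop codon(s) at amino-acid position(s): {premature}")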

  2. Road accidents caused by drivers falling asleep.

    PubMed

    Sagberg, F

    1999-11-01

    About 29,600 Norwegian accident-involved drivers received a questionnaire about the last accident reported to their insurance company. About 9,200 drivers (31%) returned the questionnaire. The questionnaire contained questions about sleep or fatigue as contributing factors to the accident. In addition, the drivers reported whether or not they had fallen asleep at some time whilst driving, and what the consequences had been. Sleep or drowsiness was a contributing factor in 3.9% of all accidents, as reported by drivers who were at fault for the accident. This factor was strongly over-represented in night-time accidents (18.6%), in running-off-the-road accidents (8.3%), accidents after driving more than 150 km on one trip (8.1%), and personal injury accidents (7.3%). A logistic regression analysis showed that the following additional factors made significant and independent contributions to increasing the odds of sleep involvement in an accident: dry road, high speed limit, driving one's own car, not driving the car daily, high education, and few years of driving experience. More male than female drivers were involved in sleep-related accidents, but this seems largely to be explained by males driving relatively more than females on roads with high speed limits. A total of 10% of male drivers and 4% of females reported having fallen asleep while driving during the last 12 months. A total of 4% of these events resulted in an accident. The most frequent consequence of falling asleep--amounting to more than 40% of the reported incidents--was crossing of the right edge-line before awaking, whereas crossing of the centreline was reported by 16%. Drivers' lack of awareness of important precursors of falling asleep--like highway hypnosis, driving without awareness, and similar phenomena--as well as a reluctance to discontinue driving despite feeling tired are pointed out as likely contributors to sleep-related accidents. More knowledge about the drivers' experiences immediately
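
    The study's data are not available here, but the kind of logistic regression it reports (odds of sleep involvement as a function of crash characteristics) can be sketched with statsmodels on simulated data. The covariates and effect sizes below are made up and are not the study's estimates.

        # Minimal sketch of a logistic regression for the odds of sleep involvement
        # in an accident, fitted with statsmodels on simulated data; the variables
        # and effects are made up and are not the study's estimates.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 5000
        df = pd.DataFrame({
            "dry_road": rng.binomial(1, 0.6, n),
            "high_speed_limit": rng.binomial(1, 0.4, n),
            "night": rng.binomial(1, 0.2, n),
        })
        logit_p = -3.5 + 0.5 * df.dry_road + 0.8 * df.high_speed_limit + 1.5 * df.night
        df["sleep_related"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

        X = sm.add_constant(df[["dry_road", "high_speed_limit", "night"]])
        model = sm.Logit(df["sleep_related"], X).fit(disp=False)

        # Odds ratios for each factor.
        print(np.exp(model.params))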

  3. [An analysis of 148 outpatient treated occupational accidents].

    PubMed

    Nicaeus, T; Erb, C; Rohrbach, M; Thiel, H J

    1996-10-01

    The most common eye injuries are non-perforating. Eye injuries in the workplace are a major cause of socioeconomic damage, morbidity and disability, despite well publicised standards for industrial eye protection. This study investigates the epidemiological and clinical aspects of 148 occupational cases. At the University Eye Clinic of Tübingen, 709 non-perforating eye injuries were registered as occupational accidents between 1995 and 1996. Of these cases, 148 were selected at random and analysed retrospectively. The 5 most common injuries among the 148 patients (m/f = 138/10; mean age 33.4 +/- 12 years) were corneal foreign body injuries (35%), chemical burns (15.5%), sub-conjunctival foreign bodies (12%), thermal/ultraviolet injuries (11%) and contusions (7.4%). Of these patients, 22.3% were employed as construction workers and 16.2% as metal workers. At the time of examination the visual acuity of the injured eye was 0.9 +/- 0.3. The interval between the beginning of work and the accident was 6.2 +/- 6.4 hours on average (0.5-13.5 h). Of all accidents, 8.5% occurred during the first hour of work; in contrast, 45.5% occurred after 6 hours of work. Another 12.4 +/- 14.5 hours (5 min-72 h; median 7 h) passed before the patients arrived for eye examination at the Eye Clinic of Tübingen. Only 6% of all patients arrived within the first hour, and 29.7% after 12 hours. Of all cases, 30.4% received first-aid treatment in their company by the factory doctor or by an eye doctor before examination at the Eye Clinic. Only 6.8% of all patients wore protective spectacles during work. Incapacity for work was seen in 30.4%; the average duration was 5.5 +/- 10 days. Despite the late examination at the Eye Clinic, the functional loss was mostly minor except after chemical burns. Nevertheless, most occupational accidents can be avoided with better protective devices in order to reduce the incidence of injuries and socioeconomic damage. Therefore an intense campaign

  4. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    The increasing ship traffic and maritime transport of dangerous substances make it more difficult to significantly reduce the environmental, economic and social risks posed by potential spills, even though the safety rules are becoming more restrictive (ships with double hulls, etc.) and the surveillance systems are becoming more developed (VTS, AIS). In fact, the problems associated with spills are and will always be a main topic: spill events are continuously happening, most of them unknown to the general public because of their small-scale impact, but some of them (in a much smaller number) become authentic media phenomena in this information era, due to their large dimensions and their environmental and socio-economic impacts on ecosystems and local communities, and also due to some spectacular or shocking pictures generated. Hence, the adverse consequences posed by this type of accident increase the concern with avoiding them in the future, or minimizing their impacts, using not only surveillance and monitoring tools, but also an increased capacity to predict the fate and behaviour of bodies, objects, or substances in the hours following the accident - numerical models can now have a leading role in operational oceanography applied to safety and pollution response in the ocean because of their predictive potential. Search and rescue operations and risk analysis for oil, inert (ship debris or floating containers), and HNS (hazardous and noxious substances) spills are the main areas where models can be used. Model applications have been widely used in emergency or planning issues associated with pollution risks, and in contingency and mitigation measures. Before a spill, in the planning stage, modelling simulations are used in environmental impact studies, or risk maps, using historical data, reference situations, and typical scenarios. After a spill, the use of fast and simple modelling applications allows one to understand the fate and behaviour of the spilt

  5. Severe accident modeling of a PWR core with different cladding materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, S. C.; Henry, R. E.; Paik, C. Y.

    2012-07-01

    The MAAP v.4 software has been used to model two severe accident scenarios in nuclear power reactors with three different materials as fuel cladding. The TMI-2 severe accident was modeled with Zircaloy-2 and SiC as clad material, and an SBO accident in a Zion-like, 4-loop, Westinghouse PWR was modeled with Zircaloy-2, SiC, and 304 stainless steel as clad material. TMI-2 modeling results indicate that lower peak core temperatures, less H2(g) produced, and a smaller mass of molten material would result if SiC was substituted for Zircaloy-2 as cladding. SBO modeling results indicate that the calculated time to RCS rupture would increase by approximately 20 minutes if SiC was substituted for Zircaloy-2. Additionally, when an extended SBO accident (RCS creep rupture failure disabled) was modeled, significantly lower peak core temperatures, less H2(g) produced, and a smaller mass of molten material would be generated by substituting SiC for Zircaloy-2 or stainless steel cladding. Because the rate of the SiC oxidation reaction with elevated-temperature H2O(g) was set to 0 for this work, these results should be considered preliminary. However, the benefits of SiC as a more accident tolerant clad material have been shown, and additional investigation of SiC as an LWR core material is warranted, specifically investigations of the oxidation kinetics of SiC in H2O(g) over the range of temperatures and pressures relevant to severe accidents in LWRs. (authors)

  6. [Health care for aged victims of accidents and violence: analysis of SUS health services in Recife (PE, Brazil)].

    PubMed

    de Lima, Maria Luiza Carvalho; de Souza, Edinilsa Ramos; de Lima, Maria Luiza Lopes Timóteo; Barreira, Alice Kelly; Bezerra, Eduardo Duque; Acioli, Raquel Moura Lins

    2010-09-01

    A situational diagnosis of the health services regarding the care of aged victims of accidents and violence (AVAV) was carried out in Recife, Pernambuco, Brazil. The National Policy for Reducing Accident and Violence Related Morbidity and Mortality and the National Policy for Aged People's Health were used as references. The methodology was based on the triangulation method, with both quantitative and qualitative approaches. Questionnaires and interviews were answered by managers and health staff of hospital, prehospital and rehabilitation services, and by local aged health policy managers. In 2006, only the Family Health Program reported prehospital care for AVAV: 31 cases were due to violence and 18 to accidents. Hospital care for aged people was 7.2% of the total care, 27% from accidents and 10% from violence. In the same year, there was no record of rehabilitation care of AVAV. The directives of the policies studied are only partially followed. Health care is deficient in several aspects, such as clinical protocols, notification mechanisms, support for the aged, caregivers and aggressors, and continuous training. This analysis can contribute to the reorganization of the local health system, recognizing the aged person as vulnerable to accidents and violence.

  7. Assessment of medium-term cardiovascular disease risk after Japan’s 2011 Fukushima Daiichi nuclear accident: a retrospective analysis

    PubMed Central

    Nomura, Shuhei; Gilmour, Stuart; Oikawa, Tomoyoshi; Lee, Kiwon; Kiyabu, Grace Y; Shibuya, Kenji

    2017-01-01

    Objective To assess the medium-term indirect impact of the 2011 Fukushima Daiichi nuclear accident on cardiovascular disease (CVD) risks and to identify whether risk factors for CVD changed after the accident. Participants Residents aged 40 years and over participating in annual public health check-ups from 2009 to 2012, administered by Minamisoma city, located about 10 to 40 km from the Fukushima Daiichi nuclear plant. Methods The sex-specific Framingham CVD risk score was considered as the outcome measure and was compared before (2009–2010) and after the accident (2011–2012). A multivariate regression analysis was employed to evaluate risk factors for CVD. Results Data from 563 individuals (60.2% women) aged 40 to 74 years who participated in the check-ups throughout the study period was analysed. After adjusting for covariates, no statistically significant change was identified in the CVD risk score postaccident in both sexes, which may suggest no obvious medium-term health impact of the Fukushima nuclear accident on CVD risk. The risk factors for CVD and their magnitude and direction (positive/negative) did not change after the accident. Conclusions There was no obvious increase in CVD risks in Minamisoma city, which may indicate successful management of health risks associated with CVD in the study sample. PMID:29275343

  8. Analysis of stationary and dynamic factors affecting highway accident occurrence: A dynamic correlated grouped random parameters binary logit approach.

    PubMed

    Fountas, Grigorios; Sarwar, Md Tawfiq; Anastasopoulos, Panagiotis Ch; Blatt, Alan; Majka, Kevin

    2018-04-01

    Traditional accident analysis typically explores non-time-varying (stationary) factors that affect accident occurrence on roadway segments. However, the impact of time-varying (dynamic) factors is not thoroughly investigated. This paper seeks to simultaneously identify pre-crash stationary and dynamic factors of accident occurrence, while accounting for unobserved heterogeneity. Using highly disaggregate information for the potential dynamic factors, and aggregate data for the traditional stationary elements, a dynamic binary random parameters (mixed) logit framework is employed. With this approach, the dynamic nature of weather-related, and driving- and pavement-condition information is jointly investigated with traditional roadway geometric and traffic characteristics. To additionally account for the combined effect of the dynamic and stationary factors on accident occurrence, the developed random parameters logit framework allows for possible correlations among the random parameters. The analysis is based on crash and non-crash observations between 2011 and 2013, drawn from urban and rural highway segments in the state of Washington. The findings show that the proposed methodological framework can account for both stationary and dynamic factors affecting accident occurrence probabilities, for panel effects, for unobserved heterogeneity through the use of random parameters, and for possible correlation among the latter. The comparative evaluation among the correlated grouped random parameters, the uncorrelated random parameters logit models, and their fixed parameters logit counterpart demonstrates the potential of the random parameters modeling, in general, and the benefits of the correlated grouped random parameters approach, specifically, in terms of statistical fit and explanatory power. Published by Elsevier Ltd.
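
    The full correlated grouped random parameters model is not reproduced here. As a minimal sketch of the underlying idea, the code below estimates a binary logit with one normally distributed random coefficient by simulated maximum likelihood on synthetic data; the covariate, sample size, and true parameter values are all made up.

        # Minimal sketch of a binary logit with one normally distributed random
        # coefficient, estimated by simulated maximum likelihood on synthetic data.
        # This is an illustration of the general idea, not the paper's model.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        n, n_draws = 2000, 200
        x = rng.normal(size=n)                          # made-up dynamic covariate
        beta_i = rng.normal(0.8, 0.5, size=n)           # individual-specific coefficient
        p_true = 1.0 / (1.0 + np.exp(-(-1.0 + beta_i * x)))
        y = rng.binomial(1, p_true)

        draws = rng.standard_normal(n_draws)            # draws reused across iterations

        def neg_simulated_loglik(theta):
            const, b_mean, log_b_sd = theta
            b_draws = b_mean + np.exp(log_b_sd) * draws          # random coefficient draws
            util = const + np.outer(x, b_draws)                   # (n, n_draws) utilities
            p = 1.0 / (1.0 + np.exp(-util))
            lik = np.where(y[:, None] == 1, p, 1.0 - p).mean(axis=1)
            return -np.sum(np.log(lik + 1e-12))

        res = minimize(neg_simulated_loglik, x0=[0.0, 0.0, np.log(0.5)],
                       method="Nelder-Mead")
        print(f"constant {res.x[0]:.2f}, coefficient mean {res.x[1]:.2f}, "
              f"coefficient s.d. {np.exp(res.x[2]):.2f}")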

  9. Cost-effectiveness analysis of telemedical devices for pre-clinical traffic accident emergency rescue in Germany.

    PubMed

    Auerbach, H; Schreyögg, J; Busse, R

    2006-01-01

    The purpose of this study is to assess the cost-effectiveness (net costs per life year gained) of telemedical devices for pre-clinical traffic accident emergency rescue in Germany. Two equipment versions of a telemedical device are compared from a societal perspective with the baseline in Germany, i.e. the non-application of telemedicine in emergency rescues. The analysis is based on retrospective statistical data covering a period of 10 years with discounted costs not adjusted for inflation. Due to the uncertainty of data, certain assumptions and estimates were necessary. The outcome is measured in terms of "life years gained" by reducing therapy-free intervals and improvements in first-aid provided by laypersons. The introduction of the basic equipment version, "Automatic Accident Alert", is associated with net costs per life year gained of euro 247,977 (at baseline assumptions). The full equipment version of the telemedical device would lead to estimated net costs of euro 239,524 per life year gained. Multi-way sensitivity-analysis with best and worst case scenarios suggests that decreasing system costs would disproportionately reduce total costs, and that rapid market penetration would largely increase the system's benefit, while simultaneously reducing costs. The net costs per life year gained in the application of the two versions of the telemedical device for pre-clinical emergency rescue of traffic accidents are estimated as quite high. However, the implementation of the device as part of a larger European co-ordinated initiative is more realistic.

  10. Analysis software can put surgical precision into medical device design.

    PubMed

    Jain, S

    2005-11-01

    Use of finite element analysis software can give design engineers greater freedom to experiment with new designs and materials and allow companies to get products through clinical trials and onto the market faster. This article suggests how.

  11. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  12. Analysis of labour accidents in tunnel construction and introduction of prevention measures.

    PubMed

    Kikkawa, Naotaka; Itoh, Kazuya; Hori, Tomohito; Toyosawa, Yasuo; Orense, Rolando P

    2015-01-01

    At present, almost all mountain tunnels in Japan are excavated and constructed utilizing the New Austrian Tunneling Method (NATM), which was advocated by Prof. Rabcewicz of Austria in 1964. In Japan, this method has been applied to tunnel construction since around 1978, after which there has been a subsequent decrease in the number of casualties during tunnel construction. However, there is still a relatively high incidence of labour accidents during tunnel construction when compared to incidence rates in the construction industry in general. During tunnel construction, rock fall events at the cutting face are a particularly characteristic type of accident. In this study, we analysed labour accidents having the characteristics of rock fall events at work sites. We also introduced accident prevention measures against rock fall events.

  13. Spatio-temporal patterns of hazards and their use in risk assessment and mitigation. Case study of road accidents in Romania

    NASA Astrophysics Data System (ADS)

    Catalin Stanga, Iulian

    2013-04-01

    the spatial or temporal clustering of crash accidents. Since the 1990s, Geographical Information Systems (GIS) have become a very important tool for traffic and road safety management, allowing not only spatial and multifactorial analysis, but also graphical and non-graphical outputs. The current paper presents an accessible GIS methodology to study the spatio-temporal pattern of injury-related road accidents, to identify the high-density accident zones, to make a cluster analysis, to create multicriterial typologies, and to identify and explain spatial and temporal similarities. For this purpose, a Geographical Information System was created, allowing a complex analysis that involves not only the events, but also a large set of interrelated and spatially linked attributes. The GIS includes the accidents as georeferenced point elements with a spatially linked attribute database: identification information (date, location details); accident type; main, secondary and aggravating causes; data about the driver; vehicle information; and consequences (damages, injured people and fatalities). Each attribute has its own numeric code that allows both statistical analysis and spatial interrogation. The database includes those road accidents that led to physical injuries and loss of human lives between 2007 and 2012, and the spatial analysis was carried out using TNTmips 7.3 software facilities. Data aggregation and processing allowed the creation of the spatial pattern of injury-related road accidents through Kernel density estimation at three different levels (national - Romania; county level - Iasi County; local level - Iasi town). Spider graphs were used to create the temporal pattern of road accidents at three levels (daily, weekly and monthly) directly related to their causes. Moreover, the spatial and temporal database relates the natural hazards (glazed frost, fog, and blizzard) to the human-made ones, giving the opportunity to evaluate the nature of uncertainties in risk
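
    The kernel density estimation step can be sketched outside a GIS package. The code below estimates a 2D accident density surface with SciPy on made-up projected coordinates and locates the peak-density cell; it stands in for the raster one would map to find high-density accident zones, and is not the TNTmips workflow used in the paper.

        # Minimal sketch of kernel density estimation over accident locations, using
        # SciPy on made-up coordinates instead of a GIS package.
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(7)
        # Synthetic accident locations (projected x/y in km), clustered around a town.
        xy = np.concatenate([
            rng.normal([10.0, 12.0], 0.8, size=(300, 2)),          # urban cluster
            rng.uniform([0.0, 0.0], [30.0, 30.0], size=(100, 2)),  # scattered rural
        ]).T                                        # shape (2, n) expected by gaussian_kde

        kde = gaussian_kde(xy)

        # Evaluate the density on a regular grid and report the highest-density cell.
        gx, gy = np.mgrid[0:30:200j, 0:30:200j]
        density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
        print("peak density cell:", np.unravel_index(density.argmax(), density.shape))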

  14. Characteristics of worker accidents on NYSDOT construction projects.

    PubMed

    Mohan, Satish; Zech, Wesley C

    2005-01-01

    This paper aims at providing cost-effective safety measures to protect construction workers in highway work zones, based on real data. Two types of accidents that occur in work zones were: (a) construction work area accidents, and (b) traffic accidents involving construction worker(s). A detailed analysis of work zone accidents involving 36 fatalities and 3,055 severe injuries to construction workers on New York State Department of Transportation (NYSDOT) construction projects from 1990 to 2001 established that five accident types: (a) Struck/Pinned by Large Equipment, (b) Trip or Fall (elevated), (c) Contact w/Electrical or Gas Utility, (d) Struck-by Moving/Falling Load, and (e) Crane/Lift Device Failure accounted for nearly 96% of the fatal accidents, nearly 63% of the hospital-level injury accidents, and nearly 91% of the total costs. These construction work area accidents had a total cost of $133.8 million. Traffic accidents that involve contractors' employees were also examined. Statistical analyses of the traffic accidents established that five traffic accident types: (a) Work Space Intrusion, (b) Worker Struck-by Vehicle Inside Work Space, (c) Flagger Struck-by Vehicle, (d) Worker Struck-by Vehicle Entering/Exiting Work Space, and (e) Construction Equipment Struck-by Vehicle Inside Work Space accounted for nearly 86% of the fatal, nearly 70% of the hospital-level injury and minor injury traffic accidents, and $45.4 million (79.4%) of the total traffic accident costs. The results of this paper provide real statistics on construction worker related accidents reported on construction work zones. Potential preventions based on real statistics have also been suggested. The ranking of accident types, both within the work area as well as in traffic, will guide the heavy highway contractor and owner agencies in identifying the most cost effective safety preventions.

  15. Software Safety Progress in NASA

    NASA Technical Reports Server (NTRS)

    Radley, Charles F.

    1995-01-01

    NASA has developed guidelines for development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach, to assist software developers and safety analysts in cost effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13 which was released as 'Interim' version in June 1994, scheduled for formal adoption late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety critical software development and analysis.

  16. Using Business Analysis Software in a Business Intelligence Course

    ERIC Educational Resources Information Center

    Elizondo, Juan; Parzinger, Monica J.; Welch, Orion J.

    2011-01-01

    This paper presents an example of a project used in an undergraduate business intelligence class which integrates concepts from statistics, marketing, and information systems disciplines. SAS Enterprise Miner software is used as the foundation for predictive analysis and data mining. The course culminates with a competition and the project is used…

  17. Otorhinolaryngologic disorders and diving accidents: an analysis of 306 divers.

    PubMed

    Klingmann, Christoph; Praetorius, Mark; Baumann, Ingo; Plinkert, Peter K

    2007-10-01

    Diving is a very popular leisure activity with an increasing number of participants. As more than 80% of diving-related problems involve the head and neck region, every otorhinolaryngologist should be familiar with diving medical standards. We here present an analysis of more than 300 patients we have treated in the past four years. Between January 2002 and October 2005, 306 patients presented to our department with otorhinolaryngological disorders after diving or after diving accidents. We collected the following data: name, sex, age, date of treatment, date of accident, diagnosis, special aspects of the diagnosis, number of dives, diving certification, whether and which surgery had been performed, history of acute diving accidents or follow-up treatment, assessment of fitness to dive, and special remarks. The study setting was a retrospective cohort study. The distribution of the disorders was as follows: 24 divers (8%) with external ear disorders, 140 divers (46%) with middle ear disorders, 56 divers (18%) with inner ear disorders, 53 divers (17%) with disorders of the nose and sinuses, 24 divers (8%) with decompression illness (DCI) and 9 divers (3%) who complained of various symptoms. Only 18% of the divers presented with acute disorders. The most common disorder (24%) was Eustachian tube dysfunction. Female divers were significantly more often affected. Chronic sinusitis was found to be associated with a significantly higher number of performed dives. Conservative treatment failed in 30% of the patients, but sinus surgery relieved symptoms in all patients of this group. The middle ear is the main problem area for divers. Middle ear ventilation problems due to Eustachian tube dysfunction can be treated conservatively with excellent results, whereas pathology of the tympanic membrane and ossicular chain often requires surgery. More than four out of five patients visited our department to re-establish their fitness to dive. Although the treatment of acute diving

  18. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analysis and integration of quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Computing and software

    USGS Publications Warehouse

    White, Gary C.; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK require that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood

  20. Software reliability experiments data analysis and investigation

    NASA Technical Reports Server (NTRS)

    Walker, J. Leslie; Caglayan, Alper K.

    1991-01-01

    The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.

  1. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
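
    The paper's hierarchical model (multiple data sources, interdependent parameters) is richer than anything shown here. As a minimal sketch of Bayesian updating for a rare-event occurrence rate, the code below combines a Poisson likelihood with a lognormal prior and draws posterior samples with a random-walk Metropolis sampler; the event count, exposure, and prior values are made up.

        # Minimal sketch of Bayesian updating of a rare-event occurrence rate: a
        # Poisson likelihood for observed events over an exposure time, a lognormal
        # prior, and a random-walk Metropolis sampler on the log-rate. All numbers
        # are made up; this is not the paper's hierarchical model.
        import numpy as np

        rng = np.random.default_rng(3)
        events, exposure_years = 2, 40.0           # observed rare events and exposure
        prior_median, prior_sigma = 0.05, 1.0       # lognormal prior on the rate

        def log_post(theta):                        # theta = log(rate)
            rate = np.exp(theta)
            log_lik = events * theta - rate * exposure_years     # Poisson, up to a constant
            log_prior = -0.5 * ((theta - np.log(prior_median)) / prior_sigma) ** 2
            return log_lik + log_prior

        theta, samples = np.log(prior_median), []
        for _ in range(20000):
            prop = theta + 0.3 * rng.standard_normal()           # symmetric proposal
            if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
                theta = prop
            samples.append(np.exp(theta))

        post = np.array(samples[5000:])             # drop burn-in
        print(f"posterior mean rate: {post.mean():.3f} per year, "
              f"95% interval: {np.percentile(post, [2.5, 97.5])}")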

  2. NOTE: Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool

    NASA Astrophysics Data System (ADS)

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  3. A coverage and slicing dependencies analysis for seeking software security defects.

    PubMed

    He, Hui; Zhang, Dongyan; Liu, Min; Zhang, Weizhe; Gao, Dongmin

    2014-01-01

    Software security defects have a serious impact on software quality and reliability; security flaws in a software system are a major hidden danger to its operation. As the scale of software increases, its vulnerabilities become much more difficult to find. Once these vulnerabilities are exploited, great losses may result. In this situation, the concept of Software Assurance has been put forward by experts, and automated fault localization is one part of Software Assurance research. Current automated fault localization methods include coverage-based fault localization (CBFL) and program slicing. Both methods have their own advantages and shortcomings in localization. In this paper, we put forward a new method, named the Reverse Data Dependence Analysis Model, which integrates the two methods by analyzing the program structure. On this basis, we propose a new automated fault localization method. This method is not only fully automated and lossless but also narrows the basic unit of localization to a single statement, which makes localization more accurate. Through several experiments, we show that our method is more effective. Furthermore, we analyzed the effectiveness of the existing methods on different faults.
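
    The paper's own method additionally brings in data-dependence information, which is not shown here. The sketch below only illustrates the coverage-based scoring that CBFL techniques use, ranking statements by the Ochiai suspiciousness formula on made-up coverage and test-outcome data.

        # Minimal sketch of coverage-based suspiciousness scoring (the Ochiai formula)
        # used by CBFL techniques. The per-statement coverage and test outcomes below
        # are made up; the paper's method also uses data-dependence slicing.
        import math

        # coverage[test] = set of statement ids executed by that test
        coverage = {
            "t1": {1, 2, 3, 5}, "t2": {1, 2, 4, 5}, "t3": {1, 3, 5}, "t4": {1, 2, 3, 4, 5},
        }
        failed = {"t1", "t4"}                 # tests that failed
        passed = set(coverage) - failed

        def ochiai(stmt):
            ef = sum(stmt in coverage[t] for t in failed)   # failed tests covering stmt
            ep = sum(stmt in coverage[t] for t in passed)   # passed tests covering stmt
            return ef / math.sqrt(len(failed) * (ef + ep)) if ef else 0.0

        statements = set().union(*coverage.values())
        for stmt, score in sorted(((s, ochiai(s)) for s in statements),
                                  key=lambda pair: -pair[1]):
            print(f"statement {stmt}: suspiciousness {score:.2f}")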

  4. Scientific Data Analysis and Software Support: Geodynamics

    NASA Technical Reports Server (NTRS)

    Klosko, Steven; Sanchez, B. (Technical Monitor)

    2000-01-01

    The support on this contract centers on development of data analysis strategies, geodynamic models, and software codes to study four-dimensional geodynamic and oceanographic processes, as well as studies and mission support for near-Earth and interplanetary satellite missions. SRE had a subcontract to maintain the optical laboratory for the LTP, where instruments such as MOLA and GLAS are developed. NVI performed work on a Raytheon laser altimetry task through a subcontract, providing data analysis and final data production for distribution to users. HBG had a subcontract for specialized digital topography analysis and map generation. Over the course of this contract, Raytheon ITSS staff have supported over 60 individual tasks. Some tasks have remained in place during this entire interval whereas others have been completed and were of shorter duration. Over the course of events, task numbers were changed to reflect changes in the character of the work or new funding sources. The description presented below will detail the technical accomplishments that have been achieved according to their science and technology areas. What will be shown is a brief overview of the progress that has been made in each of these investigative and software development areas. Raytheon ITSS staff members have received many awards for their work on this contract, including GSFC Group Achievement Awards for TOPEX Precision Orbit Determination and the Joint Gravity Model One Team. NASA JPL gave the TOPEX/POSEIDON team a medal commemorating the completion of the primary mission and a Certificate of Appreciation. Raytheon ITSS has also received a Certificate of Appreciation from GSFC for its extensive support of the Shuttle Laser Altimeter Experiment.

  5. Software for analysis of chemical mixtures--composition, occurrence, distribution, and possible toxicity

    USGS Publications Warehouse

    Scott, Jonathon C.; Skach, Kenneth A.; Toccalino, Patricia L.

    2013-01-01

    The composition, occurrence, distribution, and possible toxicity of chemical mixtures in the environment are research concerns of the U.S. Geological Survey and others. The presence of specific chemical mixtures may serve as indicators of natural phenomena or human-caused events. Chemical mixtures may also have ecological, industrial, geochemical, or toxicological effects. Chemical-mixture occurrences vary by analyte composition and concentration. Four related computer programs have been developed by the National Water-Quality Assessment Program of the U.S. Geological Survey for research of chemical-mixture compositions, occurrences, distributions, and possible toxicities. The compositions and occurrences are identified for the user-supplied data, and therefore the resultant counts are constrained by the user’s choices for the selection of chemicals, reporting limits for the analytical methods, spatial coverage, and time span for the data supplied. The distribution of chemical mixtures may be spatial, temporal, and (or) related to some other variable, such as chemical usage. Possible toxicities optionally are estimated from user-supplied benchmark data. The software for the analysis of chemical mixtures described in this report is designed to work with chemical-analysis data files retrieved from the U.S. Geological Survey National Water Information System but can also be used with appropriately formatted data from other sources. Installation and usage of the mixture software are documented. This mixture software was designed to function with minimal changes on a variety of computer-operating systems. To obtain the software described herein and other U.S. Geological Survey software, visit http://water.usgs.gov/software/.
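
    The USGS programs themselves are not reproduced here. As a minimal sketch of one operation such mixture software performs, the code below counts how often particular combinations of analytes are detected together above their reporting limits; the analytes, limits, and concentrations are invented for illustration.

        # Minimal sketch of counting chemical-mixture compositions: which combinations
        # of analytes occur together above their reporting limits, and how often.
        # The analytes, limits, and concentrations are made up.
        import pandas as pd

        reporting_limits = {"atrazine": 0.02, "nitrate": 0.5, "chloroform": 0.1}

        samples = pd.DataFrame({
            "atrazine":   [0.05, 0.01, 0.10, 0.00],
            "nitrate":    [1.20, 0.80, 0.30, 2.10],
            "chloroform": [0.00, 0.30, 0.25, 0.15],
        })

        detected = samples.ge(pd.Series(reporting_limits))       # above reporting limit?
        mixtures = detected.apply(
            lambda row: tuple(c for c in detected.columns if row[c]), axis=1)
        print(mixtures.value_counts())                            # mixture composition counts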

  6. Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polly, B.

    2011-09-01

    This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.
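
    One common way to quantify 'error' when predictions are compared with measured data is with bias and scatter statistics such as the mean bias error (MBE) and the coefficient of variation of the RMSE. The sketch below computes both for made-up monthly energy-use values; it is illustrative only and is not NREL's specific evaluation procedure.

        # Minimal sketch of two common prediction-vs-measurement error statistics,
        # mean bias error (MBE) and CV(RMSE); the monthly values are made up.
        import numpy as np

        measured = np.array([820, 760, 640, 510, 430, 520, 610, 630, 540, 560, 690, 800.0])
        predicted = np.array([800, 780, 610, 530, 450, 500, 640, 600, 520, 590, 670, 830.0])

        resid = predicted - measured
        mbe_pct = 100.0 * resid.sum() / measured.sum()
        cv_rmse_pct = 100.0 * np.sqrt((resid ** 2).mean()) / measured.mean()

        print(f"MBE: {mbe_pct:+.1f}%  CV(RMSE): {cv_rmse_pct:.1f}%")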

  7. Road profiling of traffic accidents in Jos, Nigeria, 1995-1999.

    PubMed

    Bombom, Leonard S; Edino, Marcus O

    2009-09-01

    Road traffic accident data in Nigeria generally lack exact coordinate information. Accident analysis is, therefore, restricted to aggregate data on trends, magnitude and temporal dimensions. This article addresses the road accident problem in Jos between 1995 and 1999 through a road profiling approach. Results show that four gateway routes, seven multi-lane roadways (including two gateway routes) and seven road intersections accounted for 84% of all traffic accidents, 84% of injured casualties and 88% of fatalities. This approach allows the impact of accident control measures to be quantified by deliberately profiling roads for close monitoring and policing. For example, reducing accident counts and fatalities by 50% each on gateway routes would amount to approximately 35% and 40% reductions in accident and fatality counts, respectively. Countermeasures must treat these roadways and intersections as important inputs to accident and casualty reduction targets.
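
    The quoted percentages follow from simple share arithmetic, checked in the sketch below under assumed shares of city-wide accidents and fatalities carried by the gateway routes (roughly 70% and 80%); these shares are illustrative values consistent with the reported result, not figures taken from the article.

      # Back-of-the-envelope check: if a set of profiled roads carries a given
      # share of all accidents/fatalities, halving events on those roads reduces
      # the city-wide totals by half that share. Shares below are assumptions.
      gateway_share_accidents  = 0.70   # assumed share of all accidents on gateway routes
      gateway_share_fatalities = 0.80   # assumed share of all fatalities on gateway routes
      reduction_on_routes = 0.50        # 50% reduction achieved on those routes

      overall_accident_reduction = reduction_on_routes * gateway_share_accidents   # 0.35
      overall_fatality_reduction = reduction_on_routes * gateway_share_fatalities  # 0.40
      print(f"{overall_accident_reduction:.0%} fewer accidents, "
            f"{overall_fatality_reduction:.0%} fewer fatalities city-wide")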

  8. HITCal: a software tool for analysis of video head impulse test responses.

    PubMed

    Rey-Martinez, Jorge; Batuecas-Caletrio, Angel; Matiño, Eusebi; Perez Fernandez, Nicolás

    2015-09-01

    The developed software (HITCal) may be a useful tool for the analysis and measurement of saccadic video head impulse test (vHIT) responses, and based on the experience obtained during its use the authors suggest that HITCal is an excellent method for enhanced exploration of vHIT outputs. The aim was to develop a (software) method to analyze and explore vHIT responses, mainly saccades. HITCal was written using a computational development program; the function to access a vHIT file was programmed; extended head impulse exploration and measurement tools were created; and an automated saccade analysis was developed using an experimental algorithm. For pre-release HITCal laboratory tests, a database of head impulse tests (HITs) was created with data collected retrospectively in three reference centers. This HIT database was evaluated by humans and was also computed with HITCal. The authors have successfully built HITCal and it has been released as open source software; the developed software was fully operative and all the proposed characteristics were incorporated in the released version. The automated saccade algorithm implemented in HITCal has good concordance with the assessment by human observers (Cohen's kappa coefficient = 0.7).
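
    Cohen's kappa, the agreement statistic quoted above, corrects raw agreement between two raters for agreement expected by chance. The sketch below computes it from scratch for a hypothetical set of per-impulse saccade classifications; the labels are made up, only the statistic is the one named in the abstract.

      from collections import Counter

      # Hypothetical per-impulse classifications (1 = saccade present, 0 = absent)
      # by a human observer and by an automated algorithm.
      human     = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
      algorithm = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 1, 1]

      n = len(human)
      observed = sum(h == a for h, a in zip(human, algorithm)) / n   # observed agreement

      # Expected chance agreement, from each rater's marginal label frequencies.
      ph, pa = Counter(human), Counter(algorithm)
      expected = sum((ph[c] / n) * (pa[c] / n) for c in set(human) | set(algorithm))

      kappa = (observed - expected) / (1 - expected)
      print(f"Cohen's kappa = {kappa:.2f}")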

  9. Major differences in rates of occupational accidents between different nationalities of seafarers.

    PubMed

    Hansen, Henrik L; Laursen, Lise Hedgaard; Frydberg, Morten; Kristensen, Soeren

    2008-01-01

    Earlier studies and statistics have shown that merchant seafarers from South East Asia had considerably lower accident rates than seafarers from Western Europe. The purposes of the study were to investigate whether the earlier observations were sustained when further sources on the occurrence of accidents were used, and to identify specific causes of excess accident rates among certain nationalities. Occupational accidents aboard Danish merchant ships during one year were identified from four different sources: accidents reported to the maritime authorities, accidents reported to a mutual insurance company, files on medical costs reimbursed by the government and, finally, accidents in which there had been contact with the radio medical service. Time at risk aboard was obtained from a register of all employment periods aboard merchant ships. A total of 943 accidents causing personal injury to a seafarer and directly caused by work aboard were identified. Among these accidents, 499 had taken place aboard cargo ships in international trade; only these were used in the detailed analysis. The accident rate for all identified accidents aboard cargo ships was 84 accidents per 1,000 years aboard. The crude incidence rate ratio (IRR) for East European seafarers was 0.88 and for South East Asians 0.38, using West European seafarers as the reference. In a Poisson regression analysis, the IRR for South East Asians was 0.29 (0.22-0.38). In an analysis including only more serious accidents, the IRR for South East Asians rose to 0.36 (0.26-0.48). This study indicates that seafarers from South East Asia, mainly the Philippines, may have a genuinely lower risk of occupational accidents in comparison with seafarers from Western and Eastern Europe. Differences in approach to safety and risk taking between South East Asian and European seafarers should be identified and positive attitudes included in accident prevention programmes. Main messages Seafarers from South East
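
    The crude incidence rate ratios quoted above are simple ratios of accident rates (events per person-year aboard) between each group and the West European reference. A minimal sketch, with made-up counts and time at risk chosen only to yield IRRs of roughly the reported size:

      # Crude incidence rate ratios (IRR) from accident counts and time at risk,
      # using West European seafarers as the reference group. The counts and
      # person-years below are illustrative, not the study's data; only the
      # method (rate = events / person-years, IRR = rate / reference rate) is shown.
      groups = {
          "West European": {"accidents": 300, "years_aboard": 3000},
          "East European": {"accidents": 110, "years_aboard": 1250},
          "South East Asian": {"accidents": 105, "years_aboard": 2750},
      }

      ref = groups["West European"]
      ref_rate = ref["accidents"] / ref["years_aboard"]

      for name, g in groups.items():
          rate = g["accidents"] / g["years_aboard"]
          print(f"{name}: {1000*rate:.0f} accidents per 1,000 years aboard, "
                f"IRR = {rate/ref_rate:.2f}")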

  10. An aftermath analysis of the 2014 coal mine accident in Soma, Turkey: Use of risk performance indicators based on historical experience.

    PubMed

    Spada, Matteo; Burgherr, Peter

    2016-02-01

    On the 13th of May 2014 a fire related incident in the Soma coal mine in Turkey caused 301 fatalities and more than 80 injuries. This has been the largest coal mine accident in Turkey, and in the OECD country group, so far. This study investigated if such a disastrous event should be expected, in a statistical sense, based on historical observations. For this purpose, PSI's ENSAD database is used to extract accident data for the period 1970-2014. Four different cases are analyzed, i.e., OECD, OECD w/o Turkey, Turkey and USA. Analysis of temporal trends for annual numbers of accidents and fatalities indicated a non-significant decreasing tendency for OECD and OECD w/o Turkey and a significant one for USA, whereas for Turkey both measures showed an increase over time. The expectation analysis revealed clearly that an event with the consequences of the Soma accident is rather unlikely for OECD, OECD w/o Turkey and USA. In contrast, such a severe accident has a substantially higher expectation for Turkey, i.e. it cannot be considered an extremely rare event, based on historical experience. This indicates a need for improved safety measures and stricter regulations in the Turkish coal mining sector in order to get closer to the rest of OECD. Copyright © 2015 Elsevier Ltd. All rights reserved.
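
    A minimal sketch of the kind of temporal-trend check described above, applied to synthetic annual accident counts. The study's own data come from PSI's ENSAD database and its exact trend statistic is not stated here, so a simple least-squares slope with a p-value stands in.

      from scipy.stats import linregress
      import numpy as np

      # Synthetic annual accident counts for one country group (illustrative only).
      years = np.arange(2000, 2015)
      counts = np.array([12, 14, 11, 15, 13, 16, 14, 18, 17, 16, 19, 18, 21, 20, 22])

      fit = linregress(years, counts)
      print(f"slope = {fit.slope:.2f} accidents/year, p-value = {fit.pvalue:.3f}")
      # A positive slope with a small p-value would indicate a significant increase,
      # as reported for Turkey; a negative, non-significant slope would match the
      # OECD pattern described in the abstract.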

  11. Analysis of labour accidents in tunnel construction and introduction of prevention measures

    PubMed Central

    KIKKAWA, Naotaka; ITOH, Kazuya; HORI, Tomohito; TOYOSAWA, Yasuo; ORENSE, Rolando P.

    2015-01-01

    At present, almost all mountain tunnels in Japan are excavated and constructed using the New Austrian Tunneling Method (NATM), which was advocated by Prof. Rabcewicz of Austria in 1964. In Japan, this method has been applied to tunnel construction since around 1978, after which there has been a decrease in the number of casualties during tunnel construction. However, there is still a relatively high incidence of labour accidents during tunnel construction compared with incidence rates in the construction industry in general. During tunnel construction, rock fall events at the cutting face are a particularly characteristic type of accident. In this study, we analysed labour accidents with the characteristics of rock fall events at work sites. We also introduced accident prevention measures against rock fall events. PMID:26027707

  12. Wildlife software: procedures for publication of computer software

    USGS Publications Warehouse

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  13. Performing Quantitative Imaging Acquisition, Analysis and Visualization Using the Best of Open Source and Commercial Software Solutions.

    PubMed

    Shenoy, Shailesh M

    2016-07-01

    A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity.

  14. AN EVALUATION OF FIVE COMMERCIAL IMMUNOASSAY DATA ANALYSIS SOFTWARE SYSTEMS

    EPA Science Inventory

    An evaluation of five commercial software systems used for immunoassay data analysis revealed numerous deficiencies. Often, the utility of statistical output was compromised by poor documentation. Several data sets were run through each system using a four-parameter calibration f...

  15. Retrospection of Chernobyl nuclear accident for decision analysis concerning remedial actions in Ukraine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Georgievskiy, Vladimir

    2007-07-01

    This report considers the efficacy of decisions concerning remedial actions when off-site radiological monitoring in the early and (or) intermediate phases was absent or uninformative. There are examples of such situations in the former Soviet Union where many people were exposed: releases of radioactive materials from 'Krasnoyarsk-26' into the Enisey River, releases of radioactive materials from 'Chelabinsk-65' (the Kishtim accident), nuclear tests at the Semipalatinsk Test Site, the Chernobyl nuclear accident, etc. If monitoring in the early and (or) intermediate phases is absent, decisions concerning remedial actions are usually developed on the basis of permanent monitoring. However, decisions of this kind may be essentially erroneous. For these cases it is proposed to reconstruct the radiological data of the early and intermediate phases of a nuclear accident retrospectively and to base decisions concerning remedial actions on both the retrospective data and permanent monitoring data. In this report the problem is considered using the example of the Chernobyl accident in Ukraine, where off-site radiological monitoring in the early and intermediate phases was unsatisfactory. In particular, pasture-cow-milk monitoring had not been performed. All official dose estimations had been based on measurements of 137Cs in the body (40 measurements within 135 days and 55 measurements within 229 days after the Chernobyl accident). For the retrospection of the radiological data of the Chernobyl accident a dynamic model has been developed, with a structure similar to that of the Pathway and Farmland models. Parameters of the model have been identified for the agricultural conditions of Russia and Ukraine. By means of this model, the dynamics of 20 radionuclides in pathways and the dynamics of doses have been estimated for the early, intermediate and late phases of the Chernobyl accident. The main results are
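
    As an illustration of the dynamic pathway modelling described above, the sketch below integrates a toy pasture-to-cow-to-milk compartment model with radioactive decay. All rate constants and the deposition value are assumptions made for the example and are not parameters of the report's model.

      # Toy pasture -> cow -> milk compartment model with radioactive decay,
      # illustrating the kind of dynamic pathway retrospection described above.
      import numpy as np

      lam_decay = np.log(2) / 8.02        # 131I physical decay constant (1/day)
      k_weather = 0.05                    # loss from pasture by weathering (1/day, assumed)
      k_intake  = 0.10                    # fraction of pasture inventory ingested per day (assumed)
      k_to_milk = 0.30                    # transfer from cow body burden to milk (1/day, assumed)

      dt, days = 0.1, 60.0
      t = np.arange(0.0, days, dt)
      pasture = np.zeros_like(t); cow = np.zeros_like(t); milk_rate = np.zeros_like(t)
      pasture[0] = 1.0                    # normalized initial deposition on pasture

      for i in range(1, t.size):
          dp = -(lam_decay + k_weather + k_intake) * pasture[i-1]
          dc = k_intake * pasture[i-1] - (lam_decay + k_to_milk) * cow[i-1]
          pasture[i] = pasture[i-1] + dp * dt
          cow[i] = cow[i-1] + dc * dt
          milk_rate[i] = k_to_milk * cow[i]       # activity transferred to milk per day

      peak_day = t[np.argmax(milk_rate)]
      print(f"Milk activity peaks about day {peak_day:.1f} after deposition")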

  16. Organizational influence on the occurrence of work accidents involving exposure to biological material.

    PubMed

    Marziale, Maria Helena Palucci; Rocha, Fernanda Ludmilla Rossi; Robazzi, Maria Lúcia do Carmo Cruz; Cenzi, Camila Maria; dos Santos, Heloisa Ehmke Cardoso; Trovó, Marli Elisa Mendes

    2013-01-01

    to analyze work accidents involving exposure to biological materials which took place among nursing personnel, and to evaluate the influence of the organizational culture on the occurrence of these accidents. A retrospective, analytical study, carried out in two stages in a hospital that was part of the Network for the Prevention of Work Accidents. The first stage involved the analysis of the characteristics of the work accidents involving exposure to biological materials recorded over a seven-year period by the nursing staff in the hospital studied and registered in the Network databank. The second stage involved the analysis of the perception of the institutional culture among 122 nursing staff members, who were allocated to the control group (workers who had not had an accident) and the case group (workers who had had an accident). A total of 386 accidents had been recorded: percutaneous lesions occurred in 79% of the cases, needles were the materials involved in 69.7% of the accidents, and in 81.9% of the accidents there was contact with blood. Regarding the influence of the organizational culture on the occurrence of accidents, the results obtained through the analysis of the two groups did not demonstrate significant differences between the average scores attributed by the workers in each organizational value or practice category. It is concluded that accidents involving exposure to biological material need to be avoided; however, it was not possible to confirm the influence of organizational values or practices on workers' behavior concerning the occurrence of these accidents.

  17. Analysis of the SL-1 Accident Using RELAP5-3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Francisco, A.D. and Tomlinson, E. T.

    2007-11-08

    On January 3, 1961, at the National Reactor Testing Station, in Idaho Falls, Idaho, the Stationary Low Power Reactor No. 1 (SL-1) experienced a major nuclear excursion, killing three people, and destroying the reactor core. The SL-1 reactor, a 3 MW(t) boiling water reactor, was shut down and undergoing routine maintenance work at the time. This paper presents an analysis of the SL-1 reactor excursion using the RELAP5-3D thermal-hydraulic and nuclear analysis code, with the intent of simulating the accident from the point of reactivity insertion to destruction and vaporization of the fuel. Results are presented, along with a discussion of sensitivity to some reactor and transient parameters (many of the details are only known with a high level of uncertainty).

  18. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

    Quality of software is not only vital to the successful operation of the space station, it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software does not wear out and redundancy is costly, making traditional statistical analysis unsuitable for evaluating the reliability of software. A statistical model was developed to represent the number and types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
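
    The abstract does not name the specific reliability model used, so the sketch below stands in with a common choice, the Goel-Okumoto non-homogeneous Poisson process, fitted to hypothetical cumulative failure counts to estimate how many failures remain, which is the kind of quantity the termination criteria above rely on.

      # Fit a Goel-Okumoto reliability-growth curve m(t) = a * (1 - exp(-b t))
      # to cumulative failures observed during testing, then estimate the
      # expected number of failures remaining. Data are hypothetical.
      import numpy as np
      from scipy.optimize import curve_fit

      test_weeks = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
      cum_failures = np.array([8, 15, 20, 24, 27, 29, 31, 32, 33, 34], dtype=float)

      def goel_okumoto(t, a, b):
          return a * (1.0 - np.exp(-b * t))

      (a, b), _ = curve_fit(goel_okumoto, test_weeks, cum_failures, p0=[40.0, 0.3])
      remaining = a - cum_failures[-1]
      print(f"estimated total failures a = {a:.1f}, detection rate b = {b:.2f}/week")
      print(f"expected failures remaining after week 10: {remaining:.1f}")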

  19. Software Analysis of New Space Gravity Data for Geophysics and Climate Research

    NASA Technical Reports Server (NTRS)

    Deese, Rupert; Ivins, Erik R.; Fielding, Eric J.

    2012-01-01

    Both the Gravity Recovery and Climate Experiment (GRACE) and the Gravity field and steady-state Ocean Circulation Explorer (GOCE) satellites are returning rich data for the study of the solid earth, the oceans, and the climate. Current software analysis tools do not provide researchers with the ease and flexibility required to make full use of these data. We evaluate the capabilities and shortcomings of existing software tools, including Mathematica, the GOCE User Toolbox, the ICGEM's (International Center for Global Earth Models) web server, and Tesseroids. Using existing tools as necessary, we design and implement software with the capability to produce gridded data and publication-quality renderings from raw gravity data. The straightforward software interface marks an improvement over previously existing tools and makes new space gravity data more useful to researchers. Using the software, we calculate Bouguer anomalies of the gravity tensor's vertical component in the Gulf of Mexico, Antarctica, and the 2010 Maule earthquake region. These maps identify promising areas of future research.
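
    For orientation, the classic Bouguer reduction subtracts the attraction of an infinite rock slab, 2*pi*G*rho*h, from free-air gravity values. The sketch below applies only that simple slab correction to hypothetical numbers; the processing described in the abstract (tensor components, tesseroid modeling) is considerably more involved.

      # Simple Bouguer slab correction applied to a free-air gravity value.
      import math

      G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
      rho_crust = 2670.0       # standard reduction density, kg/m^3

      def bouguer_anomaly(free_air_mgal, elevation_m, rho=rho_crust):
          """Subtract the infinite-slab attraction of topography (result in mGal)."""
          slab_mgal = 2.0 * math.pi * G * rho * elevation_m * 1.0e5   # m/s^2 -> mGal
          return free_air_mgal - slab_mgal

      # Hypothetical station: 25 mGal free-air anomaly at 1500 m elevation.
      print(f"{bouguer_anomaly(25.0, 1500.0):.1f} mGal")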

  20. Analysis of Occupational Accident Fatalities and Injuries Among Male Group in Iran Between 2008 and 2012

    PubMed Central

    Alizadeh, Seyed Shamseddin; Mortazavi, Seyed Bagher; Sepehri, Mohammad Mehdi

    2015-01-01

    Background: Occupational accidents cause permanent disabilities and deaths and lead to economic and workday losses. Objectives: The purpose of the present study was to investigate the factors responsible for occupational accidents that occurred in Iran. Patients and Methods: The current study analyzed 1464 occupational accidents recorded by the Ministry of Labor and Social Affairs' offices in Iran during 2008 - 2012. At first, a general understanding of the accidents was obtained using descriptive statistics. Afterwards, the chi-square test and Cramer's V statistic (Vc) were used to determine the association between influencing factors and the type of injury as the occupational accident outcome. Results: There was no significant association between marital status or time of day and the type of injury. However, activity sector, cause of accident, victim's education, age of victim and victim's experience were significantly associated with the type of injury. Conclusions: Successful accident prevention relies largely on knowledge about the causes of accidents. In any accident control activity, particularly for occupational accidents, correctly identifying high-risk groups and factors influencing accidents is the key to successful interventions. The results of this study can increase accident awareness and enable workplace management to select and prioritize problem areas and safety system weaknesses in workplaces. PMID:26568848
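
    A minimal sketch of the two statistics named above, the chi-square test of association and Cramer's V effect size, applied to a hypothetical cause-of-accident versus type-of-injury contingency table; the counts are invented, only the method mirrors the study.

      import numpy as np
      from scipy.stats import chi2_contingency

      # rows: cause of accident; columns: type of injury (counts are made up)
      table = np.array([
          [40, 25, 10],    # falls
          [30, 45, 15],    # struck by object
          [12, 18, 30],    # machinery
      ])

      chi2, p, dof, expected = chi2_contingency(table)
      n = table.sum()
      k = min(table.shape) - 1
      cramers_v = np.sqrt(chi2 / (n * k))
      print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}, Cramer's V = {cramers_v:.2f}")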

  1. Accidents in Germany: evaluation of the german telephone health survey 2004.

    PubMed

    Saß, Anke-Christine

    2008-09-01

    In the year 2006 there were over 19 000 fatal accidents in Germany and estimates put the number of accidental injuries at more than 8 million. Detailed information on the pattern of accidents is indispensable for the definition of priorities in accident prevention. The German Telephone Health Survey 2004 provides representative cross-sectional data on the health of German residents from 18 years of age (n = 7341). Questions on the prevalence of accidents (13 items) were selected for analysis. Every tenth interviewee reported being injured in an accident in the previous 12 months. Men, particularly young men, are at greater risk of accidents than women. Almost one third of all accidents occurred at home. Social status had no influence on the probability of having an accident, but did affect where the accident happened. The survey yields information on the overall pattern of non-fatal accidents in Germany. The data point to target groups for accident prevention measures.

  2. [Analysis and evaluation of occupational accidents in dancers of the dance theatre].

    PubMed

    Wanke, E M; Groneberg, D A; Quarcoo, D

    2011-03-01

    The dance theatre is an autonomous form of presentation within the performing arts, combining dance, drama, singing and speaking. As the actors are usually professional dancers, the dance theatre is associated with professional dance. Compared with other dance styles, there is greater use of props, costumes and décor to intensify the production and its expressiveness. In contrast to the defined professional dance technique, the range of movements is unlimited. No research has yet been done on the influence of props and décor as exogenous factors potentially favouring injuries. The aim of this study is to characterize specific injury patterns and their causes, and to suggest basic approaches to preventing injuries in the dance theatre. The data for this evaluation comprise occupational accident reports, accident reports of various Berlin theatres, and case records of all Berlin State Theatres (n = 1106) from the Berlin State Accident Insurance over a 9-year period. 103 occupational accidents are attributable to the dance theatre. 44.6 % of the accidents happened during rehearsals, 42.4 % during performances, 76.7 % on stage and adjoining areas and 10.7 % in the ballet studio. The second most common movement resulting in an injury was jumping, at 25.4 %. Altogether 69.7 % of the accidents had a uniquely defined exogenous cause: 30.5 % were caused by props, 12.7 % by the floor and 17.2 % by the dance partner. 30.3 % of the accidents had multifactorial causes (e. g. the social situation, state of training and nutrition). 61 % of all accidents happened within three hours after starting work, with an increase in occupational accidents between 11:00 - 12:00 hrs and 08:00 - 09:00 hrs. The lower extremity was the most affected location (53.3 %), followed by the head/neck area (21.4 %) and the upper extremity (17.5 %). Contusions (26.2 %), distortions (17.5 %), muscular strains (19.4 %) and wounds (13.6 %) are the most frequent types of

  3. Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents and Accidents.

    PubMed

    Wheatley, Spencer; Sovacool, Benjamin; Sornette, Didier

    2017-01-01

    We perform a statistical study of risk in nuclear energy systems. This study provides and analyzes a data set that is twice the size of the previous best data set on nuclear incidents and accidents, comparing three measures of severity: the industry standard International Nuclear Event Scale, the Nuclear Accident Magnitude Scale of radiation release, and cost in U.S. dollars. The rate of nuclear accidents with cost above 20 MM 2013 USD, per reactor per year, has decreased from the 1970s until the present time. Along the way, the rate dropped significantly after Chernobyl (April 1986) and is expected to be roughly stable around a level of 0.003, suggesting an average of just over one event per year across the current global fleet. The distribution of costs appears to have changed following the Three Mile Island major accident (March 1979). The median cost became approximately 3.5 times smaller, but an extremely heavy tail emerged, being well described by a Pareto distribution with parameter α = 0.5-0.6. For instance, the cost of the two largest events, Chernobyl and Fukushima (March 2011), is equal to nearly five times the sum of the 173 other events. We also document a significant runaway disaster regime in both radiation release and cost data, which we associate with the "dragon-king" phenomenon. Since the major accident at Fukushima (March 2011) occurred recently, we are unable to quantify the impact of the industry response to this disaster. Excluding such improvements, in terms of costs, our range of models suggests that there is presently a 50% chance that (i) a Fukushima event (or larger) occurs every 60-150 years, and (ii) that a Three Mile Island event (or larger) occurs every 10-20 years. Further, even assuming that it is no longer possible to suffer an event more costly than Chernobyl or Fukushima, the expected annual cost and its standard error bracket the cost of a new plant. This highlights the importance of improvements not only immediately following
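
    Two of the quoted figures can be sanity-checked with simple arithmetic, as in the sketch below. The fleet size of roughly 440 operating power reactors is an assumption added for the example; the 0.003 rate and the 60-150 year interval come from the abstract, and reading "50% chance every T years" as a constant annual probability is an interpretation, not the authors' own calculation.

      # Expected number of events per year across the fleet.
      rate_per_reactor_year = 0.003
      fleet_size = 440                      # assumed number of operating reactors
      events_per_year = rate_per_reactor_year * fleet_size
      print(f"expected events (> 20 MM USD) per year: {events_per_year:.2f}")  # ~1.3

      # "50% chance of a Fukushima-scale event every 60-150 years" read as a
      # constant annual probability p solving (1 - p)**T = 0.5, i.e. p = 1 - 0.5**(1/T).
      for T in (60, 150):
          p = 1 - 0.5 ** (1.0 / T)
          print(f"T = {T} yr  ->  annual probability ~ {p:.4f}")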

  4. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Treesearch

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial...

  5. OpenStereo: Open Source, Cross-Platform Software for Structural Geology Analysis

    NASA Astrophysics Data System (ADS)

    Grohmann, C. H.; Campanha, G. A.

    2010-12-01

    Free and open source software (FOSS) is increasingly seen as synonymous with innovation and progress. The freedom to run, copy, distribute, study, change and improve the software (through access to the source code) assures a high level of positive feedback between users and developers, which results in stable, secure and constantly updated systems. Several software packages for structural geology analysis are available to the user, either under commercial licenses or downloadable at no cost from the Internet. Some provide basic tools for stereographic projections such as plotting poles, great circles, density contouring, eigenvector analysis, data rotation etc., while others perform more specific tasks, such as paleostress or geotechnical/rock stability analysis. This variety also means a wide range of data formatting for input, Graphical User Interface (GUI) design and graphic export formats. The majority of packages are built for MS-Windows, and even though there are packages for the UNIX-based MacOS, there are no native packages for *nix (UNIX, Linux, BSD etc.) operating systems (OS), forcing users to run these programs with emulators or virtual machines. Those limitations led us to develop OpenStereo, an open source, cross-platform software package for stereographic projections and structural geology. The software is written in Python, a high-level, cross-platform programming language, and the GUI is designed with wxPython, which provides a consistent look regardless of the OS. Numeric operations (like matrix and linear algebra) are performed with the Numpy module, and all graphic capabilities are provided by the Matplotlib library, including on-screen plotting and graphic export to common desktop formats (emf, eps, ps, pdf, png, svg). Data input is done with simple ASCII text files, with values of dip direction and dip/plunge separated by spaces, tabs or commas. The user can open multiple files at the same time (or the same file more than once), and overlay different elements of
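
    A minimal sketch of the core operation such packages perform: converting dip-direction/dip readings to poles and plotting them on an equal-area (Schmidt), lower-hemisphere stereonet. The projection formula is the standard one, but the code is an illustration written for this summary, not OpenStereo's own implementation.

      import numpy as np
      import matplotlib.pyplot as plt

      # dip direction, dip (degrees) for a few hypothetical planes
      planes = np.array([[120.0, 40.0], [210.0, 65.0], [300.0, 20.0], [45.0, 80.0]])

      dipdir, dip = np.radians(planes[:, 0]), np.radians(planes[:, 1])
      trend = dipdir + np.pi          # pole trend is opposite the dip direction
      plunge = np.pi / 2 - dip        # pole plunge is the complement of the dip

      # Equal-area projection: r = sqrt(2) * sin((90 deg - plunge) / 2),
      # which equals 1 at the primitive circle (plunge = 0); x east, y north.
      r = np.sqrt(2.0) * np.sin((np.pi / 2 - plunge) / 2.0)
      x, y = r * np.sin(trend), r * np.cos(trend)

      fig, ax = plt.subplots(figsize=(4, 4))
      ax.add_patch(plt.Circle((0, 0), 1.0, fill=False))    # primitive circle
      ax.plot(x, y, "o")
      ax.set_aspect("equal"); ax.set_axis_off()
      plt.show()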

  6. [Accidents in equestrian sports : Analysis of injury mechanisms and patterns].

    PubMed

    Schröter, C; Schulte-Sutum, A; Zeckey, C; Winkelmann, M; Krettek, C; Mommsen, P

    2017-02-01

    Equestrian sports are among the most popular sports in Germany, while also being among the most accident-prone. Furthermore, riding accidents are frequently associated with a high severity of injury and mortality. Nevertheless, there are insufficient data regarding incidences, demographics, mechanisms of accidents, injury severity and patterns, and outcomes of injured persons in amateur equestrian sports. Accordingly, the aim of the present study was to retrospectively analyze these aspects. A total of 503 patients were treated in the emergency room of the Hannover Medical School because of an accident during recreational horse riding between 2006 and 2011. Females were predominantly affected, at 89.5 %. The mean age of the patients was 26.2 ± 14.9 years, and women (24.5 ± 12.5 years) were on average younger than men (40.2 ± 23.9 years). A special risk group was girls and young women aged between 10 and 39 years. The overall injury severity was measured using the injury severity score (ISS). Based on the total population, the head was the most common injury location, at 17.3 %, followed by the upper extremities at 15.2 % and the thoracic and lumbar spine at 10.9 %. The three most common injury locations after falling from a horse were the head (17.5 %), the upper extremities (17.4 %), and the thoracic and lumbar spine (12.9 %). The most frequent injuries while handling horses were foot injuries (17.2 %), followed by head (16.6 %) and mid-facial injuries (15.0 %). With respect to the mechanism of injury, accidents while riding predominated (74 %), while accidents when handling horses accounted for only 26 %. The median ISS was 9.8 points. The proportion of multiple trauma patients (ISS > 16) was 18.1 %. Based on the total sample, the average in-hospital stay was 5.3 ± 5.4 days, with a significantly higher proportion of hospitalized patients in the

  7. TLM-Tracker: software for cell segmentation, tracking and lineage analysis in time-lapse microscopy movies.

    PubMed

    Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter

    2012-09-01

    Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena such as culture heterogeneity. In this context, computational image processing for the analysis of single-cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. The new software, TLM-Tracker, allows flexible and user-friendly segmentation, tracking and lineage analysis of microbial cells in time-lapse movies. The software package, including manual, tutorial video and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.
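
    As a toy illustration of the segmentation-and-labeling step that such tracking tools build on, the sketch below thresholds a synthetic fluorescence frame and labels its connected components. It is not TLM-Tracker code (TLM-Tracker itself is written in Matlab), and real pipelines are far more sophisticated.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(0)
      frame = rng.normal(10.0, 2.0, size=(64, 64))          # background noise
      frame[10:20, 12:22] += 40.0                            # synthetic "cell" 1
      frame[40:52, 35:44] += 35.0                            # synthetic "cell" 2

      mask = frame > frame.mean() + 3 * frame.std()          # simple global threshold
      labels, n_cells = ndimage.label(mask)                  # connected components
      sizes = ndimage.sum(mask, labels, index=range(1, n_cells + 1))
      print(f"{n_cells} cells found, areas (px): {sizes.astype(int).tolist()}")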

  8. Quantification method analysis of the relationship between occupant injury and environmental factors in traffic accidents.

    PubMed

    Ju, Yong Han; Sohn, So Young

    2011-01-01

    Injury analysis following a vehicle crash is one of the most important research areas. Most injury analyses, however, have focused on a single injury variable at a time, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), in relation to various traffic accident factors; such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between categorical injury phenomena, such as the injury scale, injury position, and injury type, and various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices, such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, air bag deployment, and higher speed are associated with severe, rather than minor, injury to the thigh, ankle, and leg in the form of dislocation, abrasion, or laceration. Copyright © 2010 Elsevier Ltd. All rights reserved.

  9. Software safety

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy

    1987-01-01

    Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not solve the safety problem for the present. A new attitude is required: looking at what you do not want the software to do along with what you want it to do, and assuming that things will go wrong. New procedures and changes to the entire software development process are necessary: special software safety analysis techniques are needed, and design techniques, especially eliminating complexity, can be very helpful.

  10. A Human Factors Analysis and Classification System (HFACS) Examination of Commercial Vessel Accidents

    DTIC Science & Technology

    2012-09-01

    Naval Operations before the Congress on FY2013 Department of Navy posture. Heinrich, H. W. (1941). Industrial accident prevention: A scientific... Theory The core of the Domino Theory, developed by Herbert W. Heinrich, who studied industrial safety in the early 1900s, is that accidents are a result... chain of events resulting in an accident. Heinrich likened the dominos to unsafe conditions or unsafe acts, where their subsequent removal prevents a

  11. [Retrospective analysis and prevention strategies for accidents associated with cervical manipulation in China].

    PubMed

    Wang, Hui-Hao; Zhan, Hong-Sheng; Zhang, Ming-Cai; Chen, Bo; Guo, Kai

    2012-09-01

    To review previously reported injury cases associated with cervical manipulation in China, and to describe the risks and benefits of the therapy. Relevant case reports, review articles, surveys, and investigations regarding the treatment of cervical spondylosis with cervical manipulation involving accidents and associated complications were retrieved through a search of the literature in the SinoMed, CNKI, CQVIP, and Wanfang digital databases from 1979 to March 2011. The data were extracted and statistically analyzed. A total of 150 cases of injury reported in 40 articles met the inclusion criteria. Accidents occurred in 156 cases; of these, syncope occurred in 45 cases (28.85%), mild spinal cord injury or compression in 34 cases (21.79%), nerve root injury in 24 cases (15.38%), ineffectiveness or symptom increase in 11 cases (7.05%), cervical spine fracture in 11 cases (7.05%), dislocation or semiluxation in 6 cases (3.85%), soft tissue injury in 3 cases (1.92%), and serious accidents in 22 cases (14.70%, including paralysis, death and cerebrovascular accident). Among the serious accidents, 12 cases (54.55%) had other primary diseases. Types of related manipulation included rotation reduction (42.00%, 63 cases) and rubbing points or muscles with strong stimulation (28.00%, 42 cases). 100 cases (66.67%) were cured or basically recovered, 21 cases (14.00%) improved, 4 cases (2.67%) deteriorated and 5 cases (3.33%) died. It is imperative for practitioners to complete patient management and assessment before manipulation. Conducting a detailed physical examination and making a correct diagnosis would be a pivotal way of avoiding accidents. Excluding contraindications and potential risks, standardizing evaluation criteria and practitioners' qualifications, increasing safety awareness and risk assessment, and strengthening the monitoring of accidents could decrease the incidence of accidents.

  12. eXtended CASA Line Analysis Software Suite (XCLASS)

    NASA Astrophysics Data System (ADS)

    Möller, T.; Endres, C.; Schilke, P.

    2017-02-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single dish data. Among the tools is the myXCLASS program which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, whereas the finite source size and dust attenuation are considered as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL using the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), that is, finding the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7
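
    For context, the radiative transfer relation that the myXCLASS solver is built around for an isothermal source in one dimension is I_nu = I_bg * exp(-tau) + B_nu(T) * (1 - exp(-tau)). The sketch below evaluates it for a single hypothetical line with an assumed Gaussian opacity profile; it is an illustration, not XCLASS code.

      import numpy as np

      h = 6.62607015e-34   # Planck constant, J s
      k = 1.380649e-23     # Boltzmann constant, J/K
      c = 2.99792458e8     # speed of light, m/s

      def planck(nu, T):
          """Planck function B_nu(T) in W m^-2 Hz^-1 sr^-1."""
          return (2 * h * nu**3 / c**2) / np.expm1(h * nu / (k * T))

      nu0 = 230.538e9                                   # line centre (CO J=2-1), Hz
      nu = np.linspace(nu0 - 5e6, nu0 + 5e6, 201)
      tau = 2.0 * np.exp(-((nu - nu0) / 1.0e6) ** 2)    # assumed Gaussian opacity profile

      T_ex, T_bg = 50.0, 2.73                           # excitation and background temperatures, K
      I = planck(nu, T_bg) * np.exp(-tau) + planck(nu, T_ex) * (1.0 - np.exp(-tau))
      print(f"peak intensity: {I.max():.3e} W m^-2 Hz^-1 sr^-1")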

  13. Motorcycle accident cause factors and identification of countermeasures. Volume 1 : technical report

    DOT National Transportation Integrated Search

    1981-01-01

    This report presents the data and findings from the on-scene, in-depth investigations of 900 motorcycle accidents and the analysis of 3600 traffic accident reports of motorcycle accidents in the same study area. Comprehensive data were collect...

  14. Differences in Characteristics of Aviation Accidents during 1993-2012 Based on Flight Purpose

    NASA Technical Reports Server (NTRS)

    Evans, Joni K.

    2016-01-01

    Usually aviation accidents are categorized and analyzed within flight conduct rules (Part 121, Part 135, Part 91) because differences in accident rates within flight rules have been demonstrated. Even within a particular flight rule the flights have different purposes. For many, Part 121 flights are synonymous with scheduled passenger transport, and indeed this is the largest group of Part 121 accidents. But there are also non-scheduled (charter) passenger transport and cargo flights. The primary purpose of the analysis reported here is to examine the differences in aviation accidents based on the purpose of the flight. Some of the factors examined are the accident severity, aircraft characteristics and accident occurrence categories. Twenty consecutive years of data were available and utilized to complete this analysis.

  15. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    NASA Astrophysics Data System (ADS)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard methods from image analysis non-optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy regarding the resulting textures.
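
    A minimal sketch of the general idea behind tissue-dependent speckle simulation: multiply a noise-free tissue map by speckle drawn from a Rayleigh distribution whose scale depends on the tissue class. The distribution choice and parameters are assumptions made for the example, not the speckle model proposed in the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      tissue = np.full((128, 128), 0.3)       # darker background region
      tissue[32:96, 32:96] = 1.0              # brighter region (e.g. muscle)

      # Different Rayleigh scale per tissue class produces visibly different textures.
      scale = np.where(tissue > 0.5, 1.0, 0.4)
      speckle = rng.rayleigh(scale=scale)
      image = tissue * speckle

      print(f"mean intensity, bright region: {image[32:96, 32:96].mean():.2f}")
      print(f"mean intensity, background:    {image[:32, :].mean():.2f}")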

  16. Software Design Improvements. Part 1; Software Benefits and Limitations

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The questions are: how can this hardware and software be made more reliable? How can software quality be improved? What methodology needs to be applied to large and small software products to improve their design, and how can software be verified?

  17. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  18. Software Engineering Improvement Activities/Plan

    NASA Technical Reports Server (NTRS)

    2003-01-01

    bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.

  19. HDX Workbench: Software for the Analysis of H/D Exchange MS Data

    NASA Astrophysics Data System (ADS)

    Pascal, Bruce D.; Willis, Scooter; Lauer, Janelle L.; Landgraf, Rachelle R.; West, Graham M.; Marciano, David; Novick, Scott; Goswami, Devrishi; Chalmers, Michael J.; Griffin, Patrick R.

    2012-09-01

    Hydrogen/deuterium exchange mass spectrometry (HDX-MS) is an established method for the interrogation of protein conformation and dynamics. While the data analysis challenge of HDX-MS has been addressed by a number of software packages, new computational tools are needed to keep pace with the improved methods and throughput of this technique. To address these needs, we report an integrated desktop program titled HDX Workbench, which facilitates automation, management, visualization, and statistical cross-comparison of large HDX data sets. Using the software, validated data analysis can be achieved at the rate of generation. The application is available at the project home page http://hdx.florida.scripps.edu.

  20. Orbit Software Suite

    NASA Technical Reports Server (NTRS)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Intrusion Prevention System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks corresponding to trajectory/ launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during/after the Equipment Rehost-3 Project.